I need to cache multiple types like:
public Country findCountry(String countryName)
and:
public List<Destination> findAllDestinations(String countryName)
I am using RedisCacheManager, and RedisTemplate supports only one serializer.
It is solved now, after some research. The approach:
Change spring-data-redis to 1.4.2.RELEASE.
Extend RedisCacheManager with your own class that keeps a map from cache name to serializer (cacheName -> serializer) along with the cache names.
Override the getCache method (Cache getCache(String name)) and, based on the cache name, set the matching serializer on the Redis template.
Use your customized cache manager.
Example:
public class CustomRedisCacheManager extends RedisCacheManager {

    public static final String CACHE_NAME_DEFAULT = "default";
    public static final String CACHE_NAME_COUNTRY = "country";
    public static final String CACHE_NAME_DESTINATIONS = "destinations";

    private Map<String, RedisCache> redisCaches = new HashMap<>();

    public CustomRedisCacheManager(Map<String, RedisTemplate> redisTemplates) {
        super(redisTemplates.get(CACHE_NAME_DEFAULT), redisTemplates.keySet());
        redisTemplates.keySet().forEach(cacheName ->
                redisCaches.put(cacheName, new RedisCache(cacheName, null, redisTemplates.get(cacheName), 0)));
    }

    @Override
    public Cache getCache(String cacheName) {
        return redisCaches.get(cacheName);
    }
}
@Configuration
@EnableCaching
public class RedisConfiguration extends CachingConfigurerSupport {

    // Host and port fields are assumed to be injected from configuration;
    // the property names here are placeholders, not from the original snippet.
    @Value("${redis.host}")
    private String redisHostName;

    @Value("${redis.port}")
    private int redisPort;

    @Bean
    public JedisConnectionFactory jedisConnectionFactory() {
        JedisConnectionFactory factory = new JedisConnectionFactory();
        factory.setHostName(redisHostName);
        factory.setPort(redisPort);
        factory.setTimeout(100);
        return factory;
    }

    @Bean
    public CacheManager cacheManager() {
        Map<String, RedisTemplate> templates = new HashMap<>();
        templates.put(CACHE_NAME_DEFAULT, getDefaultRedisTemplate());
        templates.put(CACHE_NAME_COUNTRY, getCountryRedisTemplate());
        templates.put(CACHE_NAME_DESTINATIONS, getDestinationsRedisTemplate());
        return new CustomRedisCacheManager(templates);
    }
    @Bean
    public RedisTemplate<Object, Object> getDefaultRedisTemplate() {
        return getBaseRedisTemplate();
    }

    @Bean
    public RedisTemplate<Object, Object> getCountryRedisTemplate() {
        RedisTemplate<Object, Object> redisTemplate = getBaseRedisTemplate();
        redisTemplate.setValueSerializer(jsonRedisSerializer(Country.class));
        return redisTemplate;
    }

    @Bean
    public RedisTemplate<Object, Object> getDestinationsRedisTemplate() {
        RedisTemplate<Object, Object> redisTemplate = getBaseRedisTemplate();
        redisTemplate.setValueSerializer(jsonRedisSerializer(
                TypeFactory.defaultInstance().constructCollectionType(List.class, Destination.class)));
        return redisTemplate;
    }

    private RedisTemplate<Object, Object> getBaseRedisTemplate() {
        RedisTemplate<Object, Object> redisTemplate = new RedisTemplate<>();
        redisTemplate.setConnectionFactory(jedisConnectionFactory());
        redisTemplate.setKeySerializer(stringRedisSerializer());
        redisTemplate.setHashKeySerializer(stringRedisSerializer());
        redisTemplate.setValueSerializer(jsonRedisSerializer(Object.class));
        return redisTemplate;
    }

    private Jackson2JsonRedisSerializer jsonRedisSerializer(Class type) {
        return jsonRedisSerializer(TypeFactory.defaultInstance().constructType(type));
    }

    private Jackson2JsonRedisSerializer jsonRedisSerializer(JavaType javaType) {
        // stringRedisSerializer() and JsonObjectMapper are custom helpers, not shown here.
        Jackson2JsonRedisSerializer jackson2JsonRedisSerializer = new Jackson2JsonRedisSerializer(javaType);
        jackson2JsonRedisSerializer.setObjectMapper(new JsonObjectMapper());
        return jackson2JsonRedisSerializer;
    }
}
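For completeness, here's a minimal sketch of how these caches could then be consumed through @Cacheable; the service class and the backing lookup calls are illustrative assumptions, not part of the original configuration:

@Service
public class TravelService { // hypothetical consumer of the caches above

    // Entries in the "country" cache go through the Country JSON serializer.
    @Cacheable(value = CustomRedisCacheManager.CACHE_NAME_COUNTRY, key = "#countryName")
    public Country findCountry(String countryName) {
        return lookupCountry(countryName); // assumed slow lookup, details elided
    }

    // Entries in the "destinations" cache go through the List<Destination> serializer.
    @Cacheable(value = CustomRedisCacheManager.CACHE_NAME_DESTINATIONS, key = "#countryName")
    public List<Destination> findAllDestinations(String countryName) {
        return lookupDestinations(countryName); // assumed slow lookup, details elided
    }
}

With this in place, the first call for a given countryName hits the backing lookup and later calls are served from Redis, each cache using its own serializer.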
I am trying to use Redis hash operations from an async (performTask) method, but I keep getting this exception:
java.lang.IllegalStateException: JedisConnectionFactory was destroyed and cannot be used anymore
This async method is called from another service class in a loop, as shown in the service class snippet below.
Service class:
....
for (CacheJob job : jobs.getJobs()) {
    if (job.getCacheKey() != null && job.getLupdTSSQL() != null) {
        asyncService.performTask(job.getCacheKey(), job.getLupdTSSQL());
    }
}
....
Async service class:
@Autowired
private HashOperations<String, String, String> lupdTSHashOperations;

@Async
public void performTask(String cacheKey, String sql) throws Exception {
    String lupdTSFromCacheStr = null;
    System.out.println("Execute method asynchronously. Thread Id"
            + Thread.currentThread().getId() + ": cacheKey :" + cacheKey);
    ....
    String lupdTSCacheKey = null;
    ...
    try {
        lupdTSCacheKey = cacheKey + Constants.SUFFIX_LUPDTS;
        lupdTSFromCacheStr = lupdTSHashOperations.get(cacheKey, lupdTSCacheKey);
    ....
    .....
AsyncConfig:
@Configuration
@EnableAsync
public class MyAsyncConfig implements AsyncConfigurer {

    @Bean
    public Executor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(8);
        executor.setMaxPoolSize(8);
        executor.setQueueCapacity(100);
        executor.setThreadNamePrefix("my-cache-jobs-");
        executor.setWaitForTasksToCompleteOnShutdown(true);
        executor.initialize();
        return executor;
    }
    .....
Redis config:
@Configuration
public class MyCacheRedisConfig {

    @Value("${spring.redis.host}")
    private String redisHost;

    @Value("${spring.redis.port}")
    private String redisPort;

    @Value("${spring.redis.password}")
    private String redisPassword;

    @SuppressWarnings("deprecation")
    @Bean(name = "jedisConnectionFactory")
    JedisConnectionFactory jedisConnectionFactory() {
        JedisPoolConfig poolConfig = new JedisPoolConfig();
        poolConfig.setMaxTotal(8);
        poolConfig.setMaxIdle(8);
        poolConfig.setMinIdle(0);
        JedisConnectionFactory rsc = new JedisConnectionFactory();
        rsc.setPoolConfig(poolConfig);
        rsc.setHostName(redisHost);
        rsc.setPort(Integer.parseInt(redisPort));
        rsc.setPassword(redisPassword);
        rsc.setUseSsl(true);
        return rsc;
    }
    .....
    ....

    @Bean(name = "lupdTSRedisTemplate")
    public RedisTemplate<String, String> lupdTSRedisTemplate() {
        final RedisTemplate<String, String> template = new RedisTemplate<String, String>();
        template.setConnectionFactory(jedisConnectionFactory());
        template.setKeySerializer(new StringRedisSerializer());
        template.setValueSerializer(new StringRedisSerializer());
        template.setHashKeySerializer(new StringRedisSerializer());
        template.setHashValueSerializer(new StringRedisSerializer());
        template.afterPropertiesSet();
        return template;
    }

    @Bean(name = "lupdTSHashOperations")
    public HashOperations<String, String, String> lupdTSHashOperations(
            @Qualifier("lupdTSRedisTemplate") RedisTemplate<String, String> lupdTSRedisTemplate) {
        return lupdTSRedisTemplate.opsForHash();
    }
This exception occurs only when I try to use the Redis hash operations from the async method.
I am trying to use the Spring cache abstraction with a Redis cache, but I am unable to see the values in the cache. Please help me if I am missing something in the config.
The actual fetch happens every time I make the call, so caching is not taking effect. I also tried connecting to the same Redis host and port, and I can't find any keys there either.
Please find the implementation details below.
CacheUtils.java:
@Slf4j
public class CacheUtils {

    private final CustomerManagementClient customerManagementClient;

    @Autowired
    public CacheUtils(CustomerManagementClient customerManagementClient) {
        this.customerManagementClient = customerManagementClient;
    }

    @Cacheable(value = "merchant-details", key = "#merchantEntityId")
    public MerchantDetails getOrFetchMerchantDetails(OrderItemStatusChangeEvent event, MerchantType merchantType, String merchantEntityId) {
        if (BUYER == merchantType) {
            log.info("test - get buyer details");
            CustomerDetails customerDetails =
                    customerManagementClient.getData(merchantEntityId);
            String businessId = customerDetails.getBusinessId();
            String phoneNumber = customerDetails.getPhoneNumber();
            return MerchantDetails
                    .builder()
                    .merchantEntityId(merchantEntityId)
                    .businessId(businessId)
                    .businessName(customerDetails.getBusinessName())
                    .merchantType(merchantType)
                    .contactNumber(phoneNumber)
                    .build();
        }
        throw new InvalidInputException();
    }
}
MainClass.java
@Slf4j
@Component
public class MainClass implements LogisticsPlanningService {

    private final CacheUtils cacheUtils;

    @Autowired
    public MainClass(CacheUtils cacheUtils) {
        this.cacheUtils = cacheUtils;
    }

    private Set<LogisticsPlanningRequest> testMethod(Event event) {
        MerchantDetails senderDetails = cacheUtils.getOrFetchMerchantDetails(event, SELLER, orderItem.getSellerId());
        MerchantDetails receiverDetails = cacheUtils.getOrFetchMerchantDetails(event, BUYER, orderItem.getBuyerId());
    }
}
RedisConfiguration.java
@Configuration
@EnableCaching
public class RedisConfiguration {

    private String hostName;
    private int port;

    @Autowired
    MarketPlaceServiceProperties properties;

    @PostConstruct
    public void init() {
        hostName = properties.getRedisHostName();
        port = Integer.parseInt(properties.getRedisPort());
    }

    @Bean
    protected JedisConnectionFactory jedisConnectionFactory() {
        RedisStandaloneConfiguration configuration = new RedisStandaloneConfiguration(hostName, port);
        JedisConnectionFactory factory = new JedisConnectionFactory(configuration);
        factory.afterPropertiesSet();
        return factory;
    }

    public RedisCacheConfiguration getTestCacheConfig() {
        RedisCacheConfiguration cacheConfiguration = RedisCacheConfiguration.defaultCacheConfig();
        cacheConfiguration.prefixCacheNameWith("marketplace");
        cacheConfiguration.disableCachingNullValues();
        return cacheConfiguration;
    }

    // @Bean
    // public RedisTemplate<String, Object> redisTemplate() {
    //     final RedisTemplate<String, Object> redisTemplate = new RedisTemplate<String, Object>();
    //     redisTemplate.setKeySerializer(new StringRedisSerializer());
    //     redisTemplate.setHashKeySerializer(new GenericToStringSerializer<>(Object.class));
    //     redisTemplate.setHashValueSerializer(new JdkSerializationRedisSerializer());
    //     redisTemplate.setValueSerializer(new JdkSerializationRedisSerializer());
    //     redisTemplate.setConnectionFactory(jedisConnectionFactory());
    //     return redisTemplate;
    // }
}
service.properties:
redisHostName: redis.domain.prod.xyz.com
redisPort: 5400
I have been trying to write a simple Kafka listener unit test using KafkaEmbedded, but my listener never gets invoked. I have been using this link for inspiration, since I also need an Avro serializer/deserializer.
Below is what my test class looks like.
@ExtendWith(SpringExtension.class)
public class KafkaTest {

    public static final String TOPIC_2 = "topic";

    @Autowired
    private Service listener;

    @Autowired
    private SomeClient someClient;

    @Autowired
    private KafkaTemplate<String, Avro> template;

    @Test
    void testSimple() {
        template.send(TOPIC_2, "test", Avro.newBuilder()
                .setGameProvider("abcd")
                .setMessageName("someMessageName")
                .setRequestId(UUID.randomUUID().toString())
                .setBody(UUID.randomUUID().toString())
                .build());
        verify(someClient).register(anyString(), anyString(), anyString(), anyString());
        template.flush();
    }

    @Configuration
    @EnableKafka
    public static class Config {

        @Bean
        public EmbeddedKafkaBroker kafkaEmbedded() {
            return new EmbeddedKafkaBroker(1, true, 1, TOPIC_2);
        }

        @Bean
        public ConsumerFactory<String, Avro> createConsumerFactory() {
            Map<String, Object> props = new HashMap<>();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaEmbedded().getBrokersAsString());
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "group-1");
            props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, CustomKafkaAvroDeserializer.class);
            props.put("schema.registry.url", "not-used");
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            return new DefaultKafkaConsumerFactory<>(props);
        }

        @Bean
        public ConcurrentKafkaListenerContainerFactory<String, Avro> kafkaListenerContainerFactory() {
            ConcurrentKafkaListenerContainerFactory<String, Avro> factory = new ConcurrentKafkaListenerContainerFactory<>();
            factory.setConsumerFactory(createConsumerFactory());
            return factory;
        }

        @MockBean
        private SomeClient client;

        @MockBean
        private Marshaller marshaller;

        @Bean
        public SomeService listener() {
            return new SomeService(client, marshaller, kafkaTemplate());
        }

        @Bean
        public ProducerFactory<String, Avro> producerFactory() {
            Map<String, Object> props = new HashMap<>();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaEmbedded().getBrokersAsString());
            props.put(ProducerConfig.RETRIES_CONFIG, 1);
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, CustomKafkaAvroSerializer.class);
            props.put("schema.registry.url", "not-used");
            return new DefaultKafkaProducerFactory<>(props);
        }

        @Bean
        public KafkaTemplate<String, Avro> kafkaTemplate() {
            return new KafkaTemplate<>(producerFactory());
        }
    }
}
public class CustomKafkaAvroDeserializer extends KafkaAvroDeserializer {

    @Override
    public Object deserialize(String topic, byte[] bytes) {
        if (topic.equals("topic")) {
            this.schemaRegistry = getMockClient(SafeReport.SCHEMA$);
        }
        return super.deserialize(topic, bytes);
    }

    private static SchemaRegistryClient getMockClient(final Schema schema$) {
        return new MockSchemaRegistryClient() {
            @Override
            public synchronized Schema getById(int id) {
                return schema$;
            }
        };
    }
}
public class CustomKafkaAvroSerializer extends KafkaAvroSerializer {

    public CustomKafkaAvroSerializer() {
        super();
        super.schemaRegistry = new MockSchemaRegistryClient();
    }

    public CustomKafkaAvroSerializer(SchemaRegistryClient client) {
        super(new MockSchemaRegistryClient());
    }

    public CustomKafkaAvroSerializer(SchemaRegistryClient client, Map<String, ?> props) {
        super(new MockSchemaRegistryClient(), props);
    }
}
@Service
@EnableBinding(Sink.class)
public class SomeService {

    @KafkaListener(topics = "topic", groupId = "group-1")
    public void listen(ConsumerRecord<String, Avro> cr) {
        System.out.println(String.format("#### -> Consumed message -> %s", cr.toString()));
    }
}
My listener is never invoked when I run this test.
I need to add a Redis cache in a method that returns a list of values.
I'm using this tutorial as a basis: https://www.baeldung.com/spring-data-redis-tutorial
The exception shows this:
java.lang.ClassCastException: class java.lang.String cannot be cast to class java.util.List (java.lang.String and java.util.List are in module java.base of loader 'bootstrap')
at
@Cacheable(cacheNames = "customerDetailByParam", key = "{#searchParams.toString()}")
@Retryable(value = { HttpServerErrorException.class }, maxAttempts = RETRY_ATTEMPTS, backoff = @Backoff(delay = 5000))
public List<ObjectResponse> searchCustomerDetailByParam(MultiValueMap<String, String> searchParams)
I've been looking for solutions, but none seems to work.
CacheConfig.java
@Configuration
@EnableCaching
@ConditionalOnMissingBean(value = CacheManager.class)
@Slf4j
public class CacheConfig {

    @Bean
    JedisConnectionFactory jedisConnectionFactory() {
        RedisStandaloneConfiguration redisStandaloneConfiguration = new RedisStandaloneConfiguration("localhost", 6379);
        //redisStandaloneConfiguration.setPassword(RedisPassword.of("yourRedisPasswordIfAny"));
        return new JedisConnectionFactory(redisStandaloneConfiguration);
    }

    @Bean
    public RedisTemplate<String, Object> redisTemplate() {
        final RedisTemplate<String, Object> template = new RedisTemplate<String, Object>();
        template.setConnectionFactory(jedisConnectionFactory());
        template.setValueSerializer(new GenericToStringSerializer<Object>(Object.class));
        template.setHashValueSerializer(new Jackson2JsonRedisSerializer<>(Object.class));
        RedisSerializer<Object> serializer = new JdkSerializationRedisSerializer(getClass().getClassLoader());
        template.setDefaultSerializer(serializer);
        return template;
    }
}
ObjectResponse.java
@Data
@NoArgsConstructor
@AllArgsConstructor
public class ObjectResponse implements Serializable {

    @JsonProperty("id")
    private String customerId;

    @JsonProperty("name")
    @JsonAlias("full_name")
    private String customerName;

    private String document;
    private String email;
}
I was able to fix the problem by changing the template to the following configuration:
@Bean
public RedisTemplate<String, Object> redisTemplate() {
    final RedisTemplate<String, Object> template = new RedisTemplate<String, Object>();
    template.setKeySerializer(new StringRedisSerializer());
    template.setHashKeySerializer(new StringRedisSerializer());
    template.setValueSerializer(new GenericJackson2JsonRedisSerializer());
    template.setConnectionFactory(jedisConnectionFactory());
    return template;
}
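GenericJackson2JsonRedisSerializer embeds type information in the stored JSON, so values are read back as their original types (for example List<ObjectResponse>) instead of plain Strings. If the entries are written through a RedisCacheManager rather than through this template, a minimal sketch of the same serializer choice at the cache-manager level could look like this (an assumption on my part, for Spring Data Redis 2.x with the default TTL):

@Bean
public RedisCacheManager cacheManager(RedisConnectionFactory connectionFactory) {
    // Store cache values as JSON with embedded type info, mirroring the
    // GenericJackson2JsonRedisSerializer used on the template above.
    RedisCacheConfiguration config = RedisCacheConfiguration.defaultCacheConfig()
            .serializeValuesWith(RedisSerializationContext.SerializationPair
                    .fromSerializer(new GenericJackson2JsonRedisSerializer()));
    return RedisCacheManager.builder(connectionFactory)
            .cacheDefaults(config)
            .build();
}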
I have the following auto-configuration class for Kafka:
@Configuration
@EnableKafka
@ConditionalOnClass(KafkaReceiver.class)
@ConditionalOnProperty({"spring.kafka.bootstrap-servers"})
public class KafkaAutoConfiguration<T> {

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    private KafkaProperties kafkaConfig;
    private String groupId;

    @Autowired
    public void setKafkaProperties(KafkaProperties properties) {
        this.kafkaConfig = properties;
    }

    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // Note: this second put overwrites the StringDeserializer set just above.
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                org.apache.kafka.common.serialization.ByteArrayDeserializer.class);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, kafkaConfig.getGroupId());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return props;
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs());
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaBatchListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        factory.setBatchListener(true);
        return factory;
    }

    @Bean
    public DefaultKafkaHeaderMapper headerMapper() {
        return new DefaultKafkaHeaderMapper();
    }

    @Bean("simpleReceiver")
    @ConditionalOnMissingBean(name = "simpleReceiver")
    @ConditionalOnProperty({"service.kafka.consumer.topics"})
    public KafkaReceiver simpleReceiver() {
        return new KafkaSimpleReceiver();
    }

    @Bean("batchReceiver")
    @ConditionalOnMissingBean(name = "batchReceiver")
    @ConditionalOnProperty({"service.kafka.consumer.batch-topics"})
    public KafkaReceiver batchReceiver() {
        return new KafkaBatchReceiver();
    }
}
Simple listener bean:
public class KafkaSimpleReceiver implements KafkaReceiver {

    @KafkaListener(topics = "#{'${service.kafka.consumer.topics}'.split(',')}", containerFactory = "kafkaListenerContainerFactory")
    public void receive(ConsumerRecord record, @Headers MessageHeaders headers) throws KafkaException {
    }
}
Batch listener bean:
public class KafkaBatchReceiver implements KafkaReceiver {

    @KafkaListener(topics = "#{'${service.kafka.consumer.batch-topics}'.split(',')}", containerFactory = "kafkaBatchListenerContainerFactory")
    public void receive(List<ConsumerRecord> records, @Headers MessageHeaders headers) throws KafkaException {
    }
}
The simple listener is working fine, but I am getting the following error for the batch listener. How can we access MessageHeaders in this case?
A parameter of type 'List<ConsumerRecord>' must be the only parameter (except for an optional 'Acknowledgment' and/or 'Consumer')
EDIT
This is what I did to convert the ConsumerRecord headers to MessageHeaders:
public void receive(List<ConsumerRecord> records) {
    for (ConsumerRecord record : records) {
        Map<String, Object> headersList = new HashMap<>();
        for (final Header h : record.headers()) {
            headersList.put(h.key(), new String(h.value()));
        }
        MessageHeaders headers = new MessageHeaders(headersList);
    }
}
You can get a list of Message<?>.
@KafkaListener(topics = "so54086076", id = "so54086076")
public void listen(List<Message<?>> records) {
    System.out.println(records.size() + ":" + records);
}
The message payloads will be the ConsumerRecord.value(); the other ConsumerRecord properties will be in the headers.
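For example, per-record metadata can then be read from each message's headers; a minimal sketch (the topic name and listener id are arbitrary):

@KafkaListener(topics = "so54086076", id = "so54086076headers")
public void listenWithHeaders(List<Message<?>> records) {
    for (Message<?> message : records) {
        // Standard Kafka metadata is exposed under the KafkaHeaders constants.
        Object topic = message.getHeaders().get(KafkaHeaders.RECEIVED_TOPIC);
        Object offset = message.getHeaders().get(KafkaHeaders.OFFSET);
        System.out.println(message.getPayload() + " from " + topic + "@" + offset);
    }
}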