Caffeine cache refresh / reload cache manually or on demand - caching

I have implemented Caffeine cache in my application. I am caching data from a few static tables, but I want to know whether I can refresh / clear / reload the cache manually or on demand, using a REST API or any other way.
Can anyone please suggest a way to implement such a requirement?
I want something like:
an endpoint URL like http://localhost:8080/refreshCache
that triggers some method internally and clears the cache or reloads new values into the cache on demand.
Below is the cache configuration:
@Configuration
public class CacheConfig {
private com.github.benmanes.caffeine.cache.Cache<Object, Object> cache;
@Bean
Caffeine<Object,Object> cacheBuilder(){
return Caffeine.newBuilder()
.initialCapacity(300)
.maximumSize(50000)
.expireAfterAccess(1, TimeUnit.DAYS)
.removalListener(new CacheRemovalListener())
.recordStats();
}
class CacheRemovalListener implements RemovalListener<Object, Object> {
@Override
public void onRemoval(Object key, Object value, RemovalCause cause) {
System.out.format("Removal listener called with key [%s], cause[%s], evicted [%s] %n",
key , cause.toString(), cause.wasEvicted());
}
}
}
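Note that the cacheBuilder bean above only defines a Caffeine specification; it is not itself registered with Spring's cache abstraction, so there is nothing yet for a refresh endpoint to clear. A minimal sketch of plugging it into a CaffeineCacheManager (the wiring here is illustrative; the answer below shows a fuller setup):
@Bean
public CacheManager cacheManager(Caffeine<Object, Object> cacheBuilder) {
    // expose the Caffeine spec through Spring's cache abstraction
    CaffeineCacheManager cacheManager = new CaffeineCacheManager();
    cacheManager.setCaffeine(cacheBuilder);
    return cacheManager;
}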

You can use Spring's CacheManager to create CaffeineCache instances and then perform CRUD operations on any cache through the CacheManager.
See the code below.
Bean Configuration:
@Configuration
public class CacheBeansConfig {
@Bean
public CacheManager cacheManager() {
// create multiple instances of cache
CaffeineCacheManager cacheManager = new CaffeineCacheManager("UserCache","InventoryCache");
cacheManager.setCaffeine(caffeineCacheBuilder());
return cacheManager;
}
private Caffeine<Object, Object> caffeineCacheBuilder() {
return Caffeine.newBuilder()
.initialCapacity(<initial capacity>)
.maximumSize(<max size>)
.expireAfterAccess(<expire after hrs>, TimeUnit.HOURS)
.recordStats();
}
}
This will initialize your CacheManager with two Caffeine cache instances.
Use the REST controller class below to access these caches.
@RestController
@RequestMapping(path = "/v1/admin/cache")
public class ACSCacheAdminController {
@Autowired
private CacheManager cacheManager;
/**
* call this to invalidate all cache instances
*/
@DeleteMapping(
path = "/",
produces = {"application/json"})
public void invalidateAll() {
Collection<String> cacheNames = cacheManager.getCacheNames();
cacheNames.forEach(this::getCacheAndClear);
}
/**
* call this to invalidate a given cache name
*/
@DeleteMapping(
path = "/{cacheName}",
produces = {"application/json"})
public void invalidateCache(@PathVariable("cacheName") final String cacheName) {
getCacheAndClear(cacheName);
}
/**
* Use this to refresh a cache instance
*/
@PostMapping(
path = "/{cacheName}",
produces = {"application/json"})
public void refreshCache(@PathVariable("cacheName") final String cacheName) {
getCacheAndClear(cacheName);
Cache cache = cacheManager.getCache(cacheName);
// your logic to put in above cache instance
// use cache.put(key,value)
}
/**
* call this to invalidate cache entry by given cache name and cache key
*/
@DeleteMapping(
path = "/{cacheName}/{key}/",
produces = {"application/json"})
public void invalidateCacheKey(
@PathVariable("cacheName") final String cacheName, @PathVariable("key") Object key) {
final Cache cache = cacheManager.getCache(cacheName);
if (cache == null) {
throw new IllegalArgumentException("invalid cache name for key invalidation: " + cacheName);
}
cache.evict(key);
}
@GetMapping(
path = "/{cacheName}/{key}",
produces = {"application/json"})
public ResponseEntity<Object> getByCacheNameAndKey(
@PathVariable("cacheName") final String cacheName, @PathVariable("key") final int key) {
final Cache cache = cacheManager.getCache(cacheName);
if (cache == null) {
throw new IllegalArgumentException("invalid cache name: " + cacheName);
}
return ResponseEntity.ok().body(cache.get(key));
}
private void getCacheAndClear(final String cacheName) {
final Cache cache = cacheManager.getCache(cacheName);
if (cache == null) {
throw new IllegalArgumentException("invalid cache name: " + cacheName);
}
cache.clear();
}
}
Just change the code as per your need :)
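For example, assuming the application runs on localhost:8080, the endpoints above could be exercised from any HTTP client; a quick client-side sketch using Spring's RestTemplate (cache names and the key are illustrative):
RestTemplate rest = new RestTemplate();
// clear one cache instance
rest.delete("http://localhost:8080/v1/admin/cache/UserCache");
// clear all cache instances
rest.delete("http://localhost:8080/v1/admin/cache/");
// evict a single entry by cache name and key
rest.delete("http://localhost:8080/v1/admin/cache/UserCache/42/");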

Related

Spring boot cache - manual invalidation and automatic reload

I have a question about caching in Spring Boot. I created my own cache service and cache manager. I will have to manually invalidate this cache and reload it when the user performs certain actions in the frontend.
I am now able to put elements into such a cache, but I don't know two things:
How can I put all elements from some collection into the cache without iterating over the collection? There is no putAll method on Cache.
It is possible to invalidate the cache using the clear method I implemented below, but what then? How would this cache reload - do I have to load the cache manually myself? Couldn't it reload itself automatically immediately after I invalidate it?
If none of the above is possible, what's the point of using a cache in my scenario, if a simple HashMap would do the same?
Cache manager
@Configuration
@EnableCaching
public class MyCache {
@Bean
public CacheManager cacheManager() {
return new SimpleCacheManager();
}
}
Cache service
@Component
@RequiredArgsConstructor
public class CachingService {
private final CacheManager cacheManager;
public void putToCache(String cacheName, String key, String value) {
Optional.ofNullable(cacheManager.getCache(cacheName)).ifPresentOrElse(c -> c.put(key, value),
() -> {
throw new IllegalArgumentException("Cache with name " + cacheName + " does not exist");
});
}
public void putAllToCache(String cacheName, Map<String, String> keyValueMap) {
Optional.ofNullable(cacheManager.getCache(cacheName)).ifPresentOrElse(c -> c.put(key, value),
//TODO do I have to iterate all keyValueMap entries ?? there is no putAll ?
() -> {
throw new IllegalArgumentException("Cache with name " + cacheName + " does not exist");
});
}
public String getFromCache(String cacheName, String key) {
return Optional.ofNullable(cacheManager.getCache(cacheName)).map(c -> c.get(key)).map(Object::toString)
.orElse(null);
}
public void evictSingleCacheValue(String cacheName, String cacheKey) {
cacheManager.getCache(cacheName).evict(cacheKey);
}
public void evictAllCacheValues(String cacheName) {
cacheManager.getCache(cacheName).clear();
}
public void evictAllCaches() {
cacheManager.getCacheNames()
.parallelStream()
.forEach(cacheName -> cacheManager.getCache(cacheName).clear());
}
}
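For reference (not part of the original question): Spring's org.springframework.cache.Cache interface indeed has no putAll, so the usual approach is to iterate the map yourself. A sketch of putAllToCache built on the service above:
public void putAllToCache(String cacheName, Map<String, String> keyValueMap) {
    Optional.ofNullable(cacheManager.getCache(cacheName)).ifPresentOrElse(
        // Cache has no putAll, so put the entries one by one
        c -> keyValueMap.forEach(c::put),
        () -> {
            throw new IllegalArgumentException("Cache with name " + cacheName + " does not exist");
        });
}
As for automatic reloading, the abstraction will not repopulate a cleared cache by itself; either reload it explicitly after clearing, or put a loading/refreshing cache implementation (for example Caffeine with refreshAfterWrite) behind the CacheManager.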

more than one 'primary' service instance suppliers found during load balancing (spring boot/cloud)

I'm currently updating from Spring Boot 2.2.x to 2.6.x plus legacy code; it's a big jump, so there were multiple changes. I'm now running into a problem with load balancing through an api-gateway. I'll apologize in advance for the wall of code to come. I will put the point of failure at the bottom.
When I send in an API request, I get the following error:
more than one 'primary' bean found among candidates: [zookeeperDiscoveryClientServiceInstanceListSupplier, serviceInstanceListSupplier, retryAwareDiscoveryClientServiceInstanceListSupplier]
It seems that the zookeeperDiscovery and retryAware suppliers are loaded through the default serviceInstanceListSupplier, which has @Primary on it and which I thought would take precedence over the other ones. I assume I must be doing something wrong due to changes in the newer version; here is the relevant code in question:
@Configuration
@LoadBalancerClients(defaultConfiguration = ClientConfiguration.class)
public class WebClientConfiguration {
@Bean
@Qualifier("microserviceWebClient")
@ConditionalOnMissingBean(name = "microserviceWebClient")
public WebClient microserviceWebClient(@Qualifier("microserviceWebClientBuilder") WebClient.Builder builder) {
return builder.build();
}
@Bean
@Qualifier("microserviceWebClientBuilder")
@ConditionalOnMissingBean(name = "microserviceWebClientBuilder")
@LoadBalanced
public WebClient.Builder microserviceWebClientBuilder() {
return WebClient.builder();
}
@Bean
@Primary
public ReactorLoadBalancerExchangeFilterFunction reactorLoadBalancerExchangeFilterFunction(
ReactiveLoadBalancer.Factory<ServiceInstance> loadBalancerFactory) {
//the transformer is currently null, there wasn't a transformer before the upgrade
return new CustomExchangeFilterFunction(loadBalancerFactory, transformer);
}
}
There are also some Feign-client-related configs here, which I will omit since they're not (or shouldn't be) playing a role in this problem:
public class ClientConfiguration {
/**
* The property key within the feign clients configuration context for the feign client name.
*/
public static final String FEIGN_CLIENT_NAME_PROPERTY = "feign.client.name";
public ClientConfiguration() {
}
//Creates a new BiPredicate for shouldClose. This will be used to determine if HTTP Connections should be automatically closed or not.
@Bean
@ConditionalOnMissingBean
public BiPredicate<Response, Type> shouldClose() {
return (Response response, Type type) -> {
if(type instanceof Class) {
Class<?> currentClass = (Class<?>) type;
return (null == AnnotationUtils.getAnnotation(currentClass, EnableResponseStream.class));
}
return true;
};
}
//Creates a Custom Decoder
@Bean
public Decoder createCustomDecoder(
ObjectFactory<HttpMessageConverters> converters, BiPredicate<Response, Type> shouldClose
) {
return new CustomDecoder(converters, shouldClose);
}
@Bean
@Qualifier("loadBalancerName")
public String loadBalancerName(PropertyResolver propertyResolver) {
String name = propertyResolver.getProperty(FEIGN_CLIENT_NAME_PROPERTY);
if(StringUtils.hasText(name)) {
// we are in a feign context
return name;
}
// we are in a LoadBalancerClientFactory context
name = propertyResolver.getProperty(LoadBalancerClientFactory.PROPERTY_NAME);
Assert.notNull(name, "Could not find a load balancer name within the configuration context!");
return name;
}
@Bean
public ReactorServiceInstanceLoadBalancer reactorServiceInstanceLoadBalancer(
BeanFactory beanFactory, #Qualifier("loadBalancerName") String loadBalancerName
) {
return new CustomRoundRobinLoadBalancer(
beanFactory.getBeanProvider(ServiceInstanceListSupplier.class),
loadBalancerName
);
}
@Bean
@Primary
public ServiceInstanceListSupplier serviceInstanceListSupplier(
@Qualifier("filter") Predicate<ServiceInstance> filter,
DiscoveryClient discoveryClient,
Environment environment,
@Qualifier("loadBalancerName") String loadBalancerName
) {
// add service name to environment if necessary
if(environment.getProperty(LoadBalancerClientFactory.PROPERTY_NAME) == null) {
StandardEnvironment wrapped = new StandardEnvironment();
if(environment instanceof ConfigurableEnvironment) {
((ConfigurableEnvironment) environment).getPropertySources()
.forEach(s -> wrapped.getPropertySources().addLast(s));
}
Map<String, Object> additionalProperties = new HashMap<>();
additionalProperties.put(LoadBalancerClientFactory.PROPERTY_NAME, loadBalancerName);
wrapped.getPropertySources().addLast(new MapPropertySource(loadBalancerName, additionalProperties));
environment = wrapped;
}
return new FilteringInstanceListSupplier(filter, discoveryClient, environment);
}
}
There was a change in the ExchangeFilter constructor, but as far as I can tell, it accepts that empty transformer; I don't know if it's supposed to:
public class CustomExchangeFilterFunction extends ReactorLoadBalancerExchangeFilterFunction {
private static final ThreadLocal<ClientRequest> REQUEST_HOLDER = new ThreadLocal<>();
//I think it's wrong but I don't know what to do here
private static List<LoadBalancerClientRequestTransformer> transformersList;
private final Factory<ServiceInstance> loadBalancerFactory;
public CustomExchangeFilterFunction (Factory<ServiceInstance> loadBalancerFactory) {
this(loadBalancerFactory);
///according to docs, but I don't know where and if I need to use this
@Bean
public LoadBalancerClientRequestTransformer transformer() {
return new LoadBalancerClientRequestTransformer() {
@Override
public ClientRequest transformRequest(ClientRequest request, ServiceInstance instance) {
return ClientRequest.from(request)
.header(instance.getInstanceId())
.build();
}
};
}
public CustomExchangeFilterFunction (Factory<ServiceInstance> loadBalancerFactory, List<LoadBalancerClientRequestTransformer> transformersList) {
super(loadBalancerFactory, transformersList); //the changed constructor
this.loadBalancerFactory = loadBalancerFactory;;
}
@Override
public Mono<ClientResponse> filter(ClientRequest request, ExchangeFunction next) {
// put the current request into the thread context - ugly, but couldn't find a better way to access the request within
// the choose method without reimplementing nearly everything
REQUEST_HOLDER.set(request);
try {
return super.filter(request, next);
} finally {
REQUEST_HOLDER.remove();
}
}
//used to be an override, but the function has changed
//code execution doesn't even get this far yet
protected Mono<Response<ServiceInstance>> choose(String serviceId) {
ReactiveLoadBalancer<ServiceInstance> loadBalancer = loadBalancerFactory.getInstance(serviceId);
if(loadBalancer == null) {
return Mono.just(new EmptyResponse());
}
ClientRequest request = REQUEST_HOLDER.get();
// this might be null, if the underlying implementation changed and this method is no longer executed in the same
// thread
// as the filter method
Assert.notNull(request, "request must not be null, underlying implementation seems to have changed");
return choose(loadBalancer, filter);
}
protected Mono<Response<ServiceInstance>> choose(
ReactiveLoadBalancer<ServiceInstance> loadBalancer,
Predicate<ServiceInstance> filter
) {
return Mono.from(loadBalancer.choose(new DefaultRequest<>(filter)));
}
}
There were pretty big changes in the CustomExchangeFilterFunction, but the current execution doesn't even get there. It fails here, in .getIfAvailable(...):
public class CustomRoundRobinLoadBalancer implements ReactorServiceInstanceLoadBalancer {
private static final int DEFAULT_SEED_POSITION = 1000;
private final ObjectProvider<ServiceInstanceListSupplier> serviceInstanceListSupplierProvider;
private final String serviceId;
private final int seedPosition;
private final AtomicInteger position;
private final Map<String, AtomicInteger> positionsForVersions = new HashMap<>();
public CustomRoundRobinLoadBalancer (
ObjectProvider<ServiceInstanceListSupplier> serviceInstanceListSupplierProvider,
String serviceId
) {
this(serviceInstanceListSupplierProvider, serviceId, new Random().nextInt(DEFAULT_SEED_POSITION));
}
public CustomRoundRobinLoadBalancer (
ObjectProvider<ServiceInstanceListSupplier> serviceInstanceListSupplierProvider,
String serviceId,
int seedPosition
) {
Assert.notNull(serviceInstanceListSupplierProvider, "serviceInstanceListSupplierProvider must not be null");
Assert.notNull(serviceId, "serviceId must not be null");
this.serviceInstanceListSupplierProvider = serviceInstanceListSupplierProvider;
this.serviceId = serviceId;
this.seedPosition = seedPosition;
this.position = new AtomicInteger(seedPosition);
}
@Override
// we have no choice but to use the raw type Request here, because this method overrides another one with this signature
public Mono<Response<ServiceInstance>> choose(@SuppressWarnings("rawtypes") Request request) {
//fails here!
ServiceInstanceListSupplier supplier = serviceInstanceListSupplierProvider
.getIfAvailable(NoopServiceInstanceListSupplier::new);
return supplier.get().next().map((List<ServiceInstance> instances) -> getInstanceResponse(instances, request));
}
}
Edit: after some deeper stacktracing, it seems that it does go into the CustomFilterFunction and invokes the constructor with super(loadBalancerFactory, transformer)
I found the problem, or at least a workaround. I was using @LoadBalancerClients because I thought it would just set the same config for all clients that way (even if I technically only have one atm). I changed it to @LoadBalancerClient and it suddenly worked. I don't quite understand why this made a difference, but it did!
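In other words, the workaround was to switch the class-level annotation from the plural to the singular form; a sketch of the change (the service name here is hypothetical):
// before - led to multiple 'primary' ServiceInstanceListSupplier candidates:
// @LoadBalancerClients(defaultConfiguration = ClientConfiguration.class)

// after - the workaround that resolved the error:
@Configuration
@LoadBalancerClient(name = "my-service", configuration = ClientConfiguration.class)
public class WebClientConfiguration {
    // ... beans as shown above
}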

Store a String in Spring Cache and evict

I want to perform the operations below with Spring Cache.
Check whether the passed String exists in the cache or not. If it exists, just return true; if not, add it to the cache:
checkInCache(String str)
Evict the String from the cache:
evict(String str)
I tried it like below:
@Component
public class FlightCache {
public static final Logger log = LoggerFactory.getLogger(FlightCache.class);
@Autowired
CacheManager cacheManager;
public boolean isFlightKeyPresent(final String flightKey) {
final ValueWrapper existingValue = cacheManager.getCache("flightCache").get(flightKey);
log.info("existingValueexistingValue " + existingValue);
if (existingValue == null) {
cacheManager.getCache("flightCache").put(flightKey, flightKey);
return false;
} else {
return true;
}
}
and added the @EnableCaching annotation on the configuration class.
ERROR:
required a bean of type 'org.springframework.cache.CacheManager' that could not be found. The injection point has the following annotations: - @org.springframework.beans.factory.annotation.Autowired(required=true)
Action: Consider defining a bean of type 'org.springframework.cache.CacheManager' in your configuration.
To check whether the cache contains a key, you can do this:
@Autowired
CacheManager cacheManager;
boolean isKeyPresent(Object key) {
return cacheManager.getCache("MyCacheName").get(key) != null;
}
To evict a key, you can do this:
@Autowired
CacheManager cacheManager;
boolean cacheEvict(Object key) {
return cacheManager.getCache("MyCacheName").evictIfPresent(key);
}
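As for the error itself, it means no CacheManager bean is defined in the context. A minimal sketch of a configuration that provides one (ConcurrentMapCacheManager is just the simplest in-memory choice; any CacheManager implementation works, and the cache name matches the "flightCache" used above):
@Configuration
@EnableCaching
public class CacheConfig {
    @Bean
    public CacheManager cacheManager() {
        // simple in-memory CacheManager backing the "flightCache" used in FlightCache
        return new ConcurrentMapCacheManager("flightCache");
    }
}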

Spring Cache Abstraction with Hazelcast Doesn't Evict Key From Cache

With the following configuration, my return object is cached, but when I try to evict a key manually it doesn't work.
@Configuration
@EnableCaching
public class HazelCastConfiguration {
@Bean
public HazelcastCacheManager hazelcastCacheManager() {
return new HazelcastCacheManager(Hazelcast.newHazelcastInstance(hazelcastConfig()));
}
@Bean
public Config hazelcastConfig() {
return new Config()
.setInstanceName("hazelcast-instance")
.addMapConfig(new MapConfig()
.setName("myCache")
.setMaxSizeConfig(new MaxSizeConfig())
.setEvictionPolicy(EvictionPolicy.LRU)
.setStatisticsEnabled(true)
.setTimeToLiveSeconds(-1));
}
}
Cached method:
@Override
@Cacheable(value = "myCache", unless = "#result == null", key = "{#someString, #someLong, #someInteger}")
public List<MyReturnObject> methodWithCachedResults (String someString, Long someLong, Integer someInteger) {
//my logic
}
A sample helper method:
public void evictKey(String aString, Long aLong, Integer anInteger) {
IMap<Object, Object> hazelcastCache = Hazelcast.getHazelcastInstanceByName("hazelcast-instance").getMap("myCache");
hazelcastCache.evict(Arrays.asList(aString, aLong, anInteger));
logger.info("{}", hazelcastCache.keySet());
}
When I trigger the method above, it logs the key even though I force the key to be evicted.
The result is the same when I try with the CacheManager:
@Autowired
private HazelcastCacheManager cacheManager;
public void evictKey(String aString, Long aLong, Integer anInteger) {
cacheManager.getCache("myCache").evict(Arrays.asList(aString, aLong, anInteger));
}
However, if I try this, it clears the whole cache, which is obviously what it is supposed to do:
public void evictKey(String aString, Long aLong, Integer anInteger) {
IMap<Object, Object> hazelcastCache = Hazelcast.getHazelcastInstanceByName("hazelcast-instance").getMap("myCache");
hazelcastCache.clear();
}
By the way, checking keySet().contains(Arrays.asList...) returns true.
It's far from obvious, but there are two implementations of List here.
@Cacheable will create an instance of java.util.ArrayList.
Arrays.asList will create an instance of java.util.Arrays.ArrayList.
This should make it clearer:
public void evictKey(String aString, Long aLong, Integer anInteger) {
IMap<Object, Object> hazelcastCache = Hazelcast.getHazelcastInstanceByName("hazelcast-instance").getMap("myCache");
java.util.List<Object> keyToEvict = Arrays.asList(aString, aLong, anInteger);
boolean success = hazelcastCache.evict(keyToEvict);
logger.info("Evicted {}, {} == {}", keyToEvict, keyToEvict.getClass(), success);
for (Object key : hazelcastCache.keySet()) {
logger.info("Remaining key {}, {}", key, key.getClass());
}
}
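An alternative worth noting (not part of the original answer): let Spring build the eviction key exactly the way it built the caching key, by using @CacheEvict with the same SpEL key expression. That sidesteps hand-constructing the List altogether; a sketch (the method must live on a Spring bean and be called through the proxy):
@CacheEvict(value = "myCache", key = "{#someString, #someLong, #someInteger}")
public void evictFromMyCache(String someString, Long someLong, Integer someInteger) {
    // intentionally empty - the annotation performs the eviction
}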

Multiple Caffeine LoadingCaches added to Spring CaffeineCacheManager

I'm looking to add several distinct LoadingCaches to a Spring CacheManager; however, I don't see how this is possible using CaffeineCacheManager. It appears that only a single loader is possible for refreshing content, but I need a separate loader for each cache. Is it possible to add multiple loading caches to a Spring cache manager? If so, how?
CaffeineCacheManager cacheManager = new CaffeineCacheManager();
LoadingCache<String, Optional<Edition>> loadingCache1 =
Caffeine.newBuilder()
.maximumSize(150)
.refreshAfterWrite(5, TimeUnit.MINUTES)
.build(test -> this.testRepo.find(test));
LoadingCache<String, Optional<Edition>> loadingCache2 =
Caffeine.newBuilder()
.maximumSize(150)
.refreshAfterWrite(5, TimeUnit.MINUTES)
.build(test2 -> this.testRepo.find2(test2));
// How do I add to cache manager, and specify a name?
Yes, it is possible. Since you need to fine-tune every cache, you are probably better off defining them yourself. Back to your example, the next step would be:
SimpleCacheManager cacheManager = new SimpleCacheManager();
cacheManager.setCaches(Arrays.asList(
new CaffeineCache("first", loadingCache1),
new CaffeineCache("second", loadingCache2)));
And then you can use that as usual, e.g.
@Cacheable("first")
public Foo load(String id) { ... }
If you are using Spring Boot, you can just expose the individual caches as beans (so org.springframework.cache.Cache implementations) and we'll detect them and create a SimpleCacheManager automatically for you.
Note that this strategy allows you to use the cache abstraction with different implementations. first could be a caffeine cache and second a cache from another provider.
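A sketch of that Spring Boot variant, exposing the caches themselves as beans (loadingCache1 and loadingCache2 stand for the caches built in the question, assuming they are declared with Object generics so they fit CaffeineCache's constructor):
@Bean
public CaffeineCache firstCache() {
    // detected by Spring Boot and wrapped in a SimpleCacheManager automatically
    return new CaffeineCache("first", loadingCache1);
}

@Bean
public CaffeineCache secondCache() {
    return new CaffeineCache("second", loadingCache2);
}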
Having this class will allow you to use @Cacheable("cacheA") wherever you want, as normal:
@EnableCaching
@Configuration
public class CacheConfiguration {
@Bean
public CacheManager cacheManager() {
CaffeineCacheManager manager = new CaffeineCacheManager();
manager.registerCustomCache("cacheA", defaultCache());
manager.registerCustomCache("cacheB", bigCache());
manager.registerCustomCache("cacheC", longCache());
// to avoid dynamic caches and be sure each name is assigned to a specific config (dynamic = false)
// throws error when tries to use a new cache
manager.setCacheNames(Collections.emptyList());
return manager;
}
private static Cache<Object, Object> defaultCache() {
return Caffeine.newBuilder()
.maximumSize(1000)
.expireAfterWrite(5, TimeUnit.MINUTES)
.build();
}
private static Cache<Object, Object> bigCache() {
return Caffeine.newBuilder()
.maximumSize(5000)
.expireAfterWrite(5, TimeUnit.MINUTES)
.build();
}
private static Cache<Object, Object> longCache() {
return Caffeine.newBuilder()
.maximumSize(1000)
.expireAfterWrite(1, TimeUnit.HOURS)
.build();
}
}
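For example, a hypothetical service method bound to one of the registered caches:
@Cacheable("cacheA")
public Product findProduct(String id) {
    // productClient and Product are hypothetical; this runs only on a cache miss,
    // after which the result is served from cacheA until it expires or is evicted
    return productClient.fetch(id);
}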
Thanks to @rado; this is an improved version of his answer. This way we can configure the caches from application properties directly:
cache:
  specs:
    big-cache:
      expire-after: WRITE
      timeout: 2h
      max-size: 1000
    long-cache:
      expire-after: ACCESS
      timeout: 30d
      max-size: 100
We need a cache properties class for this:
@Data
@EnableConfigurationProperties
@Configuration
@ConfigurationProperties(prefix = "cache")
public class CacheProperties {
private static final int DEFAULT_CACHE_SIZE = 100;
private Map<String, CacheSpec> specs = new HashMap<>();
@Data
public static class CacheSpec {
private Duration timeout;
private Integer maxSize = DEFAULT_CACHE_SIZE;
private ExpireAfter expireAfter = ExpireAfter.WRITE;
}
enum ExpireAfter { WRITE, ACCESS }
}
And then we can configure the caches directly from the external config file:
@EnableCaching
@Configuration
@RequiredArgsConstructor
public class CacheConfiguration {
private final CacheProperties cacheProperties;
@Bean
public CacheManager cacheManager() {
CaffeineCacheManager manager = new CaffeineCacheManager();
Map<String, CacheProperties.CacheSpec> specs = cacheProperties.getSpecs();
specs.keySet().forEach(cacheName -> {
CacheProperties.CacheSpec spec = specs.get(cacheName);
manager.registerCustomCache(cacheName, buildCache(spec));
});
// to avoid dynamic caches and be sure each name is assigned
// throws error when tries to use a new cache
manager.setCacheNames(Collections.emptyList());
return manager;
}
private Cache<Object, Object> buildCache(CacheProperties.CacheSpec cacheSpec) {
if (cacheSpec.getExpireAfter() == CacheProperties.ExpireAfter.ACCESS) {
return Caffeine.newBuilder()
.expireAfterAccess(cacheSpec.getTimeout())
.build();
}
return Caffeine.newBuilder()
.expireAfterWrite(cacheSpec.getTimeout())
.build();
}
}
Now you can use the caches by referring to the cache name:
@Cacheable(cacheNames = "big-cache", key = "{#key}", unless="#result == null")
public Object findByKeyFromBigCache(String key) {
// create the required object and return
}
@Cacheable(cacheNames = "long-cache", key = "{#key}", unless="#result == null")
public Object findByKeyFromLongCache(String key) {
// create the required object and return
}
