Spring custom annotation for caching

I've used Spring's declarative, annotation-based caching approach. Here is how I've used it:
@Cacheable(value = "users", key = "T(org.mifosplatform.infrastructure.core.service.ThreadLocalContextUtil).getTenant().getName().concat(#username)")
public UserDetails loadUserByUsername(final String username) throws UsernameNotFoundException, DataAccessException {
    // method body
}
This annotation looks very lengthy. I've tried to use a custom key generator, but the issue is that when I define a key in the annotation, the custom key generator does not get invoked.
So now I am trying to use a custom Spring annotation as a workaround, but I was unable to find a good reference to start from. Basically, I need to add a context-aware parameter (the tenant identifier) to the key.
Any help will be greatly appreciated.

The key has to be a static expression; it cannot carry this kind of runtime evaluation.
You need to override the CacheManager and do the modification there. Below is an example where I extend HazelcastCacheManager, which in turn implements Spring's CacheManager:
public class MyCache extends HazelcastCacheManager {

    private final ConcurrentMap<String, Cache> myCaches = new ConcurrentHashMap<String, Cache>();

    public MyCache() {
        super();
    }

    public MyCache(HazelcastInstance hazelcastInstance) {
        super(hazelcastInstance);
    }

    @Override
    public Cache getCache(String name) {
        // Prefix the cache name with the current tenant so each tenant gets its own map.
        String tenant = org.mifosplatform.infrastructure.core.service.ThreadLocalContextUtil.getTenant().getName();
        Cache cache = myCaches.get(tenant.concat("#").concat(name));
        if (cache == null) {
            IMap<Object, Object> map = getHazelcastInstance().getMap(tenant.concat("#").concat(name));
            cache = new HazelcastCache(map);
            Cache currentCache = myCaches.putIfAbsent(tenant.concat("#").concat(name), cache);
            if (currentCache != null) {
                cache = currentCache;
            }
        }
        return cache;
    }
}
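For the tenant-aware caches to actually be used by @Cacheable, the custom manager still has to be exposed as the cacheManager bean. A minimal sketch, assuming a HazelcastInstance bean is already configured (the config class name here is made up for illustration):

import com.hazelcast.core.HazelcastInstance;
import org.springframework.cache.CacheManager;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableCaching
public class TenantCacheConfig {

    @Bean
    public CacheManager cacheManager(HazelcastInstance hazelcastInstance) {
        // MyCache resolves a tenant-prefixed Hazelcast map per cache name at lookup time.
        return new MyCache(hazelcastInstance);
    }
}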

Related

Jackson @JsonFilter is not getting applied when used at field or method level

I am using Spring version 4.3.3 and Jackson version 2.8.3. I am trying to filter out specific fields from an entity bean based on some custom logic that is determined at runtime. The @JsonFilter seems ideal for this type of functionality. The problem is that when I put it at the field or method level, my custom filter never gets invoked. If I put it at the class level, it gets invoked just fine. I don't want to use it at the class level though, since then I would need to separately maintain the list of hardcoded field names that I want to apply the logic to. As of Jackson 2.3, the ability to put this annotation at the field level is supposed to exist.
Here is the most basic custom filter without any custom logic yet:
public class MyFilter extends SimpleBeanPropertyFilter {

    @Override
    protected boolean include(BeanPropertyWriter beanPropertyWriter) {
        return true;
    }

    @Override
    protected boolean include(PropertyWriter propertyWriter) {
        return true;
    }
}
Then I have the Jackson ObjectMapper configuration:
public class MyObjectMapper extends ObjectMapper {

    public MyObjectMapper() {
        SimpleFilterProvider filterProvider = new SimpleFilterProvider();
        filterProvider.addFilter("myFilter", new MyFilter());
        setFilterProvider(filterProvider);
    }
}
Then finally I have my entity bean:
@Entity
public class Project implements Serializable {

    private Long id;
    private Long version;

    @JsonFilter("myFilter")
    private String name;

    @JsonFilter("myFilter")
    private String description;

    // getters and setters
}
If I move the @JsonFilter annotation to the class level where @Entity is, the filter at least gets invoked, but when it is at the field level like in the example here, it never gets invoked.
I have the same need but after examining the unit tests I discovered that this is not the use-case covered by annotating a field.
Annotating a field invokes a filter on the value of the field, not on the instance containing the field. For example, imagine you have two classes, A and B, where A contains a field of type B.
class A {
    @JsonFilter("myFilter")
    B foo;
}
Jackson applies "myFilter" to the fields in B, not in A. Since your example contains fields of type String, which has no fields, Jackson never invokes your filter.
I have a need to exclude certain fields based on the caller's permissions. For example, an employee's profile may contain his taxpayer id, which is considered sensitive information and should only be serialized if the caller is a member of the Payrole department. Since I'm using Spring Security, I wish to integrate Jackson with the current security context.
public class EmployeeProfile {

    private String givenName;
    private String surname;
    private String emailAddress;

    @VisibleWhen("hasRole('PayroleSpecialist')")
    private String taxpayerId;
}
The most obvious way to do this is to use Jackson's filter mechanism, but it has a few limitations:
Jackson does not support nested filters so adding an access filter prohibits using filters for any other purpose.
One cannot add Jackson annotations to existing, third-party classes.
Jackson filters are not designed to be generic. The intent is to write a custom filter for each class to which you wish to apply filtering. For example, if you need to filter classes A and B, then you have to write an AFilter and a BFilter.
For my use-case, the solution is to use a custom annotation introspector in conjunction with a chaining filter.
public class VisibilityAnnotationIntrospector extends JacksonAnnotationIntrospector {

    private static final long serialVersionUID = 1L;

    @Override
    public Object findFilterId(Annotated a) {
        Object result = super.findFilterId(a);
        if (null != result) return result;

        // By always returning a value, we cause Jackson to query the filter provider.
        // A more sophisticated solution will introspect the annotated class and only
        // return a value if the class contains annotated properties.
        return a instanceof AnnotatedClass ? VisibilityFilterProvider.FILTER_ID : null;
    }
}
The next class is basically a copy of SimpleBeanPropertyFilter that replaces calls to include with calls to isVisible. I'll probably update this to use a Java 8 BiPredicate to make the solution more general, but it works for now.
This class also takes another filter as an argument and will delegate to it the final decision on whether to serialize the field if the field is visible.
public class AuthorizationFilter extends SimpleBeanPropertyFilter {

    private final PropertyFilter antecedent;

    public AuthorizationFilter() {
        this(null);
    }

    public AuthorizationFilter(final PropertyFilter filter) {
        this.antecedent = null != filter ? filter : serializeAll();
    }

    @Deprecated
    @Override
    public void serializeAsField(Object bean, JsonGenerator jgen, SerializerProvider provider, BeanPropertyWriter writer) throws Exception {
        if (isVisible(bean, writer)) {
            this.antecedent.serializeAsField(bean, jgen, provider, writer);
        } else if (!jgen.canOmitFields()) { // since 2.3
            writer.serializeAsOmittedField(bean, jgen, provider);
        }
    }

    @Override
    public void serializeAsField(Object pojo, JsonGenerator jgen, SerializerProvider provider, PropertyWriter writer) throws Exception {
        if (isVisible(pojo, writer)) {
            this.antecedent.serializeAsField(pojo, jgen, provider, writer);
        } else if (!jgen.canOmitFields()) { // since 2.3
            writer.serializeAsOmittedField(pojo, jgen, provider);
        }
    }

    @Override
    public void serializeAsElement(Object elementValue, JsonGenerator jgen, SerializerProvider provider, PropertyWriter writer) throws Exception {
        if (isVisible(elementValue, writer)) {
            this.antecedent.serializeAsElement(elementValue, jgen, provider, writer);
        }
    }

    private static boolean isVisible(Object pojo, PropertyWriter writer) {
        // Code to determine if the field should be serialized.
    }
}
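The body of isVisible is left out above. A minimal sketch of one possible implementation against the Spring Security context, assuming (for brevity) that the @VisibleWhen value is a plain authority name rather than the SpEL expression shown earlier, which would need Spring Security's expression machinery to evaluate:

private static boolean isVisible(Object pojo, PropertyWriter writer) {
    // Hypothetical contract: the annotation value names a required authority.
    VisibleWhen visibleWhen = writer.getAnnotation(VisibleWhen.class);
    if (visibleWhen == null) {
        return true; // un-annotated properties are always written
    }
    Authentication auth = SecurityContextHolder.getContext().getAuthentication();
    if (auth == null) {
        return false; // no authenticated caller, hide the property
    }
    for (GrantedAuthority granted : auth.getAuthorities()) {
        if (granted.getAuthority().equals(visibleWhen.value())) {
            return true;
        }
    }
    return false;
}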
I then add a custom filter provider to each instance of ObjectMapper.
@SuppressWarnings("deprecation")
public class VisibilityFilterProvider extends SimpleFilterProvider {

    private static final long serialVersionUID = 1L;

    static final String FILTER_ID = "dummy-filter-id";

    @Override
    public BeanPropertyFilter findFilter(Object filterId) {
        return super.findFilter(filterId);
    }

    @Override
    public PropertyFilter findPropertyFilter(Object filterId, Object valueToFilter) {
        if (FILTER_ID.equals(filterId)) {
            // This implies that the class did not have an explicit filter annotation.
            return new AuthorizationFilter(null);
        }
        // The class has an explicit filter annotation, so delegate to it.
        final PropertyFilter antecedent = super.findPropertyFilter(filterId, valueToFilter);
        return new AuthorizationFilter(antecedent);
    }
}
Finally, I have a Jackson module that automatically registers the custom annotation introspector so I don't have to add it to each ObjectMapper instance manually.
public class FieldVisibilityModule extends SimpleModule {

    private static final long serialVersionUID = 1L;

    public FieldVisibilityModule() {
        super(PackageVersion.VERSION);
    }

    @Override
    public void setupModule(Module.SetupContext context) {
        super.setupModule(context);

        // Append after other introspectors (instead of before) since
        // explicit annotations should have precedence.
        context.appendAnnotationIntrospector(new VisibilityAnnotationIntrospector());
    }
}
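For completeness, a minimal sketch of wiring the pieces above into an ObjectMapper (standard Jackson 2.x calls; the EmployeeProfile instance is just an example):

ObjectMapper mapper = new ObjectMapper();
// Registers the VisibilityAnnotationIntrospector behind the existing ones.
mapper.registerModule(new FieldVisibilityModule());
// Answers the synthetic FILTER_ID with an AuthorizationFilter (or wraps an explicit filter).
mapper.setFilterProvider(new VisibilityFilterProvider());

String json = mapper.writeValueAsString(new EmployeeProfile());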
There are more improvements that can be made and I still have more unit tests to write (e.g., handling arrays and collections) but this is the basic strategy I used.
You can try this approach for the same purpose:
@Entity
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
@DiscriminatorColumn(discriminatorType = DiscriminatorType.STRING, length = 2)
@Table(name = "project")
@JsonTypeInfo(use = Id.CLASS, include = As.PROPERTY, property = "@class")
@JsonSubTypes({
    @Type(value = BasicProject.class, name = "basicProject"),
    @Type(value = AdvanceProject.class, name = "advanceProject")
})
public abstract class Project {

    private Long id;
    private Long version;
}

@Entity
@DiscriminatorValue("AD")
public class AdvanceProject extends Project {

    private String name;
    private String description;
}

@Entity
@DiscriminatorValue("BS")
public class BasicProject extends Project {

    private String name;
}
I don't think you will make it work. I was trying, and these are the results of my investigation; maybe they will be helpful.
First of all, as @Faron noticed, the @JsonFilter annotation is applied to the type of the annotated value, not to the field itself.
Secondly, I see things this way. Let's imagine that somewhere in Jackson's internals you are able to get the actual field. You can figure out whether the annotation is present using the Java Reflection API. You can even get the filter name. Then you get to the filter and pass the field value there. But this happens at runtime; how will you get the corresponding JsonSerializer for the field type if you decide to serialize the field? It is impossible because of type erasure.
The only alternative I see is to forget about dynamic logic. Then you can do the following things:
1) Extend JacksonAnnotationIntrospector (almost the same as implementing AnnotationIntrospector, but without the useless default code), overriding the hasIgnoreMarker method (a small sketch follows this list). Take a look at this answer.
2) The crime starts here. It's kind of a weird way, taking your initial goal into account, but still: extend BeanSerializerModifier and filter out fields there. An example can be found here. This way you can define a serializer that actually doesn't serialize anything (again, I understand how strange it is, but maybe someone will find it helpful).
3) Similar to the approach above: define a no-op serializer based on BeanDescription by implementing ContextualSerializer's createContextual method. An example of this magic is here.
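A minimal sketch of option 1), assuming a hypothetical @Sensitive marker annotation; note that Jackson caches serializers per class, so this effectively makes the decision once per mapper rather than per request:

public class SensitiveFieldIntrospector extends JacksonAnnotationIntrospector {

    @Override
    public boolean hasIgnoreMarker(AnnotatedMember m) {
        // Hypothetical marker: any member carrying @Sensitive is dropped from (de)serialization.
        return m.hasAnnotation(Sensitive.class) || super.hasIgnoreMarker(m);
    }
}

// Registration:
ObjectMapper mapper = new ObjectMapper();
mapper.setAnnotationIntrospector(new SensitiveFieldIntrospector());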
Thanks to this really good blog, I was able to use @JsonView to filter out specific fields from an entity bean based on some custom logic that is determined at runtime.
Since @JsonFilter does not apply to the fields within a class, I found this to be a cleaner workaround.
Here is the sample code:
@Data
@AllArgsConstructor
public class TestEntity {

    private String a;

    @JsonView(CustomViews.SecureAccess.class)
    private Date b;

    @JsonView(CustomViews.SecureAccess.class)
    private Integer c;

    private List<String> d;
}
public class CustomViews {

    public static interface GeneralAccess {}
    public static interface SecureAccess {}

    public static class GeneralAccessClass implements GeneralAccess {}
    public static class SecureAccessClass implements SecureAccess, GeneralAccess {}

    public static Class getWriterView(final boolean hasSecureAccess) {
        return hasSecureAccess
                ? SecureAccessClass.class
                : GeneralAccessClass.class;
    }
}
@Test
public void test() throws JsonProcessingException {
    final boolean hasSecureAccess = false; // Custom logic resolved to a boolean value at runtime.
    final TestEntity testEntity = new TestEntity("1", new Date(), 2, ImmutableList.of("3", "4", "5"));
    final ObjectMapper objectMapper = new ObjectMapper().enable(MapperFeature.DEFAULT_VIEW_INCLUSION);

    final String serializedValue = objectMapper
            .writerWithView(CustomViews.getWriterView(hasSecureAccess))
            .writeValueAsString(testEntity);

    Assert.assertTrue(serializedValue.contains("a"));
    Assert.assertFalse(serializedValue.contains("b"));
    Assert.assertFalse(serializedValue.contains("c"));
    Assert.assertTrue(serializedValue.contains("d"));
}

Spring boot caching in @Service class does not work

I have problems with saving some values in a @Service method.
My code:
@Service(value = "SettingsService")
public class SettingsService {
    ...

    public String getGlobalSettingsValue(Settings setting) {
        getTotalEhCacheSize();
        if (!setting.getGlobal()) {
            throw new IllegalStateException(setting.name() + " is not global setting");
        }
        GlobalSettings globalSettings = globalSettingsRepository.findBySetting(setting);
        if (globalSettings != null)
            return globalSettings.getValue();
        else
            return getGlobalEnumValue(setting);
    }

    @Cacheable(value = "noTimeCache", key = "#setting.name()")
    public String getGlobalEnumValue(Settings setting) {
        return Settings.valueOf(setting.name()).getDefaultValue();
    }
}
My repository class:
@Repository
public interface GlobalSettingsRepository extends CrudRepository<GlobalSettings, Settings> {

    @Cacheable(value = "noTimeCache", key = "#setting.name()", unless = "#result == null")
    GlobalSettings findBySetting(Settings setting);
}
It should work like this:
get the value from the DB if the data exists,
if not, save the value from the enum.
But it doesn't save any data from the DB or the enum.
My cache config:
@Configuration
@EnableCaching
public class CacheConfig {

    @Bean
    public EhCacheCacheManager cacheManager(CacheManager cm) {
        return new EhCacheCacheManager(cm);
    }

    @Bean
    public EhCacheManagerFactoryBean ehcache() {
        EhCacheManagerFactoryBean ehCacheManagerFactoryBean = new EhCacheManagerFactoryBean();
        ehCacheManagerFactoryBean.setConfigLocation(new ClassPathResource("ehcache.xml"));
        return ehCacheManagerFactoryBean;
    }
}
I have an example that confirms the cache is working in my project, in a REST method:
@RequestMapping(value = "/system/status", method = RequestMethod.GET, produces = MediaType.APPLICATION_JSON_VALUE)
public ResponseEntity<?> systemStatus() {
    Object[] list = userPuzzleRepository.getAverageResponseByDateBetween(startDate, endDate);
    ...
}

public interface UserPuzzleRepository extends CrudRepository<UserPuzzle, Long> {

    @Cacheable(value = "averageTimeAnswer", key = "#startDate")
    @Query("select AVG(case when up.status='SUCCESS' OR up.status='FAILURE' OR up.status='TO_CHECK' then up.solvedTime else null end) from UserPuzzle up where up.solvedDate BETWEEN ?1 AND ?2")
    Object[] getAverageResponseByDateBetween(Timestamp startDate, Timestamp endDate);
}
and it works well.
What am I doing wrong?
You have two methods in your SettingsService, one that is cached (getGlobalEnumValue(...)) and another one that isn't cached, but calls the other method (getGlobalSettingsValue(...)).
The Spring cache abstraction, however, works by proxying your class (using Spring AOP). Calls to methods within the same class will not go through the proxy, but straight to the business logic beneath. This means caching does not work when you're calling cached methods from within the same bean.
So, if you're calling getGlobalSettingsValue(), it will not populate, nor use the cache when that method calls getGlobalEnumValue(...).
The possible solutions are:
Not calling another method in the same class when using proxies
Caching the other method as well
Using AspectJ rather than Spring AOP, which weaves the code directly into the byte code at compile time, rather than proxying the class. You can switch the mode by setting @EnableCaching(mode = AdviceMode.ASPECTJ); see the sketch after this list. However, you'll have to set up load-time weaving as well.
Autowire the service into your service, and use that service rather than calling the method directly. By autowiring the service, you inject the proxy into your service.
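For the AspectJ option, the switch itself is just configuration; a minimal sketch (load-time weaving and the spring-aspects dependency still have to be set up separately):

@Configuration
@EnableCaching(mode = AdviceMode.ASPECTJ)
@EnableLoadTimeWeaving
public class CacheConfig {
    // cache manager beans as before; self-invocations are now advised in the byte code itself
}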
The problem is the place you call your cacheable method from. When you call your @Cacheable method from the same class, you call it through the this reference, which means the call isn't wrapped by Spring's proxy, so Spring can't intercept the invocation to handle it.
One way to solve this problem is to @Autowired the service into itself and call the methods you expect Spring to handle through that reference:
@Service(value = "SettingsService")
public class SettingsService {
    //...

    @Autowired
    private SettingsService settingsService;

    //...

    public String getGlobalSettingsValue(Settings setting) {
        // ...
        return settingsService.getGlobalEnumValue(setting);
        //---------^ Look here
    }

    @Cacheable(value = "noTimeCache", key = "#setting.name()")
    public String getGlobalEnumValue(Settings setting) {
        return Settings.valueOf(setting.name()).getDefaultValue();
    }
}
But if you have such problems, it means your class takes on too much and doesn't comply with the "single class - single responsibility" principle. The better solution would be to move the @Cacheable method to a dedicated class.
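A minimal sketch of that refactoring, reusing the code from the question (the new class name is made up for illustration):

@Service
public class GlobalSettingsDefaults {

    @Cacheable(value = "noTimeCache", key = "#setting.name()")
    public String getGlobalEnumValue(Settings setting) {
        return Settings.valueOf(setting.name()).getDefaultValue();
    }
}

@Service(value = "SettingsService")
public class SettingsService {

    @Autowired
    private GlobalSettingsDefaults globalSettingsDefaults;

    public String getGlobalSettingsValue(Settings setting) {
        // ... as in the question
        return globalSettingsDefaults.getGlobalEnumValue(setting); // goes through the proxy
    }
}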

spring redis wrong result

I am using Spring Redis with the @Cacheable annotation on two methods. When I call one method, I get a result cached for the other method.
How can it happen that I get the result from the wrong cache when I configured a different cache for each method using the @Cacheable annotation?
Setup: Spring version 4.1.6, Spring Data Redis 1.5, and Redis client 2.7.0.
Example code:
#Cacheable("test1")
public List<String> findSgsns() {
}
#Cacheable("test2")
public List<String> findSgsns2() {
}
The problem was solved by adding the following setting to the Spring configuration (set usePrefix):
<bean id="cacheManager"
      class="org.springframework.data.redis.cache.RedisCacheManager"
      c:template-ref="redisTemplate">
    <property name="usePrefix" value="true" />
</bean>
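For reference, the equivalent Java configuration should look roughly like this, assuming Spring Data Redis 1.x, where RedisCacheManager still exposes setUsePrefix:

@Bean
public RedisCacheManager cacheManager(RedisTemplate<Object, Object> redisTemplate) {
    RedisCacheManager cacheManager = new RedisCacheManager(redisTemplate);
    // Prefix every key with the cache name so "test1" and "test2" entries no longer collide.
    cacheManager.setUsePrefix(true);
    return cacheManager;
}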
By default, Spring uses SimpleKeyGenerator to generate the key if you don't specify one in the @Cacheable annotation.
public class SimpleKeyGenerator implements KeyGenerator {

    @Override
    public Object generate(Object target, Method method, Object... params) {
        return generateKey(params);
    }

    /**
     * Generate a key based on the specified parameters.
     */
    public static Object generateKey(Object... params) {
        if (params.length == 0) {
            return SimpleKey.EMPTY;
        }
        if (params.length == 1) {
            Object param = params[0];
            if (param != null && !param.getClass().isArray()) {
                return param;
            }
        }
        return new SimpleKey(params);
    }
}
As there are no method arguments in either of your methods (findSgsns() and findSgsns2()), it will essentially generate the same cache key for both methods.
You've already found a solution that utilizes the usePrefix property on the RedisCacheManager bean, which adds the value (namely, "test1" or "test2") you specified in your @Cacheable annotation when it forms the cache key in Redis. I would like to mention two more alternatives for the sake of completeness:
Specify your own key for each method (note: the key attribute is a Spring EL expression, so a literal string needs to be quoted):
@Cacheable(value = "test1", key = "'key1'")
public List<String> findSgsns() {
    // ...
}

@Cacheable(value = "test2", key = "'key2'")
public List<String> findSgsns2() {
    // ...
}
Build a custom key generator; below is a sample key generator that takes the class and method names into the Redis cache key (note: the custom key generator takes effect automatically because the configuration class extends CachingConfigurerSupport):
@Configuration
public class RedisConfig extends CachingConfigurerSupport {

    @Bean
    public KeyGenerator keyGenerator() {
        return new KeyGenerator() {
            @Override
            public Object generate(Object target, Method method, Object... params) {
                StringBuilder sb = new StringBuilder();
                sb.append(target.getClass().getName());
                sb.append(method.getName());
                for (Object obj : params) {
                    sb.append(obj.toString());
                }
                return sb.toString();
            }
        };
    }
}

Spring force @Cacheable to use putIfAbsent instead of put

I've implemented Spring caching as below:
@Component
public class KPCacheExample {

    private static final Logger LOG = LoggerFactory.getLogger(KPCacheExample.class);

    @CachePut(value = "kpCache")
    public String saveCache(String userName, String password) {
        LOG.info("Called saveCache");
        return userName;
    }

    @Cacheable(value = "kpCache")
    public String getCache(String userName, String password) {
        LOG.info("Called getCache");
        return "kp";
    }
}
And the Java config file:
@Configuration
@ComponentScan(basePackages = {"com.kp"})
public class GuavaCacheConfiguration {

    @Bean
    public CacheManager cacheManager() {
        GuavaCacheManager guavaCacheManager = new GuavaCacheManager("kpCache");
        guavaCacheManager.setCacheBuilder(CacheBuilder.newBuilder()
                .expireAfterAccess(2000, TimeUnit.MILLISECONDS)
                .removalListener(new KPRemovalListener()));
        return guavaCacheManager;
    }
}
By default, Spring uses the put method of the Cache interface to update/put values in the cache. How can I force Spring to invoke the putIfAbsent method instead, such that I get a null value on a cache miss? In other words, the first request to the method with a unique username and password should return null, and subsequent requests with that username and password should return the username.
Well, looking through Spring's Cache Abstraction source, there does not appear to be a configuration setting (switch) to default the @CachePut to use the "atomic" putIfAbsent operation.
You might be able to simulate the "putIfAbsent" using the unless (or condition) attribute(s) of the @CachePut annotation, something like (based on the Guava impl)...
#CachePut(value="Users", key="#user.name" unless="#root.caches[0].getIfPresent(#user.name) != null")
public User save(User user){
return userRepo.save(user);
}
Also note, I did not test this expression, and it would not be "atomic" or portable using a different Cache impl. The expression ("#root.caches[0].get(#user.name) != null") may be more portable.
Giving up the "atomic" property may not be desirable, so you could also extend the (Guava)CacheManager to return a "custom" Cache (based on GuavaCache) that overrides the put operation to delegate to putIfAbsent instead...
class CustomGuavaCache extends GuavaCache {

    CustomGuavaCache(String name, com.google.common.cache.Cache<Object, Object> cache, boolean allowNullValues) {
        super(name, cache, allowNullValues);
    }

    @Override
    public void put(Object key, Object value) {
        putIfAbsent(key, value);
    }
}
See the GuavaCache class for more details.
Then...
class CustomGuavaCacheManager extends GuavaCacheManager {

    @Override
    protected Cache createGuavaCache(String name) {
        return new CustomGuavaCache(name, createNativeGuavaCache(name), isAllowNullValues());
    }
}
See GuavaCacheManager for further details, and specifically, have a look at line 93 and createGuavaCache(String name).
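To put it to use, the custom manager would replace the plain GuavaCacheManager in the configuration from the question; a rough sketch (removal listener omitted for brevity):

@Configuration
@ComponentScan(basePackages = {"com.kp"})
public class GuavaCacheConfiguration {

    @Bean
    public CacheManager cacheManager() {
        // Caches are created on demand by CustomGuavaCacheManager.createGuavaCache(name).
        CustomGuavaCacheManager cacheManager = new CustomGuavaCacheManager();
        cacheManager.setCacheBuilder(
                CacheBuilder.newBuilder().expireAfterAccess(2000, TimeUnit.MILLISECONDS));
        return cacheManager;
    }
}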
Hope this helps, or at least gives you some more ideas.

How to define Spring Data Repository scope to Prototype?

I'm using Spring Data JPA and Hibernate for data access, along with Spring Boot. All the repository beans are singletons by default. I want to set the scope of all my repositories to prototype. How can I do that?
@Repository
public interface CustomerRepository extends CrudRepository<Customer, Long> {

    List<Customer> findByLastName(String lastName);
}
Edit 1
The problem is related to a domain object being shared across two different transactions, which is causing my code to fail. I thought it was happening because the repository beans are singletons; that's the reason I asked the question. Here is a detailed explanation of the scenario.
I have two entities, User and UserSkill. User has a 1-to-many relationship with UserSkill, with lazy loading enabled on the UserSkill relation.
In a UserAggregationService, I first make a call to fetch an individual user skill with id 123, which belongs to the user with id 1.
public class UserAggregationService {

    public List<Object> getAggregatedResults() {
        resultList.add(userSkillService.getUserSkill(123));
        // Throws NullPointerException. See below for more details.
        resultList.add(userService.get(1));
    }
}
The implementation of the UserSkillService method looks like:
@Override
public UserSkillDTO getUserSkill(String id) {
    UserSkill userSkill = userSkillService.get(id);
    // Skills set to null to avoid recursive DTO mapping. Dozer mapper is used for mapping.
    userSkill.getUser().setSkills(null);
    UserSkillDTO result = mapper.map(userSkill, UserSkillDTO.class);
    return result;
}
In the user aggregation service, I then call UserService to fetch the user details. The UserService code looks like:
@Override
public UserDTO getById(String id) {
    User user = userService.getByGuid(id);
    List<UserSkillDTO> userSkillList = Lists.newArrayList();
    // user.getSkills() throws a NullPointerException.
    for (UserSkill uSkill : user.getSkills()) {
        // Code omitted
    }
    ....
    // code removed for conciseness
    return userDTO;
}
UserSkillService method implementation
public class UserSkillService {

    @Override
    @Transactional(propagation = Propagation.SUPPORTS)
    public UserSkill get(String guid) throws PostNotFoundException {
        UserSkill skill = userSkillRepository.findByGuid(guid);
        if (skill == null) {
            throw new SkillNotFoundException(guid);
        }
        return skill;
    }
}
UserService method implementation:
public class UserService {

    @Override
    @Transactional(readOnly = true)
    public User getByGuid(String guid) throws UserNotFoundException {
        User user = userRepo.findByGuid(guid);
        if (user == null) {
            throw new UserNotFoundException(guid);
        }
        return user;
    }
}
Spring Boot auto-configuration is used to instantiate the entity manager factory and transaction manager. In the configuration file, spring.jpa.* keys are used to connect to the database.
If I comment out the line of code below, then I do not get the exception. I am unable to understand why a change to the domain object affects the object fetched in a different transaction.
userSkill.getUser().setSkills(null);
Please suggest if I have missed something.
