I'm using Spring Boot with MyBatis, and I noticed that every time I fetch data with the same query, it connects to the database and runs the query without using a cache. So I searched for a solution for MyBatis with Redis, but could not find one. All the answers are for Spring with an XML configuration file, and I think it's better to use annotations. So, how do I configure Redis as a cache for MyBatis in Spring Boot?
Here is one of my attempts; it just doesn't work.
MybatisRedisCache:
public class MybatisRedisCache implements Cache {
private static Logger logger = Logger.getLogger(MybatisRedisCache.class);
private Jedis redisClient = createClient();
/** The ReadWriteLock. */
private final ReadWriteLock readWriteLock = new ReentrantReadWriteLock();
private String id;
public MybatisRedisCache(final String id) {
if (id == null) {
throw new IllegalArgumentException("Cache instances require an ID");
}
logger.debug("MybatisRedisCache:id=" + id);
this.id = id;
}
@Override
public String getId() {
return this.id;
}
@Override
public int getSize() {
return Integer.valueOf(redisClient.dbSize().toString());
}
@Override
public void putObject(Object key, Object value) {
logger.debug("putObject:" + key + "=" + value);
redisClient.set(SerializeUtil.serialize(key.toString()), SerializeUtil.serialize(value));
}
@Override
public Object getObject(Object key) {
Object value = SerializeUtil.unserialize(redisClient.get(SerializeUtil.serialize(key.toString())));
logger.debug("getObject:" + key + "=" + value);
return value;
}
@Override
public Object removeObject(Object key) {
return redisClient.expire(SerializeUtil.serialize(key.toString()), 0);
}
@Override
public void clear() {
redisClient.flushDB();
}
@Override
public ReadWriteLock getReadWriteLock() {
return readWriteLock;
}
protected static Jedis createClient() {
try {
JedisPool pool = new JedisPool(new JedisPoolConfig(), "127.0.0.1");
return pool.getResource();
} catch (Exception e) {
e.printStackTrace();
}
throw new RuntimeException("connect failed");
}
}
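SerializeUtil is referenced above but not shown; a minimal sketch of what it presumably does, using plain Java serialization (the class and method names come from the calls above, the body is an assumption):
import java.io.*;

public class SerializeUtil {
    // Hypothetical implementation: plain Java serialization to a byte array.
    public static byte[] serialize(Object object) {
        try (ByteArrayOutputStream baos = new ByteArrayOutputStream();
             ObjectOutputStream oos = new ObjectOutputStream(baos)) {
            oos.writeObject(object);
            oos.flush();
            return baos.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException("Serialization failed", e);
        }
    }
    // Returns null for null input so getObject() can report a cache miss.
    public static Object unserialize(byte[] bytes) {
        if (bytes == null) {
            return null;
        }
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException("Deserialization failed", e);
        }
    }
}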
And here is LoggingRedisCache
public class LoggingRedisCache extends LoggingCache {
public LoggingRedisCache(String id) {
super(new MybatisRedisCache(id));
}
}
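For reference, MyBatis can attach a custom second-level cache to a mapper namespace via @CacheNamespace; a minimal sketch of how the class above might be wired in (UserMapper, User, and the SQL are hypothetical placeholders, and this assumes the mybatis-spring-boot-starter with the second-level cache left enabled):
// Hypothetical mapper interface; the cache applies to this mapper's namespace.
@Mapper
@CacheNamespace(implementation = MybatisRedisCache.class)
public interface UserMapper {
    @Select("SELECT * FROM users WHERE id = #{id}")
    User findById(@Param("id") Long id);
}
With the starter, the second-level cache can be toggled with mybatis.configuration.cache-enabled in application.properties; it is on by default, but each mapper still needs a cache declared (here via @CacheNamespace) before results are shared across SqlSessions.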
Related
Trying to implement an Infinispan-based cache on Spring Boot using a custom annotation:
@Aspect
@Configuration
@Slf4j
public class CacheAnnotationAspect {
Logger logger = LoggerFactory.getLogger(CacheAnnotationAspect.class);
@Autowired
InfinispanCacheService cacheService;
@Around("@annotation(com.calsoft.lib.cache.CacheResult)")
public Object cacheResult(ProceedingJoinPoint joinPoint)throws Throwable{
logger.info("Cache Operation :: CacheResult annotation advice invoked...");
CacheResult cacheResult=(CacheResult) getAnnotation(joinPoint,CacheResult.class);
CacheConfig cacheConfig=CacheConfig.from(cacheResult);
Object resultFromCache=getFromCache(joinPoint,cacheConfig);
if(resultFromCache!= null){
return resultFromCache;
}
Object result=joinPoint.proceed(joinPoint.getArgs());
storeInCache(result,joinPoint,cacheConfig);
return result;
}
private void storeInCache(Object result, ProceedingJoinPoint joinPoint, CacheConfig cacheConfig) {
if(result==null){
log.info("Cache op :: null values not cached");
return;
}
CacheService cacheService=getCacheService();
if(cacheService==null){
logger.info("Cache op :: CacheGet Failed : No CacheService available for use..");
}
DefaultCacheKey defaultCacheKey=getKey(joinPoint,cacheConfig);
String cacheName=getCacheName(cacheConfig.getCacheName(),joinPoint);
long lifeSpan=cacheConfig.getLifespan();
if(lifeSpan== CommonConstant.CACHE_DEFAULT_LIFE){
cacheService.put(cacheName,defaultCacheKey,result);
}else{
cacheService.put(cacheName,defaultCacheKey,result,lifeSpan,cacheConfig.getUnit());
}
logger.info("Cache Op :: Result cached :: {} ",cacheConfig);
}
private DefaultCacheKey getKey(ProceedingJoinPoint joinPoint, CacheConfig cacheConfig) {
List<Object> keys=new ArrayList<>();
Object target=joinPoint.getTarget();
MethodSignature methodSignature=MethodSignature.class.cast(joinPoint.getSignature());
Method method=methodSignature.getMethod();
Annotation[][] parameterAnnotations=method.getParameterAnnotations();
if(isEmpty(trim(cacheConfig.getKeyPrefix()))){
keys.add(target.getClass().getName());
keys.add(method.getName());
}else{
keys.add(cacheConfig.getKeyPrefix());
}
if(isCacheKeySpecified(parameterAnnotations)){
keys.addAll(getCacheKeys(joinPoint,parameterAnnotations));
}else{
keys.addAll(Arrays.asList(joinPoint.getArgs()));
}
return new DefaultCacheKey(keys.toArray());
}
private Collection<?> getCacheKeys(ProceedingJoinPoint joinPoint, Annotation[][] parameterAnnotations) {
Object[] args=joinPoint.getArgs();
List<Object> result=new ArrayList<>();
int i=0;
for(Annotation[] annotations: parameterAnnotations){
for(Annotation annotation: annotations){
if(annotation instanceof CacheKey){
result.add(args[i]);
break;
}
}
i++;
}
return result;
}
private boolean isCacheKeySpecified(Annotation[][] parameterAnnotations) {
for(Annotation[] annotations:parameterAnnotations){
for(Annotation annotation:annotations){
if(annotation instanceof CacheKey) {
return true;
}
}
}
return false;
}
private Object getFromCache(ProceedingJoinPoint joinPoint, CacheConfig cacheConfig) {
CacheService cacheService = getCacheService();
if (cacheService == null) {
logger.info("Cache op :: CacheGet Failed : No CacheService available for use..");
}
String cacheName=getCacheName(cacheConfig.getCacheName(),joinPoint);
DefaultCacheKey defaultCacheKey=getKey(joinPoint,cacheConfig);
return cacheService.get(cacheName,defaultCacheKey);
}
private String getCacheName(String cacheName, ProceedingJoinPoint joinPoint) {
boolean nameNotDefined=isEmpty(trim(cacheName));
if(nameNotDefined){
logger.error("Cache op :: Cache Name not defined");
}else{
CacheService cacheService=getCacheService();
if(!cacheService.cacheExists(cacheName)){
throw new RuntimeException("Cache with the name "+ cacheName+" does not exists");
}
}
return cacheName;
}
private CacheService getCacheService() {
return cacheService;
}
private Annotation getAnnotation(ProceedingJoinPoint joinPoint, Class type) {
MethodSignature methodSignature=MethodSignature.class.cast(joinPoint.getSignature());
Method method=methodSignature.getMethod();
return method.getAnnotation(type);
}
}
The class << CacheAnnotationAspect >> above is the aspect implementation for the custom @CacheResult annotation: it first tries to retrieve the result from the cache and, if nothing is found, makes the actual DAO call and then stores the result in the cache.
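The custom @CacheResult annotation itself is not shown; judging from CacheConfig.from(...) and the controller usage further down, it presumably looks roughly like this (a sketch, not the original source; the default values are assumptions):
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.concurrent.TimeUnit;

@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface CacheResult {
    String cacheName();
    // Assumption: CommonConstant.CACHE_DEFAULT_LIFE is a constant sentinel meaning "use the cache's own expiration".
    long lifespan() default CommonConstant.CACHE_DEFAULT_LIFE;
    TimeUnit unit() default TimeUnit.SECONDS;
    String keyPrefix() default "";
}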
Below is the implementation of InfinispanCacheService, which invokes the cache manager to get/put cache entries.
@Service
public class InfinispanCacheService implements CacheService {
Logger logger = LoggerFactory.getLogger(InfinispanCacheService.class);
@Autowired
private DefaultCacheManagerWrapper cacheManagerWrapper;
private DefaultCacheManager infiniCacheManager;
private DefaultCacheManager initializeCacheManager(){
if(infiniCacheManager==null){
infiniCacheManager=cacheManagerWrapper.getCacheManager();
}
return infiniCacheManager;
}
@PostConstruct
public void start(){
logger.info("Initializing...InifinispanCacheService ....");
initializeCacheManager();
for(String cacheName : infiniCacheManager.getCacheNames()){
infiniCacheManager.startCache(cacheName);
}
}
@Override
public Object get(String cacheName, Object key) {
return getCache(cacheName).get(key);
}
@Override
public void put(String cacheName, Object key, Object value, long lifespan, TimeUnit unit) {
Cache cache=getCache(cacheName);
cache.put(key,value,lifespan,unit);
}
@Override
public void put(String cacheName, Object key, Object value) {
Cache cache=getCache(cacheName);
cache.put(key,value);
}
private Cache<Object,Object> getCache(String cacheName) {
Cache<Object,Object> cache;
if(isEmpty(trim(cacheName))){
cache=infiniCacheManager.getCache();
}else{
cache=infiniCacheManager.getCache(cacheName,false);
}
return cache;
}
@Override
public boolean cacheExists(String cacheName) {
return infiniCacheManager.cacheExists(cacheName);
}
}
<< The DefaultCacheManagerWrapper below initializes the DefaultCacheManager during startup by loading the infinispan.xml configuration >>
@Component
public class DefaultCacheManagerWrapper {
Logger logger = LoggerFactory.getLogger(DefaultCacheManagerWrapper.class);
// @Value("${classpath:spring.cache.infinispan.config}")
private String fileName="file:\\calsoft\\devlabs\\ecom2\\ecom-svc-admin\\src\\main\\resources\\infinispan.xml";
private DefaultCacheManager infiniCacheManager;
@PostConstruct
public void start(){
logger.info(" Received File Name :: {} ",fileName);
try{
URL fileUrl=new URL(fileName);
URLConnection urlConnection=fileUrl.openConnection();
InputStream inputStream=urlConnection.getInputStream();
infiniCacheManager=new DefaultCacheManager(inputStream);
infiniCacheManager.start();
logger.info("Cache Manager Initialized....");
}catch(MalformedURLException mue){
logger.error("Error creating file url {}", mue.getMessage());
} catch (IOException e) {
logger.error("Error creating file url {}", e.getMessage());
}
}
public void stop() { infiniCacheManager.stop();}
public DefaultCacheManager getCacheManager(){
return infiniCacheManager;
}
}
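The commented-out @Value suggests the file name was meant to be configurable; as a side note, the same DefaultCacheManager(InputStream) constructor used above can read infinispan.xml straight from the classpath, which avoids the hard-coded absolute path (a sketch under that assumption):
// Sketch: load infinispan.xml from the classpath instead of an absolute file URL.
try (InputStream inputStream = getClass().getClassLoader().getResourceAsStream("infinispan.xml")) {
    infiniCacheManager = new DefaultCacheManager(inputStream);
    infiniCacheManager.start();
    logger.info("Cache Manager Initialized....");
} catch (IOException e) {
    logger.error("Error loading infinispan.xml from the classpath", e);
}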
<< infinispan.xml configuration >>
<?xml version="1.0" encoding="UTF-8"?>
<infinispan xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="
urn:infinispan:config:7.2
http://www.infinispan.org/schemas/infinispan-config-7.2.xsd"
xmlns="urn:infinispan:config:7.2">
<cache-container default-cache="attributeset-cache">
<!-- template configurations -->
<local-cache-configuration name="local-template">
<expiration interval="10000" lifespan="50000" max-idle="50000"/>
</local-cache-configuration>
<!-- cache definitions -->
<local-cache name="attributeset-cache" configuration="local-template"/>
</cache-container>
</infinispan>
Annotation at controller level:
@CacheResult(cacheName= CommonConstant.ATTRIBUTE_SET_CACHE,lifespan=10,unit = TimeUnit.MINUTES)
@GetMapping("/eavattributeset")
public List<EavAttributeSet> fetchAllAttributes() {
return eavAttributeService.fetchAllEavattributesets();
}
<< EavAttributeService >>
@Service
public class EavAttributeService {
Logger logger = LoggerFactory.getLogger(EavAttributeService.class);
@Autowired
private EavAttributeJpaRepository eavAttributeJpaRepository;
@Autowired
EavAttributeSetJpaRepository eavAttributeSetJpaRepository;
public List<EavAttributeSet> fetchAllEavattributesets() {
return eavAttributeSetJpaRepository.findAll();
}
}
<< CacheConfig >>
@Data
@Slf4j
@AllArgsConstructor
@NoArgsConstructor
public class CacheConfig {
private String cacheName;
private long lifespan;
private TimeUnit unit;
private String keyPrefix;
public static CacheConfig from(CacheResult cacheResult) {
return new CacheConfig(cacheResult.cacheName(), cacheResult.lifespan(), cacheResult.unit(), cacheResult.keyPrefix());
}
}
Issue: the data is not getting cached. Wherever the @CacheResult annotation is used, the CacheAnnotationAspect is invoked and the lookup in the cache also happens, but when it tries to store the data in the cache nothing is cached, and every subsequent call of the method does not return any data.
It works fine when I try with the below configuration in infinispan.xml.
<expiration lifespan="50000"/>
Can you try with the above config and see if it is using the cached data?
I guess it could be the max-idle timeout (10 milliseconds) that is the issue.
I am running a Spring Boot application with a PostConstruct method to populate a POJO before application initialization. This is to ensure that the database isn't hit by multiple requests to get the POJO content after it starts running.
I'm able to pull the data from an Oracle database through a Hibernate query and store it in my POJO. The problem arises when I try to access the stored data. The dataset contains a list of objects that contain strings and numbers. Just trying to print the description of the object at the top of the list throws a ClassCastException. How should I mitigate this issue?
@Autowired
private TaskDescrBean taskBean;
@PostConstruct
public void loadDescriptions() {
TaskDataLoader taskData = new TaskDataLoader(taskBean.acquireDataSourceParams());
List<TaskDescription> taskList = taskData.getTaskDescription();
taskBean.setTaskDescriptionList(taskList);
System.out.println("Task description size: " + taskBean.getTaskDescriptionList().get(0).getTaskDescription());
}
My POJO class:
@Component
public class TaskDescrBean implements ApplicationContextAware {
@Resource
private Environment environment;
private ApplicationContext applicationContext;
protected List<TaskDescription> taskDescriptionList;
public Properties acquireDataSourceParams() {
Properties dataSource = new Properties();
dataSource.setProperty("hibernate.connection.driver_class", environment.getProperty("spring.datasource.driver-class-name"));
dataSource.setProperty("hibernate.connection.url", environment.getProperty("spring.datasource.url"));
dataSource.setProperty("hibernate.connection.username", environment.getProperty("spring.datasource.username"));
dataSource.setProperty("hibernate.connection.password", environment.getProperty("spring.datasource.password"));
return dataSource;
}
public List<TaskDescription> getTaskDescriptionList() {
return taskDescriptionList;
}
public void setTaskDescriptionList(List<TaskDescription> taskDescriptionList) {
this.taskDescriptionList = taskDescriptionList;
}
public ApplicationContext getApplicationContext() {
return applicationContext;
}
public void setApplicationContext(ApplicationContext applicationContext) {
this.applicationContext = applicationContext;
}
}
My DAO class:
public class TaskDataLoader {
private Session session;
private SessionFactory sessionFactory;
public TaskDataLoader(Properties connectionProperties) {
Configuration config = new Configuration().setProperties(connectionProperties);
config.addAnnotatedClass(TaskDescription.class);
sessionFactory = config.buildSessionFactory();
}
@SuppressWarnings("unchecked")
public List<TaskDescription> getTaskDescription() {
List<TaskDescription> taskList = null;
session = sessionFactory.openSession();
try {
String description = "from TaskDescription des";
Query taskDescriptionQuery = session.createQuery(description);
taskList = taskDescriptionQuery.list();
System.out.println("Task description fetched. " + taskList.getClass());
} catch (Exception e) {
e.printStackTrace();
} finally {
session.close();
}
return taskList;
}
}
TaskDescription Entity:
@Entity
@Table(name="TASK_DESCRIPTION")
@JsonIgnoreProperties
public class TaskDescription implements Serializable {
private static final long serialVersionUID = 1L;
@Id
@Column(name="TASK_DESCRIPTION_ID")
private Long taskDescriptionId;
@Column(name="TASK_DESCRIPTION")
private String taskDescription;
public Long getTaskDescriptionId() {
return taskDescriptionId;
}
public void setTaskDescriptionId(Long taskDescriptionId) {
this.taskDescriptionId = taskDescriptionId;
}
public String getTaskDescription() {
return taskDescription;
}
public void setTaskDescription(String taskDescription) {
this.taskDescription = taskDescription;
}
}
StackTrace
Instead of sending the List in the return statement, I transformed it into a JSON object and sent its String representation, which I mapped back to the object using mapper.readValue().
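For completeness, a minimal sketch of that JSON round trip with Jackson (the helper class is illustrative; only mapper.readValue() is mentioned in the original):
import java.util.List;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

public class TaskDescriptionJson {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Serialize the fetched list to its JSON string representation.
    public static String toJson(List<TaskDescription> taskList) throws Exception {
        return MAPPER.writeValueAsString(taskList);
    }

    // Map the string back to a typed list when the data is accessed.
    public static List<TaskDescription> fromJson(String json) throws Exception {
        return MAPPER.readValue(json, new TypeReference<List<TaskDescription>>() {});
    }
}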
I have a BeanDefinitionRegistryPostProcessor class that registers beans dynamically. Sometimes the beans being registered carry the Spring Cloud annotation @RefreshScope.
However, when the cloud configuration Environment changes, such beans are not refreshed. Upon debugging, the appropriate application events are triggered; however, the dynamic beans don't get re-instantiated. I need some help with this. Below is my code:
TestDynaProps:
public class TestDynaProps {
private String prop;
private String value;
public String getProp() {
return prop;
}
public void setProp(String prop) {
this.prop = prop;
}
public String getValue() {
return value;
}
public void setValue(String value) {
this.value = value;
}
@Override
public String toString() {
StringBuilder builder = new StringBuilder();
builder.append("TestDynaProps [prop=").append(prop).append(", value=").append(value).append("]");
return builder.toString();
}
}
TestDynaPropConsumer:
@RefreshScope
public class TestDynaPropConsumer {
private TestDynaProps props;
public void setProps(TestDynaProps props) {
this.props = props;
}
@PostConstruct
public void init() {
System.out.println("Init props : " + props);
}
public String getVal() {
return props.getValue();
}
}
BeanDefinitionRegistryPostProcessor:
public class PropertyBasedDynamicBeanDefinitionRegistrar implements BeanDefinitionRegistryPostProcessor, EnvironmentAware {
private ConfigurableEnvironment environment;
private final Class<?> propertyConfigurationClass;
private final String propertyBeanNamePrefix;
private final String propertyKeysPropertyName;
private Class<?> propertyConsumerBean;
private String consumerBeanNamePrefix;
private List<String> dynaBeans;
public PropertyBasedDynamicBeanDefinitionRegistrar(Class<?> propertyConfigurationClass,
String propertyBeanNamePrefix, String propertyKeysPropertyName) {
this.propertyConfigurationClass = propertyConfigurationClass;
this.propertyBeanNamePrefix = propertyBeanNamePrefix;
this.propertyKeysPropertyName = propertyKeysPropertyName;
dynaBeans = new ArrayList<>();
}
public void setPropertyConsumerBean(Class<?> propertyConsumerBean, String consumerBeanNamePrefix) {
this.propertyConsumerBean = propertyConsumerBean;
this.consumerBeanNamePrefix = consumerBeanNamePrefix;
}
@Override
public void setEnvironment(Environment environment) {
this.environment = (ConfigurableEnvironment) environment;
}
@Override
public void postProcessBeanFactory(ConfigurableListableBeanFactory arg0) throws BeansException {
}
@Override
public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry beanDefRegistry) throws BeansException {
if (environment == null) {
throw new BeanCreationException("Environment must be set to initialize dyna bean");
}
String[] keys = getPropertyKeys();
Map<String, String> propertyKeyBeanNameMapping = new HashMap<>();
for (String k : keys) {
String trimmedKey = k.trim();
String propBeanName = getPropertyBeanName(trimmedKey);
registerPropertyBean(beanDefRegistry, trimmedKey, propBeanName);
propertyKeyBeanNameMapping.put(trimmedKey, propBeanName);
}
if (propertyConsumerBean != null) {
String beanPropertyFieldName = getConsumerBeanPropertyVariable();
for (Map.Entry<String, String> prop : propertyKeyBeanNameMapping.entrySet()) {
registerConsumerBean(beanDefRegistry, prop.getKey(), prop.getValue(), beanPropertyFieldName);
}
}
}
private void registerConsumerBean(BeanDefinitionRegistry beanDefRegistry, String trimmedKey, String propBeanName, String beanPropertyFieldName) {
String consumerBeanName = getConsumerBeanName(trimmedKey);
AbstractBeanDefinition consumerDefinition = preparePropertyConsumerBeanDefinition(propBeanName, beanPropertyFieldName);
beanDefRegistry.registerBeanDefinition(consumerBeanName, consumerDefinition);
dynaBeans.add(consumerBeanName);
}
private void registerPropertyBean(BeanDefinitionRegistry beanDefRegistry, String trimmedKey, String propBeanName) {
AbstractBeanDefinition propertyBeanDefinition = preparePropertyBeanDefinition(trimmedKey);
beanDefRegistry.registerBeanDefinition(propBeanName, propertyBeanDefinition);
dynaBeans.add(propBeanName);
}
private String getConsumerBeanPropertyVariable() throws IllegalArgumentException {
Field[] beanFields = propertyConsumerBean.getDeclaredFields();
for (Field bField : beanFields) {
if (bField.getType().equals(propertyConfigurationClass)) {
return bField.getName();
}
}
throw new BeanCreationException(String.format("Could not find property of type %s in bean class %s",
propertyConfigurationClass.getName(), propertyConsumerBean.getName()));
}
private AbstractBeanDefinition preparePropertyBeanDefinition(String trimmedKey) {
BeanDefinitionBuilder bdb = BeanDefinitionBuilder.genericBeanDefinition(PropertiesConfigurationFactory.class);
bdb.addConstructorArgValue(propertyConfigurationClass);
bdb.addPropertyValue("propertySources", environment.getPropertySources());
bdb.addPropertyValue("conversionService", environment.getConversionService());
bdb.addPropertyValue("targetName", trimmedKey);
return bdb.getBeanDefinition();
}
private AbstractBeanDefinition preparePropertyConsumerBeanDefinition(String propBeanName, String beanPropertyFieldName) {
BeanDefinitionBuilder bdb = BeanDefinitionBuilder.genericBeanDefinition(propertyConsumerBean);
bdb.addPropertyReference(beanPropertyFieldName, propBeanName);
return bdb.getBeanDefinition();
}
private String getPropertyBeanName(String trimmedKey) {
return propertyBeanNamePrefix + trimmedKey.substring(0, 1).toUpperCase() + trimmedKey.substring(1);
}
private String getConsumerBeanName(String trimmedKey) {
return consumerBeanNamePrefix + trimmedKey.substring(0, 1).toUpperCase() + trimmedKey.substring(1);
}
private String[] getPropertyKeys() {
String keysProp = environment.getProperty(propertyKeysPropertyName);
return keysProp.split(",");
}
}
The Config class:
@Configuration
public class DynaPropsConfig {
@Bean
public PropertyBasedDynamicBeanDefinitionRegistrar dynaRegistrar() {
PropertyBasedDynamicBeanDefinitionRegistrar registrar = new PropertyBasedDynamicBeanDefinitionRegistrar(TestDynaProps.class, "testDynaProp", "dyna.props");
registrar.setPropertyConsumerBean(TestDynaPropConsumer.class, "testDynaPropsConsumer");
return registrar;
}
}
Application.java
@SpringBootApplication
@EnableDiscoveryClient
@EnableScheduling
public class Application extends SpringBootServletInitializer {
private static Class<Application> applicationClass = Application.class;
public static void main(String[] args) {
SpringApplication sa = new SpringApplication(applicationClass);
sa.run(args);
}
}
And, my bootstrap.properties:
spring.cloud.consul.enabled=true
spring.cloud.consul.config.enabled=true
spring.cloud.consul.config.format=PROPERTIES
spring.cloud.consul.config.watch.delay=15000
spring.cloud.discovery.client.health-indicator.enabled=false
spring.cloud.discovery.client.composite-indicator.enabled=false
application.properties
dyna.props=d1,d2
d1.prop=d1prop
d1.value=d1value
d2.prop=d2prop
d2.value=d2value
Here are some guesses:
1) Perhaps the @RefreshScope metadata is not being carried over into the metadata of your bean definition. Call setScope()? (See the sketch below.)
2) RefreshScope is actually implemented by https://github.com/spring-cloud/spring-cloud-commons/blob/master/spring-cloud-context/src/main/java/org/springframework/cloud/context/scope/refresh/RefreshScope.java, which itself implements BeanDefinitionRegistryPostProcessor. Perhaps the ordering of these two post processors is the issue.
Just guesses.
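For guess (1), setting the scope explicitly on the dynamically registered consumer definition would look roughly like this (a sketch against the preparePropertyConsumerBeanDefinition method above; "refresh" is the scope name registered by Spring Cloud's RefreshScope):
private AbstractBeanDefinition preparePropertyConsumerBeanDefinition(String propBeanName, String beanPropertyFieldName) {
    BeanDefinitionBuilder bdb = BeanDefinitionBuilder.genericBeanDefinition(propertyConsumerBean);
    bdb.addPropertyReference(beanPropertyFieldName, propBeanName);
    AbstractBeanDefinition definition = bdb.getBeanDefinition();
    // Assumption: put the definition into the "refresh" scope explicitly, since the
    // @RefreshScope annotation on the class is not picked up for programmatic registrations.
    definition.setScope("refresh");
    return definition;
}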
We finally resolved this by adding the @RefreshScope annotation to the proposed dynamic bean classes using ByteBuddy and then adding them to the Spring context using the bean definition post processor.
The post processor is added to spring.factories so that it loads before any other beans that depend on the dynamic beans.
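A rough sketch of that approach with ByteBuddy (the helper class and method are illustrative, not the actual project code):
import net.bytebuddy.ByteBuddy;
import net.bytebuddy.description.annotation.AnnotationDescription;
import org.springframework.cloud.context.config.annotation.RefreshScope;

public class RefreshScopeTypeFactory {
    // Builds a subclass of the given consumer class that carries @RefreshScope, so the
    // dynamically registered bean definition can point at the annotated type instead.
    public static Class<?> withRefreshScope(Class<?> consumerClass) {
        return new ByteBuddy()
                .subclass(consumerClass)
                .annotateType(AnnotationDescription.Builder.ofType(RefreshScope.class).build())
                .make()
                .load(consumerClass.getClassLoader())
                .getLoaded();
    }
}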
I am developing a small CQRS implementation and I am very new to it.
I want to segregate the handlers (command and event) from the aggregate and make sure everything is working well. The command handlers are triggered from the controller, but the event handlers are not triggered. Could anyone please help with this?
public class User extends AbstractAnnotatedAggregateRoot<String> {
/**
*
*/
private static final long serialVersionUID = 1L;
@AggregateIdentifier
private String userId;
private String userName;
private String age;
public User() {
}
public User(String userid) {
this.userId=userid;
}
@Override
public String getIdentifier() {
return this.userId;
}
public void createuserEvent(UserCommand command){
apply(new UserEvent(command.getUserId()));
}
@EventSourcingHandler
public void applyAccountCreation(UserEvent event) {
this.userId = event.getUserId();
}
}
public class UserCommand {
private final String userId;
public UserCommand(String userid) {
this.userId = userid;
}
public String getUserId() {
return userId;
}
}
@Component
public class UserCommandHandler {
@CommandHandler
public void userCreateCommand(UserCommand command) {
User user = new User(command.getUserId());
user.createuserEvent(command);
}
}
public class UserEvent {
private final String userId;
public UserEvent(String userid) {
this.userId = userid;
}
public String getUserId() {
return userId;
}
}
@Component
public class UserEventHandler {
@EventHandler
public void createUser(UserEvent userEvent) {
System.out.println("Event triggered");
}
}
@Configuration
@AnnotationDriven
public class AppConfiguration {
@Bean
public SimpleCommandBus commandBus() {
SimpleCommandBus simpleCommandBus = new SimpleCommandBus();
return simpleCommandBus;
}
@Bean
public Cluster normalCluster() {
SimpleCluster simpleCluster = new SimpleCluster("simpleCluster");
return simpleCluster;
}
@Bean
public ClusterSelector clusterSelector() {
Map<String, Cluster> clusterMap = new HashMap<>();
clusterMap.put("com.user.event.handler", normalCluster());
//clusterMap.put("exploringaxon.replay", replayCluster());
return new ClassNamePrefixClusterSelector(clusterMap);
}
@Bean
public EventBus clusteringEventBus() {
ClusteringEventBus clusteringEventBus = new ClusteringEventBus(clusterSelector(), terminal());
return clusteringEventBus;
}
@Bean
public EventBusTerminal terminal() {
return new EventBusTerminal() {
@Override
public void publish(EventMessage... events) {
normalCluster().publish(events);
}
@Override
public void onClusterCreated(Cluster cluster) {
}
};
}
@Bean
public DefaultCommandGateway commandGateway() {
return new DefaultCommandGateway(commandBus());
}
@Bean
public Repository<User> eventSourcingRepository() {
EventStore eventStore = new FileSystemEventStore(new SimpleEventFileResolver(new File("D://sevents.txt")));
EventSourcingRepository eventSourcingRepository = new EventSourcingRepository(User.class, eventStore);
eventSourcingRepository.setEventBus(clusteringEventBus());
AnnotationEventListenerAdapter.subscribe(new UserEventHandler(), clusteringEventBus());
return eventSourcingRepository;
}
}
As far as I can tell, the only thing missing is that you aren't adding the User aggregate to a Repository. By adding it to the repository, the User is persisted (either by storing the generated events, in the case of event sourcing, or its state otherwise) and all events generated while handling the command (including those applied by the aggregate) are published to the event bus.
Note that the Aggregate's #EventSourcingHandlers are invoked immediately, but any external #EventHandlers are only invoked after the command handler has been executed.
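In Axon 2 terms, the command handler would get the Repository injected and add the newly created aggregate to it, roughly like this (a sketch assuming the eventSourcingRepository bean defined above is the one wired in):
@Component
public class UserCommandHandler {
    @Autowired
    private Repository<User> repository;

    @CommandHandler
    public void userCreateCommand(UserCommand command) {
        User user = new User(command.getUserId());
        user.createuserEvent(command);
        // Adding the aggregate to the repository is what persists it and publishes
        // the applied events to the event bus.
        repository.add(user);
    }
}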
I'm trying to use the Spring Cache abstraction mechanism with Guice modules.
I've created interceptors:
CacheManager cacheManager = createCacheManager();
bind(CacheManager.class).toInstance(cacheManager);
AppCacheInterceptor interceptor = new AppCacheInterceptor(
cacheManager,
createCacheOperationSource()
);
bindInterceptor(
Matchers.any(),
Matchers.annotatedWith(Cacheable.class),
interceptor
);
bindInterceptor(
Matchers.any(),
Matchers.annotatedWith(CacheEvict.class),
interceptor
);
Then I implemented Spring's Cache interface and a CacheManager, and finally annotated my DAO classes with @Cacheable and @CacheEvict:
public class DaoTester {
QssandraConsumer qs;
@CachePut(value = "cached_consumers", key = "#consumer.id")
public void save(QssandraConsumer consumer) {
qs = consumer;
}
@Cacheable(value = "cached_consumers")
public QssandraConsumer get(String id) {
if (id != null) {
qs.getId();
}
return qs;
}
@CacheEvict(value = "cached_consumers", key = "#consumer.id")
public void remove(QssandraConsumer consumer) {
qs = consumer;
}
}
Caching works just fine - no problems there - but when I try to evict (calling the remove method in this example), everything crashes and I see:
Exception in thread "main" org.springframework.expression.spel.SpelEvaluationException: EL1007E:(pos 10): Field or property 'id' cannot be found on null
at org.springframework.expression.spel.ast.PropertyOrFieldReference.readProperty(PropertyOrFieldReference.java:205)
at org.springframework.expression.spel.ast.PropertyOrFieldReference.getValueInternal(PropertyOrFieldReference.java:72)
at org.springframework.expression.spel.ast.CompoundExpression.getValueInternal(CompoundExpression.java:57)
at org.springframework.expression.spel.ast.SpelNodeImpl.getValue(SpelNodeImpl.java:93)
at org.springframework.expression.spel.standard.SpelExpression.getValue(SpelExpression.java:88)
at org.springframework.cache.interceptor.ExpressionEvaluator.key(ExpressionEvaluator.java:80)
at org.springframework.cache.interceptor.CacheAspectSupport$CacheOperationContext.generateKey(CacheAspectSupport.java:464)
at org.springframework.cache.interceptor.CacheAspectSupport.inspectCacheEvicts(CacheAspectSupport.java:260)
at org.springframework.cache.interceptor.CacheAspectSupport.inspectAfterCacheEvicts(CacheAspectSupport.java:232)
at org.springframework.cache.interceptor.CacheAspectSupport.execute(CacheAspectSupport.java:215)
at org.springframework.cache.interceptor.CacheInterceptor.invoke(CacheInterceptor.java:66)
at qiwi.qommon.deployment.dao.DaoTester.main(DaoTester.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
What's wrong here?!
BTW, cached object is:
public class QssandraConsumer implements Identifiable<String> {
private String id;
private String host;
@Override
public String getId() {
return id;
}
@Override
public void setId(String id) {
this.id = id;
}
public String getHost() {
return host;
}
public void setHost(String host) {
this.host = host;
}
@Override
public boolean equals(Object object) {
if (this == object) {
return true;
}
if (null == object) {
return false;
}
if (!(object instanceof QssandraConsumer)) {
return false;
}
QssandraConsumer o = (QssandraConsumer) object;
return
Objects.equal(id, o.id)
&& Objects.equal(host, o.host);
}
@Override
public int hashCode() {
return Objects.hashCode(
id, host
);
}
@Override
public String toString() {
return Objects.toStringHelper(this)
.addValue(id)
.addValue(host)
.toString();
}
}
Finally I figured out the reason for the problem:
When injecting a class that uses annotations which are intercepted, like @Cacheable or @CacheEvict, Guice enhances the class (its AOP support makes bytecode modifications at runtime). So when the CacheInterceptor tried to evaluate key = "#consumer.id", it failed because it couldn't find the argument name in the enhanced class (see LocalVariableTableParameterNameDiscoverer#inspectClass).
So it will not work in Guice out of the box.
In Spring a proxy class is created instead, so there are no problems there.
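To illustrate the difference: the SpEL key refers to the parameter by name, which is exactly the parameter-name discovery that breaks on the Guice-enhanced class; referring to the argument by position avoids the name lookup (a workaround sketch, not from the original post, and whether it helps depends on the Spring version in use):
public class DaoTesterWithPositionalKeys {
    private QssandraConsumer qs;

    // Position-based SpEL key: #p0 (or #a0) refers to the first argument by index,
    // so no parameter-name discovery is needed on the enhanced class.
    @CacheEvict(value = "cached_consumers", key = "#p0.id")
    public void remove(QssandraConsumer consumer) {
        qs = consumer;
    }
}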