I am starting to learn the Spring Cache abstraction.
I am using Spring Boot, Spring Data JPA, and EhCache as the provider for this purpose.
My ehcache.xml:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE ehcache>
<ehcache>
    <diskStore path="java.io.tmpdir"/>
    <defaultCache maxElementsInMemory="100"
                  eternal="false"
                  timeToIdleSeconds="120"
                  timeToLiveSeconds="120"
                  overflowToDisk="true"/>
    <cache name="teams"
           maxElementsInMemory="500"
           eternal="true"
           timeToIdleSeconds="0"
           timeToLiveSeconds="100"
           overflowToDisk="false"/>
</ehcache>
My service:
@CacheConfig(cacheNames = "teams")
@Service
public class TeamService {

    @Autowired
    private TeamRepository teamRepository;

    @Cacheable
    public Team findById(long id) {
        return teamRepository.findById(id).get();
    }

    @Cacheable
    public List<Team> findAll() {
        return teamRepository.findAll();
    }

    @CachePut
    public Team save(Team team) {
        return teamRepository.save(team);
    }

    @CacheEvict
    public void delete(long id) {
        teamRepository.deleteById(id);
    }
}
My controller:
@RestController
public class TeamController {

    @Autowired
    private TeamService teamService;

    @GetMapping("/teams")
    public List<Team> getAll() {
        return teamService.findAll();
    }

    @GetMapping("/team/{id}")
    public Team getById(@PathVariable long id) {
        return teamService.findById(id);
    }

    @DeleteMapping("/team/{id}")
    public void delete(@PathVariable long id) {
        teamService.delete(id);
    }

    @PostMapping("/team")
    public Team save(@RequestBody Team team) {
        return teamService.save(team);
    }
}
I am performing requests against my controller. When I call the controller's getAll() method, the data is cached correctly, and subsequent calls do not execute a query against the database. Then I update and delete data using the corresponding controller methods, whose service methods are annotated with @CachePut and @CacheEvict respectively and should refresh the cache. But when I call getAll() again, I get the same response as the first time, whereas I want it to reflect the delete and update requests.
What am I doing wrong, and how can I get the desired result?
When you put the @Cacheable annotation on a method, its entries are kept in a cache under keys generated from the method arguments, so the entry cached by the first @Cacheable method is different from the entry cached by the second one; a put or evict keyed by id will never touch the cached list. If you want this to work properly, you need to give each annotation an explicit cache name and evict all entries, for example:
@Cacheable("teams")
@Cacheable("teams")
@CachePut("teams")
@CacheEvict(value = "teams", allEntries = true)
You can find more information at this link: https://www.baeldung.com/spring-cache-tutorial
Perhaps a better solution would be this: use one cache for single teams and another for the list, and update or evict both:
@Cacheable("team")   // on findById
@Cacheable("teams")  // on findAll
@Caching(put = {
    @CachePut(value = "team"),
    @CachePut(value = "teams") })
@Caching(evict = {
    @CacheEvict(value = "team", allEntries = true),
    @CacheEvict(value = "teams", allEntries = true) })
I'm working with Spring Boot 2.4.5, MyBatis 3.5.6 and Java 8. When trying to return a String from a @RestController, an obscure error shows up in the returned HttpErrorResponse.
The method tries to obtain the value via MyBatis, i.e., a method in a DAO object that acts as a @Mapper.
My controller method:
@RestController
@RequestMapping("/api/myAPI")
public class MyController {

    @Resource
    private MyService service;

    @GetMapping(value = "myString")
    public String getBillingCompany() {
        return this.service.getDAO().getMyString();
    }
}
My DAO:
@Repository
@Mapper
public interface MyDAO {
    String getMyString();
}
...and the MyBatis mapper:
<mapper namespace="com.package.MyDAO">
<select id="getMyString" resultType="String">
SELECT 'My desired result' FROM A_TABLE
</select>
...
</mapper>
The HttpErrorResponse:
HttpErrorResponse: {
"headers": {
"normalizedNames": {},
"lazyUpdate": null
},
"status": 200,
"statusText": "OK",
"url": "http://localhost:4200/api/myAPI/myString",
"ok": false,
"name": "HttpErrorResponse",
"message": "Http failure during parsing for http://localhost:4200/api/myAPI/myString",
"error": {
"error": { SyntaxError: Unexpected number in JSON at position 2
at JSON.parse (<anonymous>)
at XMLHttpRequest.onLoad (http://localhost:4200/vendor.js:18158:51)
at ZoneDelegate.push../node_modules/zone.js/dist/zone.js.ZoneDelegate.invokeTask (http://localhost:4200/polyfills.js:21266:35)
at Object.onInvokeTask (http://localhost:4200/vendor.js:74037:33)
at ZoneDelegate.push../node_modules/zone.js/dist/zone.js.ZoneDelegate.invokeTask (http://localhost:4200/polyfills.js:21265:40)
at Zone.push../node_modules/zone.js/dist/zone.js.Zone.runTask (http://localhost:4200/polyfills.js:21033:51)
at ZoneTask.push../node_modules/zone.js/dist/zone.js.ZoneTask.invokeTask [as invoke] (http://localhost:4200/polyfills.js:21348:38)
at invokeTask (http://localhost:4200/polyfills.js:22516:18)
at XMLHttpRequest.globalZoneAwareCallback (http://localhost:4200/polyfills.js:22553:25)
},
"text": "My desired result"
}
}
Nonetheless, if I ask the controller and the DAO methods to return an int, it all works flawlessly.
Due to this, I suspected that the issue has to do with non-primitive types "namespacing", so I've tried to set a typeAlias in the MyBatis configuration, to no avail:
<?xml version="1.0" encoding="UTF-8" ?><!DOCTYPE configuration
PUBLIC "-//mybatis.org//DTD Config 3.0//EN"
"http://mybatis.org/dtd/mybatis-3-config.dtd">
<configuration>
<typeAliases>
<typeAlias type="java.lang.String" alias="String"/>
</typeAliases>
</configuration>
Anyway, I'm under the impression that both MyBatis and Spring should already be smart enough to know what a String is. I've successfully returned collections of objects (Maps and Lists) and POJOs in the past.
Any ideas on what I'm lacking or not seeing? Thanks in advance.
Edit:
The only thing that has worked for me so far is similar to what @emeraldjava proposed. I've built a wrapper upon an existing one in a dependency, and I fetch the data from it in my front end:
@RestController
@RequestMapping("/api/myAPI")
public class MyController {

    @Resource
    private MyService service;

    @GetMapping(value = "myString")
    public Result<String> getBillingCompany() {
        return new Result<>(this.service.getDAO().getMyString());
    }
}
public class Result<T> extends ServiceResult {
public Result(T data) {
this.setData(data);
}
}
The already existing wrapper in a dependency:
public class ServiceResult {
private Object data;
...
public void setData(Object data) {
this.data = data;
}
public Object getData() {
return this.data;
}
}
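With the wrapper in place, the response body is serialized as a JSON object rather than a bare string, which is presumably why the front end can then parse it. Assuming default Jackson serialization, the payload would look something like:

{
  "data": "My desired result"
}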
I'd suggest you update your controller method to return a ResponseEntity which wraps the string.
@RestController
@RequestMapping("/api/myAPI")
public class MyController {

    @Resource
    private MyService service;

    @GetMapping(value = "myString")
    public ResponseEntity<String> getBillingCompany() {
        return new ResponseEntity<>(this.service.getDAO().getMyString(), HttpStatus.OK);
    }
}
I think the problem is that the table you're querying has multiple rows, so the result of your query will be a list of Strings containing "My desired result" in each element, where the size of the list equals the number of rows in the table. To force a single result, just change the query to:
<select id="getMyString" resultType="String">
SELECT MAX('My desired result') FROM A_TABLE
</select>
I am trying to implement ehcache to load static data (from a table) during application startup. However, when I make another call for the data, it goes to the database (I can see the SQL running on the console) instead of taking the values from ehcache.
My code is below.
ehcache.xml:
<?xml version="1.0" encoding="UTF-8"?>
<ehcache xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="ehcache.xsd"
updateCheck="true"
monitoring="autodetect"
dynamicConfig="true">
<diskStore path="java.io.tmpdir" />
<cache name="ObjectList"
maxEntriesLocalHeap="10000"
maxEntriesLocalDisk="1000"
eternal="false"
diskSpoolBufferSizeMB="20"
timeToIdleSeconds="2000000" timeToLiveSeconds="900000000000"
memoryStoreEvictionPolicy="LFU"
transactionalMode="off">
<persistence strategy="localTempSwap" />
</cache>
</ehcache>
my repository class is:
public interface customRepository extends JpaRepository<Object, Long> {

    @Cacheable(value = "ObjectList", cacheManager = "abclCacheManager")
    public Object findById(Long id);

    @Cacheable(value = "ObjectList", cacheManager = "abclCacheManager")
    public List<Object> findAll();
}
and my cacheInitialiser class is:
@Configuration
@EnableCaching
@ComponentScan("com.abcl.process")
public class EhCacheConfiguration {

    @Bean("abclCacheManager")
    public CacheManager cacheManager() {
        return new EhCacheCacheManager(ehCacheCacheManager().getObject());
    }

    @Bean
    public EhCacheManagerFactoryBean ehCacheCacheManager() {
        EhCacheManagerFactoryBean cmfb = new EhCacheManagerFactoryBean();
        cmfb.setConfigLocation(new ClassPathResource("ehcache.xml"));
        cmfb.setShared(true);
        cmfb.setCacheManagerName("abclCacheManager");
        return cmfb;
    }
}
I am testing this using the class below:
public class testCache {

    @Autowired
    private customRepository repo;

    public void doSomething() {
        List<Object> listObject = repo.findAll();
        listObject.size();
    }

    public void getOne() {
        Object o = repo.findById(1L);
    }
}
I can see a DB hit in the findAll call; however, I thought the results would be stored in the cache, so that the second call, findById, would not hit the database. Yet I see a DB hit on the second call as well.
Can anyone please suggest what I am missing here?
When you cache the result of findAll, it creates a single entry in the cache that maps the key generated by Spring caching (your method has no parameters) to the List<Object>. It does not put into the cache one mapping per list element between the id and the Object.
So when you use findById(Long), Spring caching will look for a cache entry mapped to that id, and since it cannot find one, it will hit the database.
There is no way to have Spring caching put one mapping per collection element. If that is really what you need, you will have to code it yourself instead of relying on the @Cacheable annotation.
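For illustration, here is a minimal sketch of what that hand-rolled caching could look like, assuming Spring's CacheManager can be injected and a concrete entity with a getId() accessor (the repository and entity names are illustrative, not from the question):

@Service
public class PreloadingService {

    @Autowired
    private MyEntityRepository repo; // hypothetical repository for a concrete entity

    @Autowired
    private CacheManager cacheManager; // e.g. the "abclCacheManager" bean from the question

    // Load everything once, then register one cache entry per element,
    // keyed by id, so that later lookups by id can be served from the cache.
    public List<MyEntity> findAllAndCacheEach() {
        List<MyEntity> all = repo.findAll();
        Cache cache = cacheManager.getCache("ObjectList");
        for (MyEntity e : all) {
            cache.put(e.getId(), e);
        }
        return all;
    }
}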
It appears that updates done through MongoOperations do not trigger the events in AbstractMongoEventListener.
This post indicates that this was at least the case in Nov 2014.
Is there currently any way to listen to update events like the one below? This seems to be quite a big omission, if that is the case.
MongoTemplate.updateMulti()
Thanks!
This is no oversight. Events are designed around the lifecycle of a domain object or a document at least, which means they usually contain an instance of the domain object you're interested in.
Updates, on the other hand, are completely handled in the database, so there are no documents or even domain objects handled in MongoTemplate. Consider this basically the same as the way JPA @EntityListeners are only triggered for entities loaded into the persistence context in the first place, but not when a query is executed, since the execution of the query happens in the database.
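For illustration, a lifecycle listener like the following minimal sketch (Person is a hypothetical domain class) fires for operations that pass a domain object through the template, such as save, but not for updateMulti:

public class PersonEventListener extends AbstractMongoEventListener<Person> {

    // Called after a Person document has been saved via the template.
    @Override
    public void onAfterSave(AfterSaveEvent<Person> event) {
        System.out.println("saved: " + event.getSource());
    }
}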
I know it's too late to answer this question, but I had the same situation with the MongoTemplate.findAndModify method, and the reason I needed events was auditing. Here is what I did.
1. Event publisher (which is, of course, the MongoTemplate method):
public class CustomMongoTemplate extends MongoTemplate {

    private ApplicationEventPublisher applicationEventPublisher;

    @Autowired
    public void setApplicationEventPublisher(ApplicationEventPublisher applicationEventPublisher) {
        this.applicationEventPublisher = applicationEventPublisher;
    }

    // Default constructor here

    @Override
    public <T> T findAndModify(Query query, Update update, Class<T> entityClass) {
        T result = super.findAndModify(query, update, entityClass);
        // Publish a custom event on findAndModify
        if (result != null && result instanceof Parent) { // all of my domain classes extend Parent
            this.applicationEventPublisher.publishEvent(
                    new AfterFindAndModify(this, ((Parent) result).getId(), result.getClass().toString()));
        }
        return result;
    }
}
2. Application event:
public class AfterFindAndModify extends ApplicationEvent {

    private DocumentAuditLog documentAuditLog;

    public AfterFindAndModify(Object source, String documentId, String documentObject) {
        super(source);
        this.documentAuditLog = new DocumentAuditLog(documentId, documentObject, new Date(), "UPDATE");
    }

    public DocumentAuditLog getDocumentAuditLog() {
        return documentAuditLog;
    }
}
3. Application listener:
public class FindandUpdateMongoEventListner implements ApplicationListener<AfterFindAndModify> {

    @Autowired
    MongoOperations mongoOperations;

    @Override
    public void onApplicationEvent(AfterFindAndModify event) {
        mongoOperations.save(event.getDocumentAuditLog());
    }
}
and then
@Configuration
@EnableMongoRepositories(basePackages = "my.pkg")
@ComponentScan(basePackages = {"my.pkg"})
public class MongoConfig extends AbstractMongoConfiguration {

    //.....

    @Bean
    public FindandUpdateMongoEventListner findandUpdateMongoEventListner() {
        return new FindandUpdateMongoEventListner();
    }
}
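Note that for the overridden findAndModify to take effect, the custom template presumably also has to be registered as the MongoTemplate bean, along these lines (a sketch; it assumes CustomMongoTemplate declares a constructor delegating to the matching MongoTemplate constructor, and in newer Spring Data MongoDB versions the factory type is MongoDatabaseFactory instead of MongoDbFactory):

@Bean
public MongoTemplate mongoTemplate(MongoDbFactory mongoDbFactory) {
    return new CustomMongoTemplate(mongoDbFactory);
}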
You can listen to database changes, even changes made completely outside your program (MongoDB 4.2 and newer).
(The code is in Kotlin; the same works in Java.)
@Autowired
private lateinit var op: MongoTemplate

@PostConstruct
fun listenOnExternalChanges() {
    Thread {
        op.getCollection("Item").watch().onEach {
            // updateDescription is null for non-update operations such as inserts
            if (it.updateDescription?.updatedFields?.containsKey("name") == true) {
                println("name changed on a document: ${it.updateDescription.updatedFields["name"]}")
            }
        }
    }.start()
}
This code only works when replication is enabled. You can enable it even when you have a single node:
Add the following replica set details to the mongodb.conf file (/etc/mongodb.conf, /usr/local/etc/mongod.conf, or C:\Program Files\MongoDB\Server\4.0\bin\mongod.cfg):
replication:
replSetName: "local"
Restart the mongo service, then open the mongo console and run this command:
rs.initiate()
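You can then check that the replica set is up by running:

rs.status()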
I have a listener with a @PostPersist method called doAudit to audit create actions. While debugging, I can see this method gets called and the audit record is created on the Java side, but when I check the DB, I don't see the record.
public class AuditListener {

    @PostPersist
    public void doAudit(Object object) {
        AuditDao auditManager = AuditDaoImpl.getInstance();
        auditManager.logEvent("create", object);
    }
}
public interface AuditDao {

    @Transactional(propagation = Propagation.REQUIRED)
    public AuditEntity logEvent(String action, Object object);
}
@Component
public class AuditDaoImpl implements AuditDao {

    private static AuditDaoImpl me;

    public AuditDaoImpl() {
        me = this;
    }

    public static AuditDaoImpl getInstance() {
        return me;
    }

    @Autowired
    private AuditDao dao;

    @Transactional
    @Override
    public AuditEntity logEvent(String action, Object object) {
        AuditEntity act = new AuditEntity();
        act.setAction(action);
        act.setObject(object);
        dao.create(act);
        return act;
    }
}
I am using OpenJPA 2.0 as my ORM, deployed on a Karaf container, with PostgreSQL as my backend.
Add a debug breakpoint and check whether the current stack trace contains the TransactionInterceptor. If there is no such entry, Spring transaction management is not properly configured and your DAOs don't use transactions at all.
JPA allows you to run queries without configuring transactions explicitly, but for saving data, transactions are mandatory.
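For reference, a minimal sketch of what explicit Spring transaction configuration could look like with JPA (the wiring here is an assumption, since the original setup isn't shown):

@Configuration
@EnableTransactionManagement
public class TransactionConfig {

    // Ties Spring's @Transactional handling (the TransactionInterceptor)
    // to the JPA EntityManagerFactory.
    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory emf) {
        return new JpaTransactionManager(emf);
    }
}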
I'm using Spring Data JPA and Hibernate for data access, along with Spring Boot. All repository beans are singletons by default. I want to set the scope of all my repositories to prototype. How can I do that?
@Repository
public interface CustomerRepository extends CrudRepository<Customer, Long> {
    List<Customer> findByLastName(String lastName);
}
Edit 1
The problem is related to a domain object being shared between two different transactions, which causes my code to fail. I thought it was happening because repository beans are singletons; that's why I asked the question. Here is a detailed explanation of the scenario.
I have two entities, User and UserSkill. User has a 1-* relationship with UserSkill, with lazy loading enabled on the UserSkill relation.
In a UserAggregationService, I first make a call to fetch an individual user skill with id 123, which belongs to the user with id 1.
public class UserAggregationService {

    public List<Object> getAggregatedResults() {
        List<Object> resultList = new ArrayList<>();
        resultList.add(userSkillService.getUserSkill(123));
        // Throws NullPointerException. See below for more details.
        resultList.add(userService.get(1));
        return resultList;
    }
}
The UserSkillService method implementation looks like this:
@Override
public UserSkillDTO getUserSkill(String id) {
    UserSkill userSkill = userSkillService.get(id);
    // Skills are set to null to avoid recursive DTO mapping. Dozer is used for the mapping.
    userSkill.getUser().setSkills(null);
    UserSkillDTO result = mapper.map(userSkill, UserSkillDTO.class);
    return result;
}
In the aggregation service, I then call UserService to fetch the user details. The UserService code looks like this:
@Override
public UserDTO getById(String id) {
    User user = userService.getByGuid(id);
    List<UserSkillDTO> userSkillList = Lists.newArrayList();
    // user.getSkills() throws the NullPointerException.
    for (UserSkill uSkill : user.getSkills()) {
        // Code omitted
    }
    ....
    // Code removed for conciseness
    return userDTO;
}
UserSkillService method implementation:
public class UserSkillService {

    @Override
    @Transactional(propagation = Propagation.SUPPORTS)
    public UserSkill get(String guid) throws PostNotFoundException {
        UserSkill skill = userSkillRepository.findByGuid(guid);
        if (skill == null) {
            throw new SkillNotFoundException(guid);
        }
        return skill;
    }
}
UserService method implementation:
public class UserService {

    @Override
    @Transactional(readOnly = true)
    public User getByGuid(String guid) throws UserNotFoundException {
        User user = userRepo.findByGuid(guid);
        if (user == null) {
            throw new UserNotFoundException(guid);
        }
        return user;
    }
}
Spring Boot auto-configuration is used to instantiate the entity manager factory and transaction manager. The spring.jpa.* keys in the configuration file are used to connect to the database.
If I comment out the line of code below, I do not get the exception. I am unable to understand why a change to the domain object affects an object fetched in a different transaction.
userSkill.getUser().setSkills(null);
Please suggest if I have missed something.