Spring framework: how to store some data locally to avoid accessing the DB each time

My application has a list of customers and users. I would like to fetch the list only once, at startup, and then work with the locally stored data instead of hitting the DB.
Can you advise me on some approaches, with examples?
I am thinking about the HttpSession object, but I am not sure whether that is appropriate,
because this data should be available only to the logged-in user who loads it at the start.
The list of customers will be available on every page of the application!

Take a look at the Spring cache abstraction: http://docs.spring.io/spring/docs/current/spring-framework-reference/html/cache.html
You can annotate your repository methods:
@Cacheable(value="customer", key="#personalNum")
public Customer findCustomer(String personalNum) {
    ...
}
The method body will only be executed the first time it is called; after that the value will be taken from the cache.
You can also evict the cache when you update a customer, for example:
@CacheEvict(value="customer", allEntries=true)
public void addCustomer(Customer cust)

To keep insert and update operations in sync with the cache, use the @CachePut annotation; to keep delete operations in sync, use @CacheEvict.
Use the same cache name (the value parameter) and the same key.
You also need to enable caching with the @EnableCaching annotation on one of your configuration classes.
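For reference, here is a minimal configuration sketch; the choice of ConcurrentMapCacheManager and the "customer" cache name are assumptions, just to keep the example self-contained:
import org.springframework.cache.CacheManager;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.cache.concurrent.ConcurrentMapCacheManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableCaching
public class CacheConfig {

    @Bean
    public CacheManager cacheManager() {
        // simple in-memory cache; swap in Ehcache, Caffeine, etc. for production use
        return new ConcurrentMapCacheManager("customer");
    }
}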

You can still use the HttpSession object; however, put the list in the session only once, after the user has logged in, and remove it from the session on page close.
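A rough sketch of that idea, assuming a hypothetical CustomerService and a "customers" attribute name (both are illustrative, not from the original code):
import javax.servlet.http.HttpSession;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.GetMapping;

@Controller
public class HomeController {

    @Autowired
    private CustomerService customerService; // hypothetical service that loads customers from the DB

    @GetMapping("/home")
    public String home(HttpSession session) {
        // load the list once per session, right after login
        if (session.getAttribute("customers") == null) {
            session.setAttribute("customers", customerService.findAll());
        }
        return "home";
    }
}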

If you want to reuse the data multiple times without querying each time, I would suggest creating a simple cache using a HashMap or Hashtable.
You can easily store the list of customers, keyed by id, in such a data structure,
and in Spring you can easily create a singleton bean that holds this map and is accessible across the application.
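Something along these lines, as a sketch only (the bean and method names are made up; a ConcurrentHashMap is safer than a Hashtable for mostly-read access):
import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.stereotype.Component;

@Component // Spring beans are singletons by default, so one map is shared application-wide
public class CustomerCache {

    private final Map<Long, Customer> customersById = new ConcurrentHashMap<>();

    public void putAll(List<Customer> customers) {
        customers.forEach(c -> customersById.put(c.getId(), c));
    }

    public Customer get(Long id) {
        return customersById.get(id);
    }

    public Collection<Customer> findAll() {
        return customersById.values();
    }
}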

Related

@CachePut for updating data in a list of objects in Spring Boot

Hi, I am using Spring Boot caching in my application. I am able to fetch data from the DB and cache it.
@Cacheable("employee")
public Optional<List<Employee>> employeeData(){
    log.info("Fetched employee details from DB and cached in memory!!");
    return employeeRepository.findActiveEmployee();
}
I want to delete, update, or add a new record in the cached object.
How can I use @CachePut to update an existing record, insert a new record, or delete an existing record based on some condition?
You can keep your original method as is and call it whenever you want to fetch data from the cache. Note that the @Cacheable annotation does not execute the method body if the cache named "employee" already holds a value; instead it returns the result from the cache.
@Cacheable("employee")
public Optional<List<Employee>> employeeData(){
}
Then proceed to create a new method annotated with @CachePut, keeping in mind that @CachePut both executes the method and caches the result every single time:
@CachePut(value="employee", condition="#name=='Tom'")
public Optional<List<Employee>> employeeDataCacheByName(String name){
    log.info("Fetched employee details from DB and cached in memory depending on condition!!");
    return employeeRepository.findActiveEmployee();
}
The above method will run the query every time and put the results into the employee cache if the name argument is "Tom" (the condition logic is up to you; this is just an example). This way the cache is always updated with the results from the database (as long as the condition evaluates to true).
For deleting (I don't think @CachePut can be used for that) you can combine this with the @CacheEvict annotation; you can see an example in this answer: https://stackoverflow.com/a/62488344/3635454
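For completeness, a hedged sketch of what a delete with eviction could look like (the method name and repository call are assumptions, not from the original code):
@CacheEvict(value="employee", allEntries=true)
public void deleteEmployee(Long id) {
    // remove from the database, then drop the cached list so the next read repopulates it
    employeeRepository.deleteById(id);
}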

Can I commit a portion of an @Transactional sequence?

I have a Spring Boot application, and have a webservice where a user can POST a model of a CollegeCourse instance which includes links between that class and the Students who are taking it. (The data is used to store rows in the association table, since those classes have a many-to-many relationship.) This works fine.
Say the enrollment in the course changes. The User expects to send the same JSON structure to the webservice handling the PUT call. The code took the easy path for updating, first finding and deleting all the existing CollegeCourse-Student links, then saving the new links. (Rather than iterating through the two lists, matching up items.) This part worked also as given.
We then added a uniqueness constraint to the CollegeCourse-Student association table, so that said table could not have a single Student linked to one CollegeCourse multiple times. This crashed and burned. A debugging session revealed the culprit: the delete of the CollegeCourse-Student records did not actually remove them from the database until the transaction completed. Thus, when we tried to add the new links back in, any holdovers from the original POST conflicted with what was already in the database.
The service handling the PUT is preceded by an @Transactional annotation. I tried moving the code to find and delete the associations in a separate method, and tried both @Transactional(propagation=Propagation.REQUIRED) and REQUIRES_NEW, but neither prevented failing the uniqueness constraint. I also added @EnableTransactionManagement to my Application class - same story. Is there a simple solution to my dilemma?
Without knowing exactly what your repository looks like, have you tried to do a manual flush on the entity manager after the deletions?
Something along the lines of
entityManager.flush();
Or, if you're using a Spring Data JPA repository, you should be able to define a flush method in that interface and call it.
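A sketch of how the PUT handling could look with an explicit flush; the repository and method names here are illustrative, and JpaRepository already exposes flush(), so no custom method is strictly required:
@Transactional
public void replaceEnrollment(Long courseId, List<CourseStudentLink> newLinks) {
    // delete the old association rows first
    linkRepository.deleteByCourseId(courseId);
    // force the DELETE statements to hit the database now,
    // so the uniqueness constraint no longer sees the old rows
    linkRepository.flush();
    // then insert the new links within the same transaction
    linkRepository.saveAll(newLinks);
}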

How to use Hazelcast with MapStore

I am using Hazelcast as the caching solution for my application.
My application has a few inserts and updates to the database, and these need to be synced to the cache as well.
I want to use the MapStore functionality so that when I do IMap.put(), Hazelcast takes care of persisting the object in the underlying DB and also updates its cache.
In the overridden store implementation, I want to call my DAO in the following way to persist the data:
public void store(Long key, Employee value)
{
    log.info("Storing Data for Employee {} in Database using DataStore ", value);
    Long employeeId = employeeDao.create(value);
    value.setId(employeeId);
}
There are a few issues, listed below:
1) In the put call, I want to use "key" as the "employeeId", but this is generated only after the record is inserted into the DB. So how do I put into the cache when I don't have the id? I want Hazelcast to use the id generated as part of the store method call (or any other way) as the key for my object.
IMap.put(key, new Employee("name_of_Employee","age_of_employee"))
2) The MapStore implementation's store method returns void, so I cannot return the id generated for this object to the client. How can I achieve this?
I tried using MapEntryListeners on the map, but the entry-added callback does not return the new object. I also added the PostProcessingMapStore interface to my MapStore but could not get the new value back to the client.
Please advise.
You have 2 options:
1) Generate the employeeId outside of the database. You can use the IdGenerator from Hazelcast to do this.
2) If you must let the database generate the id, then you need to put the Employee in the cache manually AFTER it has been stored in the database.
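A rough sketch of option 1; IdGenerator is the Hazelcast 3.x API mentioned above (newer Hazelcast versions provide FlakeIdGenerator instead), and the Employee setters are assumptions:
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.core.IMap;
import com.hazelcast.core.IdGenerator;

HazelcastInstance hz = Hazelcast.newHazelcastInstance();
IdGenerator idGenerator = hz.getIdGenerator("employee-ids");
IMap<Long, Employee> employees = hz.getMap("employees");

// generate the key up front instead of relying on the database identity column
long id = idGenerator.newId();
Employee employee = new Employee("name_of_Employee", "age_of_employee");
employee.setId(id);
// MapStore.store(id, employee) is now called with a key the client already knows
employees.put(id, employee);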

Hibernate to initialize object in a different transaction

I got the famous LazyInitializationException.
I have an object User which is stored in the session. This object contains another object, Market, which is lazily initialized.
When I load the user into the session, I don't load Market because it is too heavy and I don't need it every time.
When I want to load the market, I am in a different transaction and I don't want to reload the user from the database. How can I retrieve the Market object, knowing that User.market holds the Hibernate proxy (and therefore the id of the market), and that I don't want to hack Hibernate using reflection?
It would be even better if I could load the market without loading it into the user. Since the user is in the session, I don't want to put a lot of stuff in the session.
A JPA compatible solution would be even better.
Cheers
If eager fetching is not acceptable, and the transaction cannot be kept open until the Market is retrieved, a specific DAO method could be implemented to retrieve just the market for a given user:
public List<Market> retrieveMarketFromUser(final User user) {
    Query query = session.createQuery("SELECT m FROM User AS u INNER JOIN u.market AS m WHERE u.userid = :uid");
    query.setParameter("uid", user.getId());
    List<Market> list = query.list();
    return list;
}
or the short version
Query query = session.createQuery("SELECT u.market FROM User AS u WHERE u.userid = :uid");
This is maybe not the JPA solution you were expecting, just a workaround.
What you have to do is to annotate the accessors instead of the fields. This will allow you to avoid loading of the market object when you initially load the user, but you will have access to the id of the market from the Hibernate proxy object without triggering a lazy loading or getting a LazyInitializationException. Then later on when you want to load the market itself you do a normal entity retrieval based on its id. You can read a detailed explanation of how this works here.
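A hedged illustration of the accessor-annotation idea (the class layout is simplified): with property access, calling getId() on the uninitialized proxy returns the identifier without a database hit, so you can then load the Market yourself.
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

@Entity
public class Market {

    private Long id;

    @Id
    @GeneratedValue
    public Long getId() {   // annotations on the getter -> property access
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }
}

// later, outside the original transaction:
Long marketId = user.getMarket().getId();           // does not trigger lazy loading
Market market = entityManager.find(Market.class, marketId);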
If the relation is bidirectional, then you can load the Market independently using a query with a clause like where market.user.id = ?.
Why is the obvious solution not good? Like the one ring0 suggested, or simply using the findById or find methods if you already have the id in the User object.
//if using a Hibernate Session obtained from a SessionFactory
session.get(Market.class, marketId);
//if using the EntityManager
em.find(Market.class, marketId);
Depending on how current_session_context_class is configured (I have it in hibernate.cfg.xml), you might get a new session with each new transaction. If that is the case, you do not need to worry about putting too much stuff in there. You can find more info on contextual sessions here.
Since you have mentioned a new transaction, you are probably working with a new Hibernate session belonging to the thread.
Use this in case of direct interaction with the Hibernate session:
obj = session.merge(obj);
In case you are using the JPA 2 API:
obj= entityManager.merge(obj);
Please rate the answer if it helps.
Cheers

How can I update an object/entity that is not completely filled out?

I have an entity with several fields, but in one view I want to edit only one of them. For example, I have a User entity with id, name, address, username, pwd, and so on. In one of the views I want to be able to change the pwd (and only the pwd), so the view only knows the id and sends the pwd. I want to update my entity without loading the rest of the fields (there are many, many more), changing the one pwd field, and then saving them ALL back to the database. Has anyone tried this, or know where I can look? All help is greatly appreciated.
Thanks in advance.
PS
I should have given more detail. I'm using Hibernate, and Roo is creating my entities. I agree that each view should have its own entity; the problem is that I'm only building controllers, and everything else was done before. We were using finders from the service layer, but we wanted to use some other finders that did not seem to be accessible through the service layer, so the decision was made to blow away the service layer and interact with the entities directly (through the finders); UserService.update(user) is no longer an option. I have recently found User.persist() and User.merge(). Does merge update all the fields on the object or only the ones that are not null? And if I want a field to now be null, how would it know the difference?
Which technologies besides Spring are you using?
First of all, have separate DTOs for every view, stripped down to only what's needed: one DTO for id+password, another for address data, etc. Remember that DTOs can inherit from each other, so you can avoid duplication. And never pass business/ORM entities directly to the view; it is too risky, since binding leaks in some frameworks might allow users to modify fields you hadn't intended.
After the DTO comes back from the view (most web frameworks work like this) simply load the whole entity and fill only the fields that are present in the DTO.
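For example, a small sketch of the "load, then copy only the DTO fields" step; PasswordDto and the field names are illustrative, not taken from the original code:
@Transactional
public void changePassword(PasswordDto dto) {
    // load the full entity by id, then touch only the single field carried by the DTO
    User user = entityManager.find(User.class, dto.getUserId());
    user.setPwd(dto.getNewPassword());
    // no explicit save call is needed: dirty checking flushes the change at commit
}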
But it seems like it's the persistence that is troubling you. Assuming you are using Hibernate, you can take advantage of the dynamic-update setting:
dynamic-update (optional - defaults to false): specifies that UPDATE SQL should be generated at runtime and can contain only those columns whose values have changed.
In this case you are still loading the whole entity into memory, but Hibernate will generate as small UPDATE as possible, including only modified (dirty) fields.
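If you map with annotations rather than XML, the equivalent (in Hibernate 4.1+) is the @DynamicUpdate annotation; here is a sketch of its placement, with the entity fields trimmed down:
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.DynamicUpdate;

@Entity
@DynamicUpdate // generated UPDATE statements include only the columns that actually changed
public class User {

    @Id
    private Long id;
    private String name;
    private String pwd;
    // ...
}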
Another approach is to have separate entities for each use-case/view. So you'll have an entity with only id and password, entity with only address data, etc. All of them are mapped to the same table, but to different subset of columns. This easily becomes a mess and should be treated as a last resort.
See the hibernate reference here
For persist()
persist() makes a transient instance persistent. However, it does not guarantee that the identifier value will be assigned to the persistent instance immediately; the assignment might happen at flush time. persist() also guarantees that it will not execute an INSERT statement if it is called outside of transaction boundaries. This is useful in long-running conversations with an extended Session/persistence context.
For merge
if there is a persistent instance with the same identifier currently associated with the session, copy the state of the given object onto the persistent instance
if there is no persistent instance currently associated with the session, try to load it from the database, or create a new persistent instance
the persistent instance is returned
the given instance does not become associated with the session, it remains detached
persist() and merge() have nothing to do with whether the columns are modified or not. Use dynamic-update, as @Tomasz Nurkiewicz has suggested, for saving only the modified columns. Use dynamic-insert for inserting only non-null columns.
Some JPA providers such as EclipseLink support fetch groups. So you can load a partial instance and update it.
See,
http://wiki.eclipse.org/EclipseLink/Examples/JPA/AttributeGroup
