javax.persistence.EntityNotFoundException: deleted entity passed to persist - spring

I am using Spring + JPA as the ORM framework. My project layer structure is web --> Service --> Domain DAO --> genericDAO.
In genericDAO I am injecting the EntityManager using @PersistenceContext.
genericDAO.delete(Object o) {
    o = entityManager.merge(o);
    entityManager.remove(o);
}

genericDAO.saveOrUpdate(Object o) {
    entityManager.merge(o);
    entityManager.flush();
}
In one method in the service layer, I have the following operations.
// delete order items if they already exist
Order order = getOrderFromSession();
if (CollectionUtils.isNotEmpty(orderItems)) {
    Iterator<OrderItem> iterator = orderItems.iterator();
    while (iterator.hasNext()) {
        OrderItem orderItem = iterator.next();
        iterator.remove();
        orderDAO.deleteOrderItem(orderItem); // which internally calls genericDAO.delete()
    }
}
//orderDAO.saveOrder(order); // line Y
// Now create fresh order items submitted by the JSP form.
for (ProductVO productVO : productList) {
    if (productVO.getQuantity() > 0) {
        OrderItem orderItem = new OrderItem();
        Product product = productDAO.getProductByCode(productVO.getCode()); // line X
        orderItem.populateOrderItemByProduct(product, productVO.getQuantity(), order);
        order.addOrderItem(orderItem);
    }
}
Line X retrieves the product entity using HQL. But when line X is executed, I get the error below.
javax.persistence.EntityNotFoundException: deleted entity passed to persist: [core.entity.OrderItem#].
I do not understand: if the order item is already marked as deleted in the entity manager, why does it try to persist it?
When I uncomment line Y, which internally flushes the entity manager, it works fine. I do not understand why the entity manager needs to be flushed before executing line X.
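For reference, the working variant described above (line Y uncommented) effectively flushes the pending deletes before line X runs. A minimal sketch of that ordering, assuming a hypothetical flush() pass-through on the DAO that simply delegates to entityManager.flush():

Order order = getOrderFromSession();
if (CollectionUtils.isNotEmpty(orderItems)) {
    Iterator<OrderItem> iterator = orderItems.iterator();
    while (iterator.hasNext()) {
        OrderItem orderItem = iterator.next();
        iterator.remove();                   // drop it from the in-memory collection
        orderDAO.deleteOrderItem(orderItem); // schedules the DELETE in the persistence context
    }
    orderDAO.flush(); // hypothetical helper delegating to entityManager.flush(): pushes the DELETEs now
}
Product product = productDAO.getProductByCode(productVO.getCode()); // line X then runs against a flushed context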

Here is a quote from the Hibernate documentation:
Transactional persistent instances (i.e. objects loaded, saved, created or queried by the Session) can be manipulated by the application, and any changes to persistent state will be persisted when the Session is flushed. There is no need to call a particular method (like update(), which has a different purpose) to make your modifications persistent. The most straightforward way to update the state of an object is to load() it and then manipulate it directly while the Session is open.
Sometimes this programming model is inefficient, as it requires in the same session both an SQL SELECT to load an object and an SQL UPDATE to persist its updated state. Hibernate offers an alternate approach by using detached instances.
But I'll try to explain it more simply. Your method getOrderFromSession() is transactional and Hibernate objects have an open session inside it, but when the order object is returned to you, it has been detached from the session and Hibernate doesn't know what you are doing with it until you persist it again. So for deleted items Hibernate will only find out when you save that object; until then the object in Hibernate has the same state it had at the moment getOrderFromSession() returned it.
Here you have a detailed explanation.
UPDATE:
When you delete an object in Hibernate, the Java object becomes transient. It still exists in Java and after the delete you can still use it.
Session.delete() will remove an object's state from the database. Your application, however, can still hold a reference to a deleted object. It is best to think of delete() as making a persistent instance, transient.
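A tiny sketch of that quote using the JPA API from the question (entityManager.remove() is the counterpart of Session.delete(); itemId is an assumed identifier, not from the original post):

OrderItem item = entityManager.find(OrderItem.class, itemId); // managed (persistent) instance
entityManager.remove(item);                                   // its state will be removed from the database
System.out.println(item);                                     // the Java reference is still valid and usable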

Related

Data not updated in oracle-db within the running function

From a method callAndUpdateInB(), suppose I am calling the update() method of class B (@Component), in which I call myRepository.save() to update some data in the DB, and in the same function I perform some other calls ... and then return the response back to class A.
The problem is that the data only gets updated in the DB when class B's update() method returns the response back to class A's callAndUpdateInB() method.
But it should have been updated when I called myRepository.save() in the update() method of class B.
Why so?
For reference, just see this dummy example:
class A {
    @Autowired
    B b;

    public void callAndUpdateInB(String arg) {
        String data = b.update(arg);
        // check updates in DB (true)
        // now the data is updated in the DB
    }
}
@Component
class B {
    @Autowired
    MyRepository myRepository; // abstract class having the implementation
                               // for the following data (MyRepositoryImpl)

    @Transactional(propagation = Propagation.REQUIRED)
    public String update(String arg) {
        String updatedData = myRepository.save(arg);
        // check updates in DB (false)
        // making some other calls, which need that updated data,
        // but the data is still not updated in the DB.
        // Though the updatedData variable shows the data as updated,
        // it is not actually updated in the DB.
        return updatedData;
    }
}
The transaction will be committed to the database if the method update() finishes successfully. Therefore you can't see the data before the method returns.
Additionally, save() does not execute the insert/update statement immediately. That also happens before the transaction commit, when the persistence context is flushed.
If you want the statements to be executed earlier, you have to call saveAndFlush(). BUT this will still not commit the transaction, and from another transaction you will not see this data either.
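A small sketch of the difference, assuming MyRepository is a Spring Data JpaRepository so that saveAndFlush() is available (the String-based signatures follow the question's dummy example):

@Transactional(propagation = Propagation.REQUIRED)
public String update(String arg) {
    // save(): the change is queued in the persistence context; no SQL has been sent yet
    String updatedData = myRepository.save(arg);

    // saveAndFlush() would push the SQL to the database now, but the transaction is
    // still open, so other transactions cannot see the row until commit:
    // String updatedData = myRepository.saveAndFlush(arg);

    return updatedData; // the actual commit happens when this @Transactional method returns
}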
This is the usual and expected transactional behavior in a Spring application.
Propagation REQUIRED
Support a current transaction, create a new one if none exists. Analogous to EJB transaction attribute of the same name.
That is, the annotated method joins the caller's transaction if one exists, and the transaction stays alive and uncommitted at the end of the annotated method.
If you call your update() twice at the very beginning of the request processing, the first call starts a transaction and the second reuses it. If one of the two calls succeeds and the other fails (on a unique constraint or something similar), both sets of changes will be rolled back.
Developers usually expect a transaction to start and end like that. But in some cases a change needs to be committed or rolled back independently of other changes. If that is your case, you can use Propagation.REQUIRES_NEW: see
https://stackoverflow.com/a/24341843/12656244
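A hedged sketch of that alternative, reusing the dummy classes from the question: with REQUIRES_NEW the inner method runs in its own transaction, which commits before control returns to the caller.

@Component
class B {

    @Autowired
    MyRepository myRepository; // the question's repository, assumed to be a Spring Data repository

    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public String update(String arg) {
        // runs in a fresh transaction, suspending the caller's one;
        // it is committed when update() returns, independently of class A's outcome
        return myRepository.save(arg);
    }
}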

Is double saving a new entity instance with a Spring data 2 JpaRepository correct?

I have two entities in a bi-directional many to many relationship.
A <-> many to many <-> B
I have an endpoint where a client can create an instance of A, and at the same time add some number of B entities to that A, by passing in an array of B entity id keys. Please keep in mind that these B entities already exist in the database. There is no business or software design case for tightly coupling their creation to the creation of A.
So class A looks like this, and B is the same, but with references to A.
@Entity
class A {
    @Id
    @GeneratedValue
    int id;

    @ManyToMany
    List<B> bs;

    String someValue;
    int someValue2;

    // With some getters and setters omitted for brevity
}
So at first try my endpoint code looks like this.
public A createA(@RequestBody A aToCreate) {
    A savedA = aRepository.save(aToCreate);
    savedA.getbs().forEach(b -> Service.callWithBValue(b.getImportantValue()));
    return savedA;
}
And the client would submit a JSON request like this to create a new A which would contain links to B with id 3 and B with id 10.
{
    "bs": [{"id": 3}, {"id": 10}],
    "someValue": "not important",
    "someValue2": 1
}
Okay so everything's working fine, I see all the fields deserializing okay, and then I go to save my new A instance using:
aRepository.save(aToCreate);
And that works great... except for the fact that I need all the data associated with the B entity instances, but the A object returned by aRepository.save() has only populated the auto-filled fields on A and done nothing with the B entities. They're still just hollow entities that only have their ids set.
Wut.
So I go looking around, and apparently SimpleJpaRepository does this.
@Transactional
public <S extends T> S save(S entity) {
    if (entityInformation.isNew(entity)) {
        em.persist(entity);
        return entity;
    } else {
        return em.merge(entity);
    }
}
And since the A entity is brand new, it only persists the A entity, but it doesn't merge it so I don't get any of the rich B data. So okay, if I modify my code to take this into account I get this.
public A createA(@RequestBody A aToCreate) {
    A savedA = aRepository.save(aRepository.save(aToCreate));
    savedA.getbs().forEach(b -> Service.callWithBValue(b.getImportantValue()));
    return savedA;
}
Which works just fine. On the second pass through the repository it merges instead of persists, so the B relationships get hydrated.
My question is: Is this correct, or is there something else I can do that doesn't look so ineloquent and awful?
To be clear, this ONLY matters when creating a brand new instance of A; once A is in the database this isn't an issue anymore, because SimpleJpaRepository will flow into the em.merge() line of code. Also, I have tried different CascadeType annotations on the relationship, but none of them are what I want. Cascading is about persisting the state of the parent entity's view of its children to its children, but what I want is to hydrate the child entities on new instance creation instead of having to make two trips to the database.
In the case of a new A, aToCreate and savedA are the same instance, because that is what the JPA spec mandates:
https://docs.oracle.com/javaee/6/api/javax/persistence/EntityManager.html#persist(java.lang.Object)
Make an instance managed and persistent.
Spring Data simply returns the same instance so persist/merge can be abstracted into one method.
If the B instances you wish to associate with A are existing entities then you need to fetch a reference to these existing instances and set them on A. You can do this without a database hit by using the T getOne(ID id) method of Spring Data's JpaRepository:
https://docs.spring.io/spring-data/jpa/docs/2.1.4.RELEASE/api/
You can do this in your controller or possibly via a custom deserializer.
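A sketch of that approach, assuming bRepository is a Spring Data JpaRepository<B, Integer>, that getId()/setBs() exist on the entities, and that the method runs in a transaction (getOne() returns a lazy reference, so the B fields are loaded only when first accessed):

@Transactional
public A createA(@RequestBody A aToCreate) {
    // replace the "hollow" B instances (only ids set) with managed references
    List<B> attached = aToCreate.getbs().stream()
            .map(b -> bRepository.getOne(b.getId()))   // no SELECT yet, just a reference
            .collect(Collectors.toList());
    aToCreate.setBs(attached);

    A savedA = aRepository.save(aToCreate);
    // accessing importantValue initialises each reference inside the open transaction
    savedA.getbs().forEach(b -> Service.callWithBValue(b.getImportantValue()));
    return savedA;
}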
This is what I ended up going with. It gives the caller the ability to save and hydrate the instance in one call, and explains what the heck is going on. All my repository interfaces now extend this base interface.
public interface BaseRepository<T, ID> extends JpaRepository<T, ID> {

    /**
     * Saves an instance twice so that it's forced to persist AND then merge.
     * This should only be used for new detached entities that need to be saved,
     * and who also have related entities they want data about hydrated into their object.
     */
    @Transactional
    default T saveAndHydrate(T save) {
        return this.save(this.save(save));
    }
}
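Usage then stays a one-liner in the endpoint (aRepository is assumed here to extend BaseRepository<A, Integer>):

A savedA = aRepository.saveAndHydrate(aToCreate); // persist, then merge to hydrate the B references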

How to not allow lazy loading from outside the transactional method?

I'm using JPA with Hibernate and Spring. I have an entity (say Employee) with an attribute (say of type Position), and this attribute is lazy-loaded.
I believe that when you try to access the position attribute, it will be lazily loaded from the DB, and this is done inside the transactional method.
Let's say I didn't access the attribute in that transactional method. So if I tried to access it later, I would get "org.hibernate.LazyInitializationException: could not initialize proxy - no Session", which is normal because the session was closed by that transactional method.
At this point, I need it to be null (or not initialized) wherever I access it later in a different method, but this is not the case! The question is: how can we make it null after committing and closing the session, given that it is not accessed while the session is open?
Below is a simple code to illustrate the issue.
// In some Service class
@Override
@Transactional(readOnly = true)
public Employee getEmployeeById(Integer id) throws Exception {
    Employee emp = employeeDAO.getEmployeeById(id);
    // I didn't access the position attribute here because I don't need it for now
    return emp;
}
Later I call the above method (say from some controller):
Employee emp = employeeService.getEmployeeById(904);
System.out.println(emp.getPosition()); // Here the LazyInitializationException
// would occur, but I need this to be null, or at least to prevent the lazy loading,
// thus avoiding the exception. How?
I think this might be the answer that you're looking for
Hibernate - Avoiding LazyInitializationException - Detach Object From Proxy and Session
Basically:
1. Use Hibernate to check whether that field is initialised, with Hibernate.isInitialized(fieldName) in the getter, and return null if it is not initialised.
2. Inside the employeeDAO.getEmployeeById method, create a new Employee object and set the parameters from the one returned by the query, which is more work but prevents you from coupling your domain to Hibernate.
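A minimal sketch of the first option, using org.hibernate.Hibernate and assuming a lazy @ManyToOne mapping for the Position attribute described in the question (the exact mapping is an assumption):

@Entity
public class Employee {

    @ManyToOne(fetch = FetchType.LAZY)
    private Position position;

    public Position getPosition() {
        // returns null when the lazy proxy was never initialised inside an open session,
        // instead of letting a later access throw LazyInitializationException
        return Hibernate.isInitialized(position) ? position : null;
    }
}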

Spring JPA Update operation

I am working on Spring JPA. As part of it, I have to update an entity while ignoring a few attributes. The following code is my attempt to implement the update operation.
@Transactional
public void updateDMove(DTCRto jsonRto) {
    // copyProperties(Object source, Object target, String[] ignoreProperties)
    DMove dMoveDB = dMoveRepo.findDMove(jsonRto.getLn(), jsonRto.getDriver(), jsonRto.getType());
    DMove dMoveRto = jsonRto.convertToDMove(jsonRto);
    BeanUtils.copyProperties(dMoveRto, dMoveDB, new String[] {"moveId", "created", "lastchange", "locations", "status"});
    dMoveRepo.save(dMoveDB);
}
DMove : Model class which needs to be updated.
dMoveRepo : respective repository class.
dMoveRto : incoming object.
dMoveDB : object existing in the database.
moveId : is the PK in the DMove class.
Can anyone suggest the right way to implement the update operation in Spring JPA?
Thanks.
"detached entity passed to persist" means that Hibernate doesn't recognize the entity you passed to update, because dMoveDB is no longer a persistent object; you lost that when you used this line: BeanUtils.copyProperties(dMoveRto, dMoveDB, new String[] {"moveId", "created", "lastchange", "locations", "status"});
I suggest you remove the moveId from the copy so the entity you try to update keeps its original primary key and remains a persistent object.
One last thing: you have to make sure that the object you get from dMoveRepo.findDMove(...) isn't null.
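A hedged sketch of the update flow with those points applied (null check, primary key and audit fields excluded from the copy); method and field names follow the question's snippet:

@Transactional
public void updateDMove(DTCRto jsonRto) {
    DMove dMoveDB = dMoveRepo.findDMove(jsonRto.getLn(), jsonRto.getDriver(), jsonRto.getType());
    if (dMoveDB == null) {
        throw new EntityNotFoundException("No DMove found for the given criteria"); // assumption: fail fast
    }
    DMove dMoveRto = jsonRto.convertToDMove(jsonRto);
    // copy the incoming values onto the managed entity, keeping its PK and audit fields
    BeanUtils.copyProperties(dMoveRto, dMoveDB, "moveId", "created", "lastchange", "locations", "status");
    dMoveRepo.save(dMoveDB); // saving an already-managed entity, effectively an UPDATE
}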

Spring @Cacheable with filter

Every entity class has a user.id value. I have filters on all services which filter data by principal.id and the entity's user.id at the database level, simply adding a where clause. I started using the Spring @Cacheable option, but the filters do not work with spring-cache. How can I filter data from the cache?
@Override
@Cacheable(value = "countries")
public List<Country> getAll() {
    return countryDao.findAll();
}
Different users get access to other users' values if the values are in the cache.
From the documentation:
"As the name implies, #Cacheable is used to demarcate methods that are cacheable - that is, methods for whom the result is stored into the cache so on subsequent invocations (with the same arguments), the value in the cache is returned without having to actually execute the method."
In your case you don't have any arguments, therefore every time getAll is invoked it will return the cached version.
If your countryDao.findAll() injects the user id at the database level, you have an issue: the first user calling countryDao.findAll() will cause their result to be cached, so other users will get the same result as the first user.
In general, if I understood how you designed the service, it is common not to inject the user at the DB level but to pass it at the service level, so that the service is decoupled from the current session (for example a web request).
However if you want to keep like that, it could still work by doing:
@Cacheable(value = "countries", key = "#user.id")
public List<Country> getAll(User user) {
    return countryDao.findAll();
}
All you have to do is pass the user to the method even if you don't use it explicitly (but the caching will).
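A short sketch of how a caller could supply the current user so the cache key varies per user; getCurrentUser() is a hypothetical helper (for example one resolving Spring Security's principal), not part of the original answer:

User currentUser = getCurrentUser();                          // hypothetical: resolve the logged-in user
List<Country> countries = countryService.getAll(currentUser); // cached per user under key = user.id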
