Hibernate will not persist data after save - spring

Can someone explain why the "lastAccessed" date does not get saved to the database in this example, and how I can get it to save to the DB? My understanding is that the dateObject instance is an attached object after the save() call, and therefore all modifications should be persisted automatically.
Note: "myDate" is persisted correctly, so all other spring configuration seems to be correct.
@Transactional(readOnly = false)
public DateObject getOrCreateDateObject(Date myDate) {
    DateObject dateObject = getCurrentDateObject(); // For my tests, this has been returning null
    if (dateObject == null) {
        // create a new object
        dateObject = new DateObject();
        dateObject.setDate(myDate);
        sessionFactory.getCurrentSession().save(dateObject);
    }
    // This does not persist to the database
    dateObject.setLastAccessed(new Date());
    return dateObject;
}
I have also tried some of the following combinations (and more) after the save() call. None of these work:
sessionFactory.getCurrentSession().merge(dateObject); // tried before and after dateObject.setDate(d2)
sessionFactory.getCurrentSession().update(dateObject);
sessionFactory.getCurrentSession().saveOrUpdate(dateObject);
sessionFactory.getCurrentSession().flush();
DateObject doCopy = (DateObject) sessionFactory.getCurrentSession().load(DateObject.class, dateObject.getId());
sessionFactory.getCurrentSession().merge(doCopy);
doCopy.setLastAccessed(new Date());
I'm hoping this is an easy answer that I'm just not seeing. Thank you for your help!
Edit #1 05/22/2012
As requested, here is the mapping for this entity, specified in src/main/resources/META-INF/dateobject.hbm.xml. I can see that the columns are created in the database using "SELECT * FROM dateObjects" in the mysql client. MY_DATE is populated correctly, but LAST_ACCESSED is set to NULL.
<class name="com.example.entity.DateObject" table="dateObjects">
    <id name="id" column="DATE_OBJECT_ID">
        <generator class="identity" />
    </id>
    <property name="date" type="date" column="MY_DATE" />
    <property name="lastAccessed" type="date" column="LAST_ACCESSED" />
</class>
Edit #2 05/24/2012
I have a working SSCCE at https://github.com/eschmidt/dateobject. The interesting thing is that the web client (calling localhost:8080/view/test) shows that lastAccessed is set correctly, but when I check the database with the MySQL client, it shows that lastAccessed is NULL. With this complete set of code, can anybody see why the database wouldn't update even though the method is marked @Transactional?

If you're absolutely certain that after running that code, dateObject.date is stored in the db and dateObject.lastAccessed isn't, then your connection and transaction are obviously set up correctly. My first guess would be incorrect mappings, since that's the simplest explanation. You don't happen to have a @Transient on the field, the getter, or the setter for lastAccessed, do you? (Assuming, of course, that you're using annotations to map your domain objects.)
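For instance, here is a minimal sketch of that pitfall, assuming annotation mappings rather than your hbm.xml (the class below is hypothetical): with property access, a stray @Transient on the getter silently keeps lastAccessed out of the LAST_ACCESSED column even though everything else persists fine.

import java.util.Date;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Transient;

// Hypothetical annotation-mapped variant of DateObject, for illustration only.
@Entity
public class DateObjectExample {

    private Long id;
    private Date lastAccessed;

    @Id
    @GeneratedValue
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    // @Transient here means Hibernate never writes lastAccessed,
    // even though the setter is called inside the transaction.
    @Transient
    public Date getLastAccessed() { return lastAccessed; }
    public void setLastAccessed(Date lastAccessed) { this.lastAccessed = lastAccessed; }
}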
If you could provide an SSCCE, I'll bet I or someone else can give you a definitive answer.
Update: It's hard trimming a full application down to the smallest possible code that demonstrates a problem. The upshot is that you'll likely find the answer while you're at it. I have lots of sample projects on GitHub that might help guide you if you just need a few nudges in the right direction. basic-springmvc might be closest to what you're doing, but it uses annotations instead of XML for mappings. It's also a Spring MVC project. It's a lot simpler to start a Spring context manually in a main class than to worry about a whole servlet container and the multiple contexts that Spring MVC wants you to have. spring-method-caching, for one, has an example of doing that.
As for the mapping you posted, it looks fine, though it's been a long while since I touched an XML mapping. Are you using field or property access? That could possibly have a bearing on things. Also, are there any custom listeners or interceptors in the SessionFactory that might be twiddling with your objects?

You are using IDENTITY generation as your identifier generation strategy, so the save() call here immediately translates to an INSERT. Do you see any INSERT/UPDATE/DELETE SQL executed after that? If not, it is most likely that the session is simply not being flushed. Flushing can happen at a number of points; read the docs on flushing if you are unfamiliar with it.
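For what it's worth, here is a minimal sketch of forcing that flush at the end of the transactional method (same assumptions as your snippet: a session-per-transaction setup and the identity generator). If the UPDATE then shows up in the SQL log, flushing was indeed the missing piece.

@Transactional
public DateObject getOrCreateDateObject(Date myDate) {
    Session session = sessionFactory.getCurrentSession();
    DateObject dateObject = getCurrentDateObject();
    if (dateObject == null) {
        dateObject = new DateObject();
        dateObject.setDate(myDate);
        session.save(dateObject); // identity generator: the INSERT happens here
    }
    dateObject.setLastAccessed(new Date());
    session.flush(); // force the dirty check now, so the UPDATE is issued before commit
    return dateObject;
}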

Related

Spring Data problem - derived delete doesn't work

I have a Spring Boot application (based off spring-boot-starter-data-jpa). I have an absolute minimum of configuration going on, and only a single table and entity.
I'm using CrudRepository<MyEntity, Long> with a couple of findBy methods, which all work. And I have a derived deleteBy method, which doesn't work. The signature is simply:
public interface MyEntityRepository extends CrudRepository<MyEntity, Long> {
    Long deleteBySystemId(String systemId);
    // findBy methods left out
}
The entity is simple, too:
@Entity
@Table(name = "MyEntityTable")
public class MyEntity {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "MyEntityPID")
    private Long myEntityPID;

    @Column(name = "SystemId")
    private String systemId;

    @Column(name = "PersonIdentifier")
    private String personIdentifier;

    // Getters and setters here, also hashCode & equals.
}
The reason the deleteBy method isn't working is that it seems to only issue a SELECT statement to the database, which selects all the MyEntity rows that have a SystemId with the value I specify. Using my MySQL global log I have captured the actual, physical SQL, issued it manually against the database, and verified that it returns a large number of rows.
So Spring, or rather Hibernate, is trying to select the rows it has to delete, but it never actually issues a DELETE FROM statement.
According to a note on Baeldung, this select statement is normal, in the sense that Hibernate will first select all rows that it intends to delete, then issue a delete statement for each of them.
Does anyone know why this derived deleteBy method isn't working? I have @EnableTransactionManagement on my @Configuration, and the calling method is @Transactional. The MySQL log shows that Spring sets autocommit=0, so it seems transactions are properly enabled.
I have worked around this issue by manually annotating the derived delete method this way:
public interface MyEntityRepository extends CrudRepository<MyEntity, Long> {
    @Modifying
    @Query("DELETE FROM MyEntity m WHERE m.systemId = :systemId")
    Long deleteBySystemId(@Param("systemId") String systemId);
    // findBy methods left out
}
This works, including transactions. But it just shouldn't have to be this way; I shouldn't need to add that @Query annotation.
Here is a person who has the exact same problem as I do. However the Spring developers were quick to wash their hands and write it off as a Hibernate problem so no solution or explanation to be found there.
Oh, for reference I'm using Spring Boot 2.2.9.
tl;dr
It's all in the reference documentation. That's the way JPA works. (Me, rubbing my hands and washing them.)
Details
The two methods do two different things: Long deleteBySystemId(String systemId) loads the entities matching the given constraints and ends up issuing EntityManager.remove(…), which the persistence provider is allowed to delay until the transaction commits. I.e. code following that call is not guaranteed that the changes have already been synced to the database. That in turn is because JPA allows its implementations to do just that. Unfortunately that's nothing Spring Data can fix on top of it. (More rubbing, more washing, plus a bit of soap.)
The reference documentation justifies that behavior with the need for the EntityManager (again a JPA abstraction, not something Spring Data has anything to do with) to trigger lifecycle events like @PreRemove etc., which users expect to fire.
The second method, declaring a modifying query manually, declares a query to be executed directly against the database, which means that entity lifecycle callbacks do not fire, because the entities are never materialized upfront.
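To make the difference observable, here is a small sketch (javax.persistence, as in Spring Boot 2.2.x; the listener class is hypothetical and would be attached with @EntityListeners(DeletionLogger.class) on MyEntity). The callback fires once per matching row with the derived deleteBySystemId(…), and never with the @Modifying bulk delete.

import javax.persistence.PreRemove;

// Hypothetical lifecycle listener. The derived Long deleteBySystemId(String)
// loads each matching MyEntity and removes it individually, so this fires per
// row; the @Modifying @Query("DELETE FROM MyEntity ...") variant issues one
// bulk DELETE and never triggers it.
public class DeletionLogger {

    @PreRemove
    public void beforeRemove(Object entity) {
        System.out.println("About to remove " + entity);
    }
}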
However the Spring developers were quick to wash their hands and write it off as a Hibernate problem so no solution or explanation to be found there.
There's a detailed explanation of why it works the way it works in the comments on that ticket. Solutions are even provided: workarounds, and suggestions to bring this up with the part of the stack that has control over this behavior. (Shuts faucet, reaches for a towel.)

Can NHibernate load data from 2nd level cache after session is closed?

I am trying out what I can do with NHibernate for my app. I use quite a lot of "dictionaries" to store all possible values for certain object properties. I've tried playing with the 2nd level cache to store those dictionaries' data. Now I wonder if there is a way to load the needed data from the cache after the session is closed. Let's say this is my code:
public class Class1 {
    public virtual int Id { get; set; }
    public virtual Dic1 Dic { get; set; }
}

public class Dic1 {
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}
and here are the mappings:
<class name="Class1" table="class1">
    <id name="Id" column="id">
        <generator class="native" />
    </id>
    <!-- I want to try not to use fetch="join" here -->
    <many-to-one name="Dic" class="Dic1" column="dic1_id" />
</class>

<class name="Dic1" table="dic1">
    <cache usage="read-write" />
    <id name="Id" column="id">
        <generator class="native" />
    </id>
    <property name="Name" column="name" />
</class>
If I get the value of the Class1.Dic object before I close the session, NHibernate does not send a query to the database, because the value was cached by some earlier query.
But let's say I've closed the session. In a debug session, Class1.Dic is an object of Dic1Proxy type, and I get an exception when I try to access it or its properties. Is there a way to load that data after the session is closed? The 2nd level cache belongs to the session factory, so maybe there is a way to actually turn that proxy into the real object? Or to force those values to always be loaded, without changing the fetch method to join?
You can use NHibernateUtil.Initialize(class1.Dic); before closing the session. It does nothing if the object is not actually a proxy or is already loaded; otherwise it loads it (from the second level cache if cached).
You can also force eager fetching while keeping the default select fetch mode: set lazy to false on the many-to-one mapping of the Dic property. Loading a Class1 will then immediately trigger a load of the Dic property, fetching it from the second level cache if it is there. Beware that this will cause n+1 load issues if you query a list of Class1 and their Dic property is not cached, even if you have enabled batching of lazy loads.
Otherwise, if you do not want to do any kind of manipulation of your Dic property before closing the session, you need to change the proxy implementation so that it first checks the second level cache before failing when the session is already closed. But in my opinion this requires too much work to be worth it. (Moreover, what if the entity is missing from the cache anyway? Is it acceptable for your application to fail in such a case?)
NHibernate allows you to supply your own factory of proxy factories (IProxyFactoryFactory) by using the optional proxyfactory.factory_class setting, provided you use the default bytecode provider.
Then you would need to implement your own IProxyFactoryFactory, which would likely mostly be a copy of StaticProxyFactoryFactory with BuildProxyFactory yielding a custom proxy factory.
The custom proxy factory would itself likely be mostly a copy of StaticProxyFactory, with GetProxy using a custom ILazyInitializer at this line.
The custom lazy initializer would in turn be likely a copy of LiteLazyInitializer, but with an override of Initialize. Its implementation is here.
That is the easy part; up to there, it is not as bad as it sounds and does not involve copying many lines of code.
Now for the override of Initialize: it would need to check the Session property and act accordingly, calling its base implementation if the session is usable, or trying to load directly from the second level cache otherwise.
Here you would have more code to duplicate, mainly LoadFromSecondLevelCache and AssembleCacheEntry.
You will also need the persister, which is easy to get if you have the session factory: sessionFactory.GetEntityPersister(EntityName). (ILazyInitializer has an EntityName property.)
As you can see by checking their code, these functions use the session at many points:
CacheMode: checked to ensure the cache is enabled for the session; you can surely skip that check for your use case.
GenerateCacheKey: easy to inline out of the session, see its code.
Timestamp: using sessionFactory.Settings.CacheProvider.NextTimestamp() instead should do it.
Instantiate: use subclassPersister.Instantiate(id) instead. (Unless you have an interceptor to which it should be delegated.)
The other calls are more troublesome.
You will have to give up the safeguard against circular references, as that uses the session persistence context.
Then there is the Assemble and DeepCopy logic, which uses the session. Many cases just call its Factory property, so depending on the property types of your entities, a dummy session that merely supplies the factory may do.
Skip the readonly stuff if possible, otherwise you will have still more work to do.
Skip most of the persistenceContext stuff: that is the session's first level cache. Still, there is the InitializeNonLazyCollections call, which will be missing if your entities have some.
About AfterInitialize: this call is currently needed for handling entities that have lazy properties (properties that are not an entity or a collection), so you may be able to skip it.
And finally, the PostLoadEvent is there for entities implementing ILifecycle: again, you may be able to skip it, provided you do not use ILifecycle.
If you also have some cached collections to retrieve from the second level cache without a session, you will need to do similar work with the collection type factory, configurable with the collectiontype.factory_class setting, providing your own ICollectionTypeFactory that yields collection types overriding Initialize, likewise duplicating the loading from the second level cache.
Good luck if you try this.

spring transaction of JdbcTemplate/HibernateTemplate and HibernateDaoSupport/JdbcDaoSupport

How are transactions controlled when using JdbcTemplate/HibernateTemplate and HibernateDaoSupport/JdbcDaoSupport? I have checked the source code and couldn't find where the transaction is controlled by JdbcTemplate/HibernateTemplate or HibernateDaoSupport/JdbcDaoSupport.
Also, in the source code HibernateDaoSupport/JdbcDaoSupport uses JdbcTemplate/HibernateTemplate. What is the role of HibernateDaoSupport/JdbcDaoSupport, and what is the role of JdbcTemplate/HibernateTemplate?
Why do we use JdbcTemplate/HibernateTemplate and HibernateDaoSupport/JdbcDaoSupport? It seems all the sample code uses them. What should I use if I don't want to use them, for example only Spring + Hibernate?
If I'm using JdbcTemplate/HibernateTemplate and HibernateDaoSupport/JdbcDaoSupport, do I still need to configure a transaction proxy in XML? If so, does that mean it's OK to put getHibernateTemplate().saveOrUpdate(user) and getHibernateTemplate().saveOrUpdate(order) together, so that they're invoked in the same transaction? Is this right?
First of all, please forget about HibernateTemplate and HibernateDaoSupport; these classes should be considered deprecated since the release of Hibernate 3.0.1 (which was somewhere in 2006!). You should be creating DAOs/repositories based on the plain Hibernate API, as explained in the Spring Reference Guide. (The same goes for JpaTemplate and JpaDaoSupport.)
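As a rough sketch of what such a repository can look like with the plain Hibernate API (the class and method names below are made up for illustration):

import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;

// Illustrative DAO built directly on the Hibernate API, no HibernateTemplate involved.
@Repository
public class GenericHibernateDao {

    @Autowired
    private SessionFactory sessionFactory;

    public void saveOrUpdate(Object entity) {
        // getCurrentSession() joins the transaction managed by Spring's
        // HibernateTransactionManager (e.g. via @Transactional on the service).
        sessionFactory.getCurrentSession().saveOrUpdate(entity);
    }

    public Object get(Class<?> type, java.io.Serializable id) {
        return sessionFactory.getCurrentSession().get(type, id);
    }
}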
The intent of JdbcTemplate (and all other *Template classes) is to make it easier to work with the underlying technology. Once upon a time this was also needed for Hibernate (< 3.0.1); now it isn't.
JdbcTemplate makes it easier to work with plain JDBC code. You don't have to get a connection, create a (Prepared)Statement, add the parameters, execute the query, iterate over the ResultSet and convert it. With JdbcTemplate much of this is hidden, and most of it can be written in one to three lines of code, whereas plain JDBC would require a lot more.
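For example, here is a small sketch of the kind of one-liner this enables (the table and column names are made up):

import java.util.List;
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;

// Illustrative DAO: JdbcTemplate handles the connection, statement,
// execution and ResultSet mapping behind this single call.
public class UserNameDao {

    private final JdbcTemplate jdbcTemplate;

    public UserNameDao(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    public List<String> findAllUserNames() {
        return jdbcTemplate.queryForList("SELECT name FROM users", String.class);
    }
}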
The *Support classes make it easier to gain access to a template, but they aren't a must. Creating a JdbcTemplate is quite easy and you don't really need to extend JdbcDaoSupport, but you can if you want. For more information, a lot is explained in the reference guide.

Spring Data JPA save() throws NPE

I wrote a web service with Spring Boot, using Spring Data JPA for persistence.
The web service has some static objects (in a singleton bean) that regularly need to be backed up to my database.
Sometimes! (This sucks... I don't really know what happens.) When I call
ObjectType updated = myRepository.save(existingObject);
I get a java.lang.NullPointerException, without a usable stack trace, as the method doing this runs via @Scheduled.
I tried debugging, and existingObject seems to be absolutely fine. The error only occurs when existingObject is actually NOT a new object (i.e. when id != 0).
P.S. I am using Spring Boot and therefore not really using the EntityManager directly. I only use the @Autowired myRepository.
I'm seeing something similar happening. During save, it seems the object is re-fetched from the DB (perhaps to see which fields were altered?), but a ManyToOne relationship is not loaded (even though the FetchType is explicitly set to EAGER).
For some reason, a compareTo is called on the relationship. The related object isn't null, but it only has its ID filled in (presumably because that was available in the object fetched from the DB). All other fields are null.
When the compareTo then does its stuff, a NullPointerException follows.
As for the actual solution, I don't know yet, as I would have expected FetchType.EAGER to make sure the relationship is loaded. Hopefully this helps someone find the root cause.
(I would have added this as a comment as it doesn't actually answer the question, but StackOverflow won't let me due to insufficient reputation...)
You haven't provided enough information. If that line is where the NullPointerException is occurring, then the only possibilities are that myRepository is null or existingObject is null. However, it's possible the NullPointerException happens as a result of something inside the save. Wrap the code in a try/catch and log the exception stack trace to a file.
If needed, check out the logging customization notes here:
http://projects.spring.io/spring-boot/docs/spring-boot/README.html
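A minimal sketch of that suggestion (SLF4J assumed; the wrapper class is hypothetical, and the Runnable stands in for the actual repository.save(...) call inside your @Scheduled method):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Wrap the scheduled work so the full stack trace ends up in the log file.
public class LoggingTaskWrapper {

    private static final Logger log = LoggerFactory.getLogger(LoggingTaskWrapper.class);

    public static void runLogged(Runnable work) {
        try {
            work.run();
        } catch (RuntimeException e) {
            log.error("Scheduled backup failed", e);
            throw e; // rethrow so the scheduler still sees the failure
        }
    }
}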

Dynamic creation of beans in Spring

Is there a way in Spring to read the fields of a bean from a DB table and create a complete bean class, with getters and setters, on server startup?
I need this to make my application completely configurable: if I have to add a new field in the future, all I would need to do is add a column in the DB, and the bean's getters and setters would be available to me.
Thanks
You could try approaches for dynamically registering beans. You could use BeanDefinitionBuilder for this purpose; see a sample here. But as @Darren says, it's not a wise idea to create a bean via a DB lookup.
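Here is a minimal sketch of registering a bean definition programmatically with BeanDefinitionBuilder (the class, bean name and property value below are made up; in your scenario they would come from the DB lookup at startup):

import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;

public class DynamicBeanRegistrar {

    public static void register(BeanDefinitionRegistry registry) {
        // Build and register a bean definition at runtime.
        BeanDefinitionBuilder builder = BeanDefinitionBuilder
                .genericBeanDefinition(ConfigurableBean.class)
                .addPropertyValue("label", "value read from the DB");
        registry.registerBeanDefinition("configurableBean", builder.getBeanDefinition());
    }

    // Placeholder bean type; note that an existing class with setters is still required.
    public static class ConfigurableBean {
        private String label;
        public String getLabel() { return label; }
        public void setLabel(String label) { this.label = label; }
    }
}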
1: Improve your accept-rate
2: You might benefit from something like an ORM approach (Hibernate or JPA). Another slightly different approach that might suit you is the Active Record pattern as implemented in, for instance, ActiveJDBC.
Spring does not, in itself, offer anything like what you are after, but using spring-jpa together with Hibernate might get you a bit closer to your goal. If, OTOH, you want auto-generated code, you could also look at something like Spring Roo.
You might want to think about this a little more. Even if you made your fields totally configurable, you would still have to write the code that accesses them. And given that you are going to have to write code anyway, you might as well keep everything in code. It's much simpler that way.
