Why does Hibernate flush on select queries (EmptyInterceptor)? - spring

I would like to understand a counterintuitive Hibernate behaviour I am seeing. I always thought that "flush" meant that Hibernate had an in-memory data structure that has to be written to the DB. This is not what I am seeing.
I have created the following Interceptor:
public class FeedInterceptor extends EmptyInterceptor
{
    @Override
    public void postFlush(Iterator entities)
    {
        System.out.println("postFlush");
    }
}
Registered it in my ApplicationContext
<bean id="sessionFactory" class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean">
    <property name="entityInterceptor">
        <bean class="interceptor.FeedInterceptor"/>
    </property>
</bean>
But, strange enough, I see "postFlush" written to the console for every row retrieved from the DB from my DAO:
Session session = sessionFactory.getCurrentSession();
Query query = session.createQuery("from Feed feed");
query.list();
Why is that?

Let's assume Hibernate didn't flush the session; then you could have the following situation:
Person p = new Person();
p.setName("Pomario");
dao.create(p);

Person pomario = dao.findPersonByName("Pomario");
// pomario is null?
When finding a person by name, Hibernate issues a select statement to the database. If it doesn't send the previous statements first, the returned result might not be consistent with changes made earlier in the session: here the database hasn't received the create statement yet, so it returns an empty result set.
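If you really want to suppress the automatic flush before queries, Hibernate lets you change the session's flush mode. A minimal sketch (using the Hibernate 3 Session API as in the question), at the cost of queries possibly seeing stale in-session state:

```java
Session session = sessionFactory.getCurrentSession();
// MANUAL: flush only when flush() is called explicitly, so running a
// query no longer triggers a flush (and postFlush on the interceptor).
session.setFlushMode(FlushMode.MANUAL);
Query query = session.createQuery("from Feed feed");
query.list();
```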

Related

Spring Transactions not working with @Transactional

I have following code...
@Transactional(rollbackFor = Exception.class)
public String log(AuditRecord aRecord) throws AuditException {
    auditMaster(aRecord); // 1. has an audit id and inserts into Audit table
    System.out.println(aRecord.getAuditId());
    if (true)
        throw new AuditException(Error.DUMMY_ERROR);
    auditSingleValued(aRecord); // 2.
    auditMultiValued(aRecord);  // 3.
}
The three auditXXX() methods in the log function insert into different tables. Since I have declared @Transactional on the log method, I am expecting the insert in the auditMaster() method (at 1.) to be rolled back because of the exception in the if(true) block. But that is not happening: when I check my database, it has the new audit id entry that the System.out.println is printing.
What can be done to achieve the rollback and make the log() method as a whole an atomic transaction?
Do I need to declare @Transactional on the individual auditXXX methods as well?
My bean configuration has this entry...
<bean id="txManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource" ref="dataSource"/>
</bean>

jpa fetching data from cache not database (eclipselink)

Hi, I am working on a small project using JPA (EclipseLink). I am running two instances of the same application from different Tomcats, both using the same schema. But if I change a value from one application, the other application does not fetch the data from the database; it always returns its local data. I have also tried @Cacheable(false) on the entity class and
<property name="eclipselink.cache.shared.default" value="false" />
<property name="eclipselink.query-results-cache" value="false"/>
in the XML file, but I still do not get the latest data from the database. My code is as follows:
EntityManager entityManager = GlobalBean.store.globalEntityManager();
String queryString = "select g from GlobalUrnConf g";
TypedQuery<GlobalUrnConf> globalUrnConfQuery =
        entityManager.createQuery(queryString, GlobalUrnConf.class);
return globalUrnConfQuery.getSingleResult();
I am fetching entity manager and factory in following ways -
public EntityManagerFactory factory() {
    if (this.entityManagerFactory == null) {
        entityManagerFactory = Persistence.createEntityManagerFactory("FileUpload");
    }
    return this.entityManagerFactory;
}

public EntityManager globalEntityManager() {
    if (this.entityManager == null) {
        this.entityManager = factory().createEntityManager();
    }
    return this.entityManager;
}
Please help me. Thanks in advance.
You are caching the EntityManager, which is required to cache and hold onto the managed entities that are read through it, both for identity purposes and to manage changes. You can get around this with a refresh query hint, or other query hints that bypass the caches, but it is probably better to manage the EntityManager's lifecycle and cache more directly: obtain one only as needed, or clear it at logical points when the returned entities can be released. Try calling EntityManager.clear(), for instance.
return globalUrnConfQuery
.setHint(QueryHints.CACHE_USAGE, CacheUsage.CheckCacheOnly)
.getSingleResult();
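A minimal sketch of the clear() approach mentioned above (class and field names taken from the question's code):

```java
EntityManager entityManager = GlobalBean.store.globalEntityManager();
try {
    TypedQuery<GlobalUrnConf> query =
            entityManager.createQuery("select g from GlobalUrnConf g", GlobalUrnConf.class);
    return query.getSingleResult();
} finally {
    // Detach all managed entities, so the next query on this long-lived
    // EntityManager is not served stale state from its persistence context.
    entityManager.clear();
}
```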

How to use low-level driver APIs with Spring Data MongoDB

I am using Spring Data MongoDB, but I don't want to map my result to a domain class. Also, I want to access low-level MongoDB APIs in a few cases, while still letting Spring manage connection pooling etc.
How can I get an instance of com.mongodb.MongoClient to perform low-level operations?
Here is what I am trying to do :
MongoClient mongoClient = new MongoClient();
DB local = mongoClient.getDB("local");
DBCollection oplog = local.getCollection("oplog.$main");
DBCursor lastCursor = oplog.find().sort(new BasicDBObject("$natural", -1)).limit(1);
Or I simply want a JSON object / DBCursor / DBObject.
You can do it this way:
@Autowired MongoDbFactory factory;

DB local = factory.getDB("local");
DBCollection oplog = local.getCollection("oplog.$main");
DBCursor lastCursor = oplog.find().sort(new BasicDBObject("$natural", -1)).limit(1);
where MongoDbFactory is an interface provided by spring-data-mongodb that obtains a com.mongodb.DB object, giving access to all the functionality of a specific MongoDB database instance.
Your configuration file should contain this information:
<bean id="mongoFactoryBean"
      class="org.springframework.data.mongodb.core.MongoFactoryBean">
    <property name="host" value="127.0.0.1"/>
    <property name="port" value="27017"/>
</bean>
<bean id="mongoDbFactory"
      class="org.springframework.data.mongodb.core.SimpleMongoDbFactory">
    <constructor-arg name="mongo" ref="mongoFactoryBean"/>
    <constructor-arg name="databaseName" value="local"/>
</bean>
Done this way, Spring keeps managing your connection pool.
You usually perform low level access through MongoTemplate's execute(…) methods that take callbacks giving you access to the native Mongo driver API.
class MyClient {

    private final MongoOperations operations;

    @Autowired
    public MyClient(MongoOperations mongoOperations) {
        this.operations = mongoOperations;
    }

    void yourMethod() {
        operations.execute(new CollectionCallback<YourDomainClass>() {
            public YourDomainClass doInCollection(DBCollection collection) {
                // here goes your low-level code
                return null;
            }
        });
    }
}
The advantage of this template approach is that the MongoTemplate instance backing the MongoOperations interface will still take care of all resource management and exception translation (converting all Mongo specific exceptions into Spring's DataAccessException hierarchy).
However, for your concrete example you could just go ahead and do the following directly:
Query query = new Query().with(new Sort(DESC, "$natural")).limit(1);
List<DBObject> result = operations.find(query, DBObject.class, "oplog.$main");
Here you can mix and match the type you pass into the find(…) method to let the template convert the result into a Map or a domain object if needed. As indicated above you also get resource management and exception translation which your sample code above is missing.

Spring Transaction propagation nested Propagation REQUIRES_NEW in Propagation.REQUIRED

I'm using Spring 3.2 integrated with Hibernate 4.
Here is my code:
@Service
public class MyService {

    @Autowired
    private NestedService ns;

    @Transactional(propagation = Propagation.REQUIRED)
    public void outer() {
        while (true) {
            dao.findOne(); // finds data from the db using a Hibernate HQL query
            ns.inner();    // inserts some data, commits, and loops again
        }
    }
}

@Service
public class NestedService {

    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void inner() {
        // insert some data into the db using Hibernate
    }
}
Here is the Spring config XML:
<tx:annotation-driven transaction-manager="transactionManager"/>
<bean id="transactionManager" class="org.springframework.orm.hibernate4.HibernateTransactionManager">
    <property name="sessionFactory" ref="sessionFactory"/>
</bean>
And here is my question: before I run this program there is no data in the db, so dao.findOne() returns null in the first loop iteration. After ns.inner() executes, some data has been inserted into the db and committed (which is where I think REQUIRES_NEW comes in). Yet when the second loop iteration begins, dao.findOne() still returns null; the outer transaction cannot see the data inserted by the inner one. Why?
Thanks!!
There is already an ongoing transaction, which basically has its own version of the data; the newly added data is not visible to that transaction. On top of that you have Hibernate in the mix, which uses caching: depending on what is executed, the query may run only once, with subsequent calls simply returning the cached values (within the same transaction/session, for instance).
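If the outer loop genuinely needs to see rows committed by inner(), the first-level cache has to be bypassed. A minimal sketch (hypothetical access to the Hibernate Session inside the DAO; MyEntity is a placeholder), with the caveat that visibility across transactions still depends on the database's isolation level:

```java
Session session = sessionFactory.getCurrentSession();
// Detach everything cached in this session, so the next query actually
// hits the database instead of returning previously loaded entities.
session.clear();
Object fresh = session.createQuery("from MyEntity e").uniqueResult();
```

With READ_COMMITTED isolation, the re-executed query can then see rows committed by the REQUIRES_NEW transaction.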
Links
Can/Should spring reuse hibernate session for sub transaction
Transaction Isolation
Visibility of objects in different hibernate sessions

HibernateTemplate save performs inserts but not updates

I have a typical Spring / Hibernate setup. Here's my spring config:
<context:annotation-config />
<context:component-scan base-package="com.myco.myapp.modules" />
<tx:annotation-driven transaction-manager="transactionManager"/>
<bean id="sessionFactory"
...
</bean>
<bean id="transactionManager"
class="org.springframework.orm.hibernate3.HibernateTransactionManager">
<property name="sessionFactory" ref="sessionFactory" />
</bean>
I have a BaseRepository:
@Transactional(propagation = Propagation.MANDATORY)
public final T save(final T entity) throws RepositoryException {
    try {
        getHibernateTemplate().save(entity);
        return entity;
    } catch (DataAccessException e) {
        throw new EntityCouldNotBeSavedException(getPersistentClass(), e);
    }
}
And a Repository class that extends it:
@Repository
public class PersonRepositoryImpl extends BaseRepositoryImpl<Person, String>
And a Service:
@Service
public class PersonServiceImpl {

    @Autowired
    private PersonRepository _personRespository;
I call the following method, saveSomeStuff(), and when I insert using BaseRepository.save() it works perfectly. But when I try to update, it doesn't make the change:
@Override
@Transactional
public void saveSomeStuff() {
    try {
        Person existingPerson = _personRespository.findById("1");
        existingPerson.setName("John");
        _personRespository.save(existingPerson);

        Person dbExistingPerson = _personRespository.findById("1");
        // This prints "John".
        System.out.println(dbExistingPerson.getName());

        Person newPerson = new Person();
        newPerson.setName("Jack");
        _personRespository.save(newPerson);
    } catch (RepositoryException e) {
        e.printStackTrace();
    }
}
I thought I might have a transactionality problem, but as I said, upon leaving the service method the new Person is persisted in the database. In the log I see:
insert into person ...
However, the update I made is not persisted, and there is no error and no 'update' sql statement in the log. I thought the HibernateTemplate.save() method might be the problem but from within the saveSomeStuff() method, after loading the Person from the database, I do a System.out, and the Person loaded from the database has the updated name.
What am I missing here?
There is a separate method, saveOrUpdate(entity). You can use it if you don't want Hibernate to generate an id while saving.
The save() method persists an entity: it assigns an identifier if one doesn't exist (if one does, it essentially does an update) and returns the generated ID of the entity.
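For illustration, a hedged sketch of saveOrUpdate() inside the question's repository (the surrounding method shape mirrors the save() shown earlier):

```java
// saveOrUpdate() inspects the identifier: a transient/unsaved id leads
// to an INSERT, while an existing id results in an UPDATE of the instance.
getHibernateTemplate().saveOrUpdate(entity);
```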
Figured out the problem. If I had included my Entity class, someone probably would have seen it sooner than me.
@Entity
@Cache(usage = CacheConcurrencyStrategy.READ_ONLY)
@Immutable
@Table(name = "PEOPLE")
public class Person {
    ...
}
Initially I was getting a cache error:
java.lang.UnsupportedOperationException: Can't write to a readonly object
The quick solution? Add the #Immutable annotation. But if you read the docs for it:
An immutable entity may not be updated by the application.
Updates to an immutable entity will be ignored, but no exception is thrown.
Which explains why 1) updates were being ignored and 2) no exceptions were being thrown.
So I got rid of the @Immutable annotation and changed the cache annotation to:
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE)
And now everything works fine.
In summary: rtfm.
I had stumbled upon the same problem. The entity was getting inserted into the database, but when updating, some of the columns were not getting updated and there were no errors in the log. After going through the entity class, I figured out that I had annotated some of my fields as below:
@Column(name = "CREATED_DT", updatable = false)
private Date createdOn;
After removing the updatable attribute from the annotation, the update was working fine.
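For reference, a minimal sketch of the corrected mapping (field and column names are from the snippet above):

```java
// With updatable omitted (it defaults to true), the column is included
// in generated UPDATE statements again.
@Column(name = "CREATED_DT")
private Date createdOn;
```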
