Spring transactions not working with @Transactional - spring

I have the following code...
@Transactional(rollbackFor = Exception.class)
public String log(AuditRecord aRecord) throws AuditException {
    auditMaster(aRecord); // 1. has an audit id and inserts into the Audit table
    System.out.println(aRecord.getAuditId());
    if (true)
        throw new AuditException(Error.DUMMY_ERROR);
    auditSingleValued(aRecord); // 2.
    auditMultiValued(aRecord);  // 3.
    return aRecord.getAuditId(); // assuming getAuditId() is a String; the method needs a return value
}
The three auditXXX() methods in the log function insert into different tables. Since I have declared @Transactional on the log method, I expect the insert done by the auditMaster() method (at 1.) to be rolled back because of the exception thrown in the if(true) block. But that is not happening: when I check my database, it contains the new audit id entry that the System.out.println is printing.
What can be done to achieve the rollback and make the log() method one atomic transaction as a whole?
Do I need to declare @Transactional on the individual auditXXX methods as well?
My bean configuration has this entry...
<bean id="txManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource" ref="dataSource"/>
</bean>

Related

Get the object which failed validation in Spring Batch

I have a task to process input .csv and .txt files and store the data in a database. I am using Spring Batch for this purpose. Before dumping the data into the database, I have to perform some validation checks on it. I am using Spring Batch's ValidatingItemProcessor and Hibernate's JSR-303 reference implementation, hibernate-validator, for this. The code looks something like:
public class Person {
    @Pattern(regexp = "someregex")
    String name;
    @NotNull
    String address;
    @NotNull
    String age;
    // getters and setters
}
And then I wrote a validator which looks something like this --
import java.util.Set;

import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.ValidatorFactory;

import org.springframework.batch.item.validator.ValidationException;
import org.springframework.batch.item.validator.Validator;
import org.springframework.beans.factory.InitializingBean;

class MyBeanValidator implements Validator<Person>, InitializingBean {

    private javax.validation.Validator validator;

    @Override
    public void afterPropertiesSet() throws Exception {
        ValidatorFactory validatorFactory = Validation.buildDefaultValidatorFactory();
        validator = validatorFactory.usingContext().getValidator();
    }

    @Override
    public void validate(Person person) throws ValidationException {
        Set<ConstraintViolation<Person>> constraintViolations = validator.validate(person);
        if (constraintViolations.size() > 0) {
            generateValidationException(constraintViolations);
        }
    }

    private void generateValidationException(Set<ConstraintViolation<Person>> constraintViolations) {
        StringBuilder message = new StringBuilder();
        for (ConstraintViolation<Person> constraintViolation : constraintViolations) {
            message.append(constraintViolation.getMessage()).append("\n");
        }
        throw new ValidationException(message.toString());
    }
}
And then I have a processor which subclasses Spring Batch's ValidatingItemProcessor.
public class ValidatingPersonItemProcessor extends ValidatingItemProcessor<Person> {
    @Override
    public Person process(Person person) {
        // some code
        return person;
    }
}
The records that pass the validation checks are passed on to another processor for further processing, while the failed ones should be cleaned up and then passed on to the next processor.
Now I want to get hold of the records which failed validation. My objective is to report all input records that failed validation and clean those records before passing them on to the next processor for further processing. How can I achieve this?
Will the Spring Batch process terminate if validation fails for some input? If so, how do I avoid that? My processor configuration looks something like:
<batch:chunk reader="personItemReader" writer="personDBWriter" processor="personProcessor"
             commit-interval="100" skip-limit="100">
    <batch:skippable-exception-classes>
        <batch:include class="org.springframework.batch.item.validator.ValidationException"/>
    </batch:skippable-exception-classes>
    <batch:listeners>
        <batch:listener>
            <bean class="org.someorg.poc.batch.listener.PersonSkipListener" />
        </batch:listener>
    </batch:listeners>
</batch:chunk>

<bean id="personProcessor" class="org.springframework.batch.item.support.CompositeItemProcessor">
    <property name="delegates">
        <list>
            <ref bean="validatingPersonItemProcessor" />
            <ref bean="personVerifierProcessor" />
        </list>
    </property>
</bean>

<bean id="validatingPersonItemProcessor" class="org.someorg.poc.batch.processor.ValidatingPersonItemProcessor" scope="step">
    <property name="validator" ref="myBeanValidator" />
</bean>

<bean id="myBeanValidator" class="org.someorg.poc.batch.validator.MyBeanValidator" />

<bean id="personVerifierProcessor" class="org.someorg.poc.batch.processor.PersonVerifierProcessor" scope="step"/>
</beans>
I guess your validatingPersonItemProcessor bean has its validator property set to your myBeanValidator, so the exception will be thrown by the processor.
Create your own SkipListener. In its onSkipInProcess() method you put the logic for what happens when an item fails validation (write it to a file, a DB, etc.).
You need to add the ValidationException you throw to <batch:skippable-exception-classes> so it is caught (and doesn't terminate your batch), and add your SkipListener to <batch:listeners>, so it will be called when the exception is thrown.
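For illustration, a minimal sketch of such a SkipListener, assuming the Person item type from the question (and that Person exposes a getName() getter); how you report or clean the skipped record is up to you:
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.batch.core.SkipListener;

public class PersonSkipListener implements SkipListener<Person, Person> {

    private static final Log LOG = LogFactory.getLog(PersonSkipListener.class);

    @Override
    public void onSkipInRead(Throwable t) {
        LOG.warn("Item skipped during read", t);
    }

    @Override
    public void onSkipInProcess(Person item, Throwable t) {
        // Called when the processor (here: the validator) throws a skippable exception.
        // Report the failed record however you like: log it, write it to a file or a DB table, etc.
        LOG.warn("Validation failed for person '" + item.getName() + "': " + t.getMessage());
    }

    @Override
    public void onSkipInWrite(Person item, Throwable t) {
        LOG.warn("Item skipped during write: " + item, t);
    }
}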
EDIT: Answer to comment.
If your processor is a ValidatingItemProcessor and you set the validator, it should automatically call validate(). However, if you make your own processor by extending ValidatingItemProcessor, you should explicitly call super.process(yourItem); (the process() of ValidatingItemProcessor) to validate your item.
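For instance, a minimal sketch of such a subclass (the post-validation logic is just a placeholder):
import org.springframework.batch.item.validator.ValidatingItemProcessor;
import org.springframework.batch.item.validator.ValidationException;

public class ValidatingPersonItemProcessor extends ValidatingItemProcessor<Person> {

    @Override
    public Person process(Person person) throws ValidationException {
        // Delegate to ValidatingItemProcessor.process(), which runs the configured validator
        // and throws ValidationException (skippable per the chunk configuration) on failure.
        super.process(person);

        // ... any additional processing for items that passed validation ...
        return person;
    }
}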

Are my app's Controllers thread safe? Spring 4.1

I am developing an app in Spring 4.1. I know that Controllers, like any other bean in Spring, are singletons by default, which means the same instance of a Controller will be used to process multiple concurrent requests. Up to here I am clear. I want to confirm: do I need to explicitly set @Scope("prototype") or request scope on the Controller class? I read in a previous StackOverflow post that even if the scope is not set to request/prototype, the Spring container can process each request individually based on the @RequestParam values passed or the @ModelAttribute associated with the method arguments.
So I want to confirm: is my code below safe for handling multiple requests concurrently?
@Controller
public class LogonController {

    /** Logger for this class and subclasses */
    protected final Log logger = LogFactory.getLog(getClass());

    @Autowired
    SimpleProductManager productManager;

    @Autowired
    LoginValidator validator;

    @RequestMapping("logon")
    public String renderForm(@ModelAttribute("employee") Logon employeeVO) {
        return "logon";
    }

    @RequestMapping(value = "Welcome", method = RequestMethod.POST)
    public ModelAndView submitForm(@ModelAttribute("employee") Logon employeeVO,
                                   BindingResult result) {
        // Check validation errors
        validator.validate(employeeVO, result);
        if (result.hasErrors()) {
            return new ModelAndView("logon");
        }
        if (!productManager.chkUserValidation(employeeVO.getUsername(), employeeVO.getPassword())) {
            return new ModelAndView("logon");
        }
        ModelAndView model = new ModelAndView("Welcome");
        return model;
    }
}
I also have another doubt.
Since I am using SimpleProductManager productManager;, do I need to specify scope="prototype" in its bean declaration in app-servlet.xml?
Below is my configuration XML:
<bean id="mySessionFactory" class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
<property name="dataSource"><ref bean="dataSource"/></property>
<property name="configLocation" value="classpath:hibernate.cfg.xml" />
</bean>
<bean id="productManager" class="com.BlueClouds.service.SimpleProductManager" >
<property name="productDao" ref="productDao"/>
</bean>
<bean id="productDao" class="com.BlueClouds.dao.HbmProductDao">
<property name="sessionFactory"><ref bean="mySessionFactory"/></property>
</bean>
<bean id="loginValidator" class="com.BlueClouds.service.LoginValidator" >
</bean>
Being a singleton, a single instance of the validator is shared among all requests. Because of that, do I need to add scope="request" in the bean configuration XML, or do I need to surround validate() with a synchronized block? Please advise.
Thanks much.
You can tell whether your code is thread safe by answering the following questions:
Could several threads modify a static field that is not itself thread safe (e.g. an ArrayList) at the same time?
Could several threads modify a field of the same instance that is not itself thread safe at the same time?
If the answer to either question is yes, then your code is not thread safe.
Since your code doesn't change any fields, it should be thread safe.
The general idea of thread safety is that if threads might change or access the same memory section at the same time, then the code is not thread safe, which means synchronization is needed.
You'd do well to learn more about stack memory, heap memory and global memory in Java, so that you can understand whether your code changes the same memory section at the same time or not.
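As an illustration (not part of the original answer), here is a hypothetical controller sketch contrasting the two cases: a mutable instance field shared across requests is unsafe, while method parameters and local variables are confined to each request's own thread:
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;

@Controller
public class HitController {

    // NOT thread safe: a mutable instance field is shared by every request,
    // and ++ is not atomic, so concurrent requests can lose updates.
    private int hits = 0;

    @RequestMapping("unsafe")
    public String unsafe(Model model) {
        hits++;
        model.addAttribute("hits", hits);
        return "hits";
    }

    // Thread safe: only method parameters and local variables are used;
    // each request thread gets its own copies of these on its own stack.
    @RequestMapping("safe")
    public String safe(@RequestParam("name") String name, Model model) {
        String greeting = "Hello " + name; // local variable, never shared
        model.addAttribute("greeting", greeting);
        return "greeting";
    }
}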

Spring transaction propagation: Propagation.REQUIRES_NEW nested in Propagation.REQUIRED

I'm using Spring 3.2 integrated with Hibernate 4.
Here is my code:
@Service
public class MyService {

    @Autowired
    private NestedService ns;

    @Autowired
    private MyDao dao; // assumed field; the DAO type is not shown in the question

    @Transactional(propagation = Propagation.REQUIRED)
    public void outer() {
        while (true) {
            dao.findOne(); // this method finds data in the db using a Hibernate HQL query
            ns.inner();    // insert some data and commit, then loop again
        }
    }
}

@Service
public class NestedService {

    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void inner() {
        // here insert some data into the db using Hibernate
    }
}
Here is the Spring config XML:
<tx:annotation-driven transaction-manager="transactionManager"/>
<bean id="transactionManager" class="org.springframework.orm.hibernate4.HibernateTransactionManager">
    <property name="sessionFactory" ref="sessionFactory"/>
</bean>
And here is my question:
Before I run this program there is no data in the db, so dao.findOne() returns null in the first loop iteration. But after ns.inner() executes, some data has been inserted into the db and committed (which is where I think REQUIRES_NEW does its work). Yet when the second loop iteration begins, dao.findOne() still returns null; the outer transaction cannot see the data inserted by the inner one. Why?
Thanks!!
There is already an ongoing transaction, which basically has its own version of the data, and the newly added data is not visible to that transaction. On top of that you have Hibernate in the mix, which uses caching: depending on what is executed, the query may only be executed once, and on subsequent calls Hibernate simply returns the cached values (within the same transaction/session, for instance).
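As a rough illustration of the caching part (my own sketch, with a hypothetical MyEntity and an injected sessionFactory, not code from the original answer): clearing the Hibernate session before re-running the query forces the entity state to be re-read from the database, although what the outer transaction can then see still depends on its isolation level:
Session session = sessionFactory.getCurrentSession();

// First query: goes to the database; the loaded entities are cached in the session (first-level cache).
List first = session.createQuery("from MyEntity e").list();

// ... ns.inner() commits new rows in its own REQUIRES_NEW transaction ...

// Without clearing, a repeated query inside the same session may return stale,
// already-loaded entity instances from the first-level cache.
session.clear();

// Re-query after clearing: Hibernate reads the rows again; whether the newly
// committed rows are visible now depends on the outer transaction's isolation level.
List second = session.createQuery("from MyEntity e").list();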
Links
Can/Should spring reuse hibernate session for sub transaction
Transaction Isolation
Visibility of objects in different hibernate sessions

Why does Hibernate flush on select queries (EmptyInterceptor)?

I would like to understand a counter-intuitive Hibernate behaviour I am seeing. I always thought that "flush" meant that Hibernate had a data structure in memory that had to be written to the DB. That is not what I am seeing.
I have created the following Interceptor:
public class FeedInterceptor extends EmptyInterceptor {

    @Override
    public void postFlush(Iterator entities) {
        System.out.println("postFlush");
    }
}
Registered it in my ApplicationContext
<bean id="sessionFactory" class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean">
<property name="entityInterceptor">
<bean class="interceptor.FeedInterceptor"/>
</property>
</bean>
But, strangely enough, I see "postFlush" written to the console for every row retrieved from the DB by my DAO:
Session session = sessionFactory.getCurrentSession();
Query query = session.createQuery("from Feed feed");
query.list();
Why is that?
Let's assume Hibernate didn't flush the session; then you could have the following situation:
Person p = new Person();
p.setName("Pomario");
dao.create(p);

Person pomario = dao.findPersonByName("Pomario");
// pomario is null?
When finding a person by name, Hibernate issues a select statement to the database. If it doesn't send the previous statements first, the returned result might not be consistent with changes made earlier in the session: here the database hasn't received the create statement yet, so it returns an empty result set.
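As an aside (not part of the original answer), if you want to avoid the automatic flush before queries, you can change the session's flush mode, at the cost of queries not seeing pending in-session changes; a minimal sketch using the Feed query from the question:
// Default is FlushMode.AUTO: the session is flushed before queries whose results
// might be affected by pending changes, which is what triggers postFlush() here.
Session session = sessionFactory.getCurrentSession();
session.setFlushMode(FlushMode.COMMIT); // flush only at transaction commit

// This query no longer forces a flush, but it also won't see unflushed
// changes made earlier in this session.
List feeds = session.createQuery("from Feed feed").list();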

Spring @Transactional wrapping 2 methods

I'm a Spring newbie. I use the @Transactional annotation on my DAO methods:
@Transactional
public Person getById(long id) {
    return new Person(jdbcTemplate.queryForMap(...));
}

@Transactional
public void save(Person person) {
    jdbcTemplate.update(...);
}
and I've set up the transaction manager like this:
<tx:annotation-driven transaction-manager="txManager" />
<bean id="txManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="dataSource" />
</bean>
The problem is that when my client code calls dao.save(..) and then dao.getById(4), these happen in two separate transactions. How is it possible to wrap those two calls in the same database transaction? Ideally without doing it programmatically.
thanks
It is bad practice to put transactional attributes in the DAO layer. Also, I am not sure why you require a transaction for the getById method. Even if you want to use a transaction, you would need to specify the propagation behaviour, for example REQUIRES_NEW on a method that combines save and getById:
@Transactional(propagation = Propagation.REQUIRES_NEW, readOnly = false)
public Person saveAndGetById(Person person, long id) {
    save(person);
    return getById(id);
}

@Transactional(propagation = Propagation.REQUIRED)
public Person getById(long id) {
    return new Person(jdbcTemplate.queryForMap(...));
}

@Transactional(propagation = Propagation.REQUIRED, readOnly = false)
public void save(Person person) {
    jdbcTemplate.update(...);
}
However, the best thing would be to have the save method return the generated ID, because it is hard to know beforehand which ID the Person will have once persisted.
Good practice in this case would be to mark the service method which invokes both of these DAO methods as @Transactional; the case was clearly discussed here.
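For illustration, here is a minimal sketch of such a service method, assuming a hypothetical PersonDao that exposes the save(Person) and getById(long) methods from the question; both calls then share a single transaction that is committed or rolled back as a whole:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class PersonService {

    @Autowired
    private PersonDao dao; // hypothetical DAO exposing save(Person) and getById(long)

    // Both DAO calls run in the single transaction started here; the inner
    // @Transactional methods with REQUIRED propagation simply join it.
    @Transactional
    public Person saveAndReload(Person person, long id) {
        dao.save(person);
        return dao.getById(id);
    }
}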
