Spring Data lazy load fails with LazyInitializationException - spring

@Component
@Transactional
public class TestClass extends AbstractClass {

    @Autowired
    ClassARepo classARepo;

    @Override
    public void test() {
        ClassA classA = classARepo.findOne(1);
        List<ClassB> list = classA.getClassBs();
        list.size();
    }
}
ClassB is mapped as a @OneToMany association and loaded lazily.
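For reference, a minimal sketch of what such a mapping typically looks like (the field names, mappedBy attribute and id generation below are assumptions, not taken from the post):
@Entity
public class ClassA {

    @Id
    @GeneratedValue
    private Integer id;

    // LAZY is the default for @OneToMany; spelled out here for clarity
    @OneToMany(mappedBy = "classA", fetch = FetchType.LAZY)
    private List<ClassB> classBs;

    public List<ClassB> getClassBs() {
        return classBs;
    }
}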
In the test() method above,
classARepo.findOne(1);
executes correctly, but
List<ClassB> list = classA.getClassBs();
list.size();
fails with a LazyInitializationException.
public interface ClassARepo extends CrudRepository<ClassA, Integer> {
}
The TestClass instance is created as shown below:
@PersistJobDataAfterExecution
@DisallowConcurrentExecution
@Transactional
@Component
public class TestClassJOB extends AbstractJob {

    @Autowired
    TestClass indexer;
}
Context:
<!-- JPA mapping configuration -->
<bean id="persistenceXmlLocation" class="java.lang.String">
<constructor-arg value="classpath:/persistence.xml"></constructor-arg>
</bean>
<!-- entity manager -->
<bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean"
p:dataSource-ref="dataSource" p:persistenceUnitName="jpaData"
p:persistenceXmlLocation-ref="persistenceXmlLocation">
<property name="packagesToScan" value="com..persist.entity" />
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter" />
</property>
</bean>
<!-- transaction manager -->
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager"
p:entityManagerFactory-ref="entityManagerFactory" lazy-init="true" p:dataSource-ref="dataSource" />
<!-- JPA repositories -->
<jpa:repositories base-package="com..persist.repo"
entity-manager-factory-ref="entityManagerFactory" transaction-manager-ref="transactionManager" />
I have tried many resources but could not solve the issue. The error message displayed is "could not initialize proxy - no Session".
What could be the cause of the issue?
If the session is available while classARepo.findOne(1) is called, why is it not available during the lazy fetch (list.size())?

The issue was that the TestClassJOB instance was created by Quartz, so the transactional proxy was never applied to the class. That was the cause of the problem.
I fixed it by declaring a TransactionTemplate
@Autowired
TransactionTemplate transactionTemplate;
and then wrapping the code within
transactionTemplate.execute(new TransactionCallbackWithoutResult() {
    @Override
    protected void doInTransactionWithoutResult(TransactionStatus status) {
        // <code here>
    }
});
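Put together, a minimal sketch of what the fixed test() method might look like (the programmatic-transaction variant; field and method names follow the question, the rest is an assumption):
@Component
public class TestClass extends AbstractClass {

    @Autowired
    ClassARepo classARepo;

    @Autowired
    TransactionTemplate transactionTemplate;

    @Override
    public void test() {
        transactionTemplate.execute(new TransactionCallbackWithoutResult() {
            @Override
            protected void doInTransactionWithoutResult(TransactionStatus status) {
                // The whole unit of work now runs inside one Spring-managed
                // transaction, so the Hibernate session stays open while the
                // lazy collection is initialized.
                ClassA classA = classARepo.findOne(1);
                List<ClassB> list = classA.getClassBs();
                list.size();
            }
        });
    }
}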

Related

Spring Batch doesn't start a transaction when using a custom writer

I am working on a Spring Batch job which reads from a CSV file and writes into a database. I am using FlatFileItemReader for reading the file and have implemented an ItemWriter which uses JPA to insert data into the database. But the batch fails with "no transaction is in progress".
Here is my job configuration
<bean id="datasource" class="oracle.jdbc.pool.OracleDataSource">
<property name="user" value="xxx" />
<property name="password" value="xx" />
<property name="URL" value="xxx" />
</bean>
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
<bean id="entityManagerFactory" name="model"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="dataSource" ref="datasource"/>
<property name="jpaVendorAdapter">
<bean
class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="generateDdl" value="false"/>
<property name="showSql" value="true"/>
<property name="database">
<util:constant
static-field="org.springframework.orm.jpa.vendor.Database.ORACLE"/>
</property>
<property name="databasePlatform" value="org.hibernate.dialect.Oracle12cDialect"/>
</bean>
</property>
</bean>
<batch:job id="testJob">
<batch:step id="step">
<batch:tasklet>
<batch:chunk reader="cvsFileItemReader" writer="databaseWriter"
commit-interval="10">
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
And below is my writer
public class DatabaseWriter implements ItemWriter<Report> {

    private EntityManagerFactory entityManagerFactory;

    @Autowired
    public DatabaseWriter(EntityManagerFactory entityManagerFactory) {
        this.entityManagerFactory = entityManagerFactory;
    }

    @Override
    public void write(List<? extends Report> list) throws Exception {
        EntityManager entityManager = entityManagerFactory.createEntityManager();
        for (Report report : list) {
            entityManager.persist(report);
        }
        entityManager.flush();
    }
}
It only works if I start the transaction explicitly, like below:
public class DatabaseWriter implements ItemWriter<Report> {

    private EntityManagerFactory entityManagerFactory;

    @Autowired
    public DatabaseWriter(EntityManagerFactory entityManagerFactory) {
        this.entityManagerFactory = entityManagerFactory;
    }

    @Override
    public void write(List<? extends Report> list) throws Exception {
        EntityManager entityManager = entityManagerFactory.createEntityManager();
        entityManager.getTransaction().begin();
        for (Report report : list) {
            entityManager.persist(report);
        }
        entityManager.getTransaction().commit();
    }
}
Is it really necessary to manage the transaction explicitly, or does Spring Batch not control it out of the box?
EDIT
I fixed it by changing the writer like this:
public class DatabaseWriter implements ItemWriter<Report> {

    @PersistenceContext
    private EntityManager entityManager;

    @Override
    public void write(List<? extends Report> list) {
        for (Report report : list) {
            entityManager.persist(report);
        }
    }
}
Changing the EntityManagerFactory to an EntityManager injected with @PersistenceContext fixed the problem, but I am struggling to understand the reason for this behaviour.
Create a DAO, move your JPA code there, and autowire the DAO in your writer.
Also make sure you inject the entity manager like below:
@PersistenceContext
private EntityManager em;
But I am struggling to understand the reason for this behaviour
The first approach does not work because you create a transaction manually while your code is actually running within the scope of a transaction driven by Spring Batch. So there will be two transactions with different contexts.
You should keep in mind that a tasklet is executed in a transaction driven by Spring Batch (including the item writer for a chunk-oriented tasklet). So any code running in that scope should conform to the transaction definition of the tasklet. In your case, the EntityManager injected in the writer is driven by the JpaTransactionManager you defined, which is also used by Spring Batch for the tasklet's transaction.
As a side note, you can use the JpaItemWriter provided by Spring Batch instead of writing a custom writer. The code is almost identical.
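For illustration, a bean definition for it in the same XML style might look roughly like this (a sketch assuming a Spring Batch version whose JpaItemWriter exposes an entityManagerFactory property; the writer is then referenced from the chunk exactly like the custom databaseWriter):
<bean id="databaseWriter" class="org.springframework.batch.item.database.JpaItemWriter">
    <property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>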

Write operations are not allowed with Spring and Hibernate

When I try to save an object, I get a "write operations are not allowed" error.
Here is the code.
Configuration for the transaction manager:
<bean id="hibernateTemplate" class="org.springframework.orm.hibernate4.HibernateTemplate" >
<property name="sessionFactory" ref="sessionFactory"/>
</bean>
<bean id="transactionManager" class="org.springframework.orm.hibernate4.HibernateTransactionManager">
<property name="sessionFactory" ref="sessionFactory"/>
</bean>
<tx:annotation-driven />
Service Class
@Service
public class MemberDetailServiceImpl implements MemberDetailService {

    @Autowired
    private MemberDetailsDao memberDetailsDao;

    @Transactional(readOnly = false)
    public String saveExtraInfoMember(MemberActivity activity) {
        return memberDetailsDao.saveExtraInfoMember(activity);
    }
}
Dao
@Repository
public class MemberDetailsDaoImpl implements MemberDetailsDao {

    @Autowired
    HibernateTemplate hibernateTemplate;

    public String saveExtraInfoMember(MemberActivity activity) {
        String result = null;
        hibernateTemplate.saveOrUpdate(activity);
        return "";
    }
}

Spring 3 MVC: java.lang.IllegalArgumentException: Property 'dataSource' is required. How to set JdbcTemplate correctly?

I'm new to Spring development, and right now I'm facing a problem. Here are the code snippets to illustrate my problem clearly.
Here is my DAO class:
public class LoginDaoImpl {
private DataSource dataSource;
private JdbcTemplate jdbcTemplate;
public void setDataSource(DataSource dataSource) {
this.dataSource = dataSource;
}
public int checkLoginDetails(LoginVo loginVo){
String sql = "select count(*) from empsctygrp where username=? and password=?";
jdbcTemplate = new JdbcTemplate(dataSource);
int count = jdbcTemplate.queryForObject(sql,new Object[]{loginVo.getUserName(),loginVo.getPassword()},Integer.class);
return count;
}
}
Now here is my Business-Object(BO) class:
public class LoginBo {
LoginDaoImpl loginDaoImpl = new LoginDaoImpl();
public int checkLoginDetails(LoginVo loginVo){
return loginDaoImpl.checkLoginDetails(loginVo);
}
}
Now,here is my dispatcher-servlet xml code:
<bean id="dataSource"
class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value="oracle.jdbc.driver.OracleDriver"/>
<property name="url" value="jdbc:oracle:thin:#117.194.83.9:1521:XE"/>
<property name="username" value="system"/>
<property name="password" value="password1$"/>
</bean>
<bean id="loginDaoImpl" class="com.abhinabyte.dao.LoginDaoImpl">
<property name="dataSource" ref="dataSource" />
</bean>
Now whenever i'm trying to run this on server the following exception is given:
SEVERE: Servlet.service() for servlet [dispatcher] in context with path [/A] threw exception [Request processing failed; nested exception is java.lang.IllegalArgumentException: Property 'dataSource' is required] with root cause
java.lang.IllegalArgumentException: Property 'dataSource' is required
Please help me solve this problem.............:(
Try this in the LoginBo class:
@Autowired
LoginDaoImpl loginDaoImpl;
instead of
LoginDaoImpl loginDaoImpl = new LoginDaoImpl();
The problem is that you manually instantiate LoginDaoImpl.
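A minimal sketch of what that change might look like, assuming LoginBo itself is registered as a Spring bean so the injection can happen (the @Service annotation and field visibility here are assumptions):
@Service
public class LoginBo {

    @Autowired
    private LoginDaoImpl loginDaoImpl; // injected by Spring instead of created with new

    public int checkLoginDetails(LoginVo loginVo) {
        return loginDaoImpl.checkLoginDetails(loginVo);
    }
}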
I was having the same problem and could not find a comprehensive answer on the web, so I decided to post one here for anyone else, or for future me.
I'm still learning, so if you think I have made a mistake below, please feel free to edit.
Summary:
Include <integration:annotation-config/> <context:component-scan base-package="myproject"/> in your servlet to pick up annotations
Configure JUnit tests with @RunWith(SpringJUnit4ClassRunner.class) and @ContextConfiguration("file:WEB-INF/FinanceImportTool-servlet.xml") (see the sketch after this list)
Don't autowire dataSource or jdbcTemplate if these fields are already provided by a parent class, e.g. StoredProcedure
Don't use new() as this initializes classes outside the applicationContext
Beware of using properties in your constructor which have not yet been set - obvious but embarrassingly easy to do
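As a sketch of that JUnit configuration (the test class name, DAO type and assertion are illustrative only):
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("file:WEB-INF/FinanceImportTool-servlet.xml")
public class MyDAOImplTest {

    @Autowired
    private MyDAO myDAO; // wired from the application context, not created with new

    @Test
    public void daoIsWiredFromContext() {
        assertNotNull(myDAO);
    }
}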
My original class (now altered):
public class MyDAOImpl extends StoredProcedure implements MyDAO {
private static final String SPROC_NAME = "dbo.MySP";
public MyDAOImpl(DataSource dataSource) {
super(dataSource, SPROC_NAME);
// ...declared parameters...
compile();
}
}
MyProject-servlet.xml file (only relevant bits included):
<!-- Used by Spring to pick up annotations -->
<integration:annotation-config/>
<context:component-scan base-package="myproject"/>
<bean id="MyDAOBean" class="myproject.dao.MyDAOImpl" >
<constructor-arg name="dataSource" ref="myDataSource"/>
</bean>
<!-- properties stored in a separate file -->
<bean id="myDataSource" class="com.microsoft.sqlserver.jdbc.SQLServerDataSource">
<property name="databaseName" value="${myDataSource.dbname}" />
<property name="serverName" value="${myDataSource.svrname}" />
<!-- also loaded portNumber, user, password, selectMethod -->
</bean>
Error: property 'dataSource' is required, or NullPointerException (1)
Other answers say make sure you have passed dataSource as a <property> for your bean in the servlet, etc.
I think @Abhinabyte (the OP) needed to annotate his setDataSource() method with an annotation such as @Autowired, and use <integration:annotation-config/> <context:component-scan base-package="myproject"/> in his servlet to successfully pass in dataSource as a dependency to LoginDaoImpl.
In my case, I tried adding 'dataSource' as a property and autowiring it. The "dataSource is required" error message became a NullPointerException error.
I realised after far too long that MyDAOImpl extends StoredProcedure.
dataSource was already a property of StoredProcedure. By having a dataSource property for MyDAOImpl, the autowiring was not picking up and setting the dataSource property of StoredProcedure, which left dataSource for StoredProcedure as null.
This was not picked up when I tested the value of MyDAOImpl.dataSource, as of course by now I had added a MyDAOImpl.dataSource field that had been autowired successfully. However the compile() method inherited from StoredProcedure used StoredProcedure.dataSource.
Therefore I didn't need public DataSource dataSource; property in MyDAOImpl class. I just needed to use the StoredProcedure constructor with super(dataSource, sql); in the constructor for MyDAOImpl.
I also didn't need a MyDAOImpl.jdbcTemplate property. It was set automatically by using the StoredProcedure(dataSource, sql) constructor.
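For illustration, a sketch of how the inherited StoredProcedure machinery is typically used once the constructor has passed the DataSource and SQL to super() (the parameter name, type and result handling below are assumptions):
import java.sql.Types;
import java.util.HashMap;
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.jdbc.core.SqlParameter;
import org.springframework.jdbc.object.StoredProcedure;

public class MyDAOImpl extends StoredProcedure implements MyDAO {

    public MyDAOImpl(DataSource dataSource, String sql) {
        super(dataSource, sql); // also creates the inherited JdbcTemplate
        declareParameter(new SqlParameter("in_id", Types.INTEGER));
        compile();
    }

    public Map<String, Object> callFor(int id) {
        Map<String, Object> params = new HashMap<String, Object>();
        params.put("in_id", id);
        return execute(params); // runs the stored procedure via the inherited template
    }
}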
Error: NullPointerException (2)
I had been using this constructor:
private static final String SPROC_NAME = "dbo.MySP";
public MyDAOImpl(DataSource dataSource) {
super(dataSource, SPROC_NAME);
}
This caused a NullPointerException because SPROC_NAME had not been initialized before it was used in the constructor (yes I know, rookie error). To solve this, I passed in sql as a constructor-arg in the servlet.
Error: [same error message appeared when I had changed file name]
The applicationContext was referring to the bin/ instances of my beans and classes. I had to delete bin/ and rebuild the project.
My new class:
public class MyDAOImpl extends StoredProcedure implements MyDAO {

    @Autowired // Necessary to prevent the error 'no default constructor found'
    public MyDAOImpl(DataSource dataSource, String sql) {
        super(dataSource, sql);
        // ...declared parameters...
        compile();
    }
}
New MyProject-servlet.xml file (only relevant bits included):
<!-- Used by Spring to pick up annotations -->
<integration:annotation-config/>
<context:component-scan base-package="myproject"/>
<bean id="myDAOBean" class="org.gosh.financeimport.dao.MyDAOImpl" >
<constructor-arg name="dataSource" ref="reDataSource"/>
<constructor-arg name="sql" value="dbo.MySP" />
</bean>
<!-- properties stored in a separate file -->
<bean id="myDataSource" class="com.microsoft.sqlserver.jdbc.SQLServerDataSource">
<property name="databaseName" value="${myDataSource.dbname}" />
<property name="serverName" value="${myDataSource.svrname}" />
<!-- also loaded portNumber, user, password, selectMethod -->
</bean>
Helpful places:
If you can get past the rage, this answer on Spring forums might help too
This answer gives a broad introduction to Spring configuration
This answer has simple but useful suggestions
You should annotate the beans that are subject to IoC, for example:
@Component
public class LoginDAOImpl {
    @Inject
    DataSource dataSource;
    ......
}
You set up these beans in the Spring context, but you're not using them.
OBS:
When I use the JdbcTemplate, I configure the JDBC beans like this:
<bean id="dataSourcePerfil" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value="${br.com.dao.jdbc.driver}" />
<property name="url" value="${br.com.dao.jdbc.url}" />
<property name="username" value="${br.com.dao.jdbc.user}" />
<property name="password" value="${br.com.dao.jdbc.pass}" />
</bean>
<bean id="jdbcTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
<constructor-arg ref="dataSourcePerfil" />
</bean>
Then, after all that:
@Component
public class LoginDAOImpl {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Override
    public List<ClienteReport> getClientes() {
        return Collections.<ClienteReport>emptyList();
    }
}

Groovy and Spring under the same transaction

I'm struggling with legacy code. I'm creating unit tests, and I've decided to use Groovy to fill the database with the required legacy data. Normally my code uses iBatis for persistence. I'd like to roll back the test at the end. The problem is that when I create a row via Groovy and then use its id to create a row via iBatis, I get a constraint violation exception - parent key not found.
When I use Groovy to persist the parent and then create the child based on the parent's id, it works perfectly fine.
Also, I can't use @Transactional because of problems with the XML parser (legacy code FTW :/).
@ContextConfiguration(locations = [ "../dao/impl/ibatis/spring-data-context-config.xml", "classpath:/pl/com/betacom/treq/dao-context.xml"])
@RunWith(SpringJUnit4ClassRunner.class)
public class FinancingForIltCreationTest {

    @Autowired
    IFinancingForIltDAO financingForIltDAO;

    @Autowired
    Sql sql;

    @Autowired
    DataSourceTransactionManager transactionManager;

    private TransactionStatus transactionStatus;

    @Before
    public void setUp() {
        transactionStatus = transactionManager.getTransaction(new DefaultTransactionDefinition());
    }

    @After
    public void tearDown() {
        transactionManager.rollback(transactionStatus);
        transactionStatus = null;
    }

    @Test
    public void shallCreateFinancingForIlt() throws Exception {
        //given
        IltOffering offering = new IltOffering("GOING_DOWN_TO_UBERGROUND", offeringTempId, java.sql.Date.valueOf("2011-07-21"), java.sql.Date.valueOf("2012-07-21"));
        offering.insert(sql); // it's inserted by Groovy
        //when
        FinancingForIltDTO financingForIltDTO = createFinancingForIlt(offering.id).build(financingForIltDAO); // my assembler inserting via iBatis
        //then
        assertNotNull(financingForIltDTO.id);
    }
}
Configuration looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
<bean id="dataSourceIn"
class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName">
<value>####</value>
</property>
<property name="url">
<value>####</value>
</property>
<property name="username">
<value>####</value>
</property>
<property name="password">
<value>####</value>
</property>
</bean>
<bean id="dataSource"
class="org.springframework.jdbc.datasource.TransactionAwareDataSourceProxy">
<constructor-arg ref="dataSourceIn" />
</bean>
<bean id="transactionManager"
class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource">
<ref local="dataSource" />
</property>
</bean>
<bean id="sql" class="groovy.sql.Sql">
<constructor-arg ref="dataSource" />
</bean>
Unfortunately it was a database schema issue.

How to use @Autowired in a Quartz Job?

I am using Quartz with Spring, and I want to inject/use another class in the job class, but I don't know how to do it correctly.
The XML:
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-2.5.xsd">
<!-- Scheduler task -->
<bean name="schedulerTask" class="com.mkyong.quartz.SchedulerTask" />
<!-- Scheduler job -->
<bean name="schedulerJob"
class="org.springframework.scheduling.quartz.JobDetailBean">
<property name="jobClass" value="com.mkyong.quartz.SchedulerJob" />
<property name="jobDataAsMap">
<map>
<entry key="schedulerTask" value-ref="schedulerTask" />
</map>
</property>
</bean>
<!-- Cron Trigger -->
<bean id="cronTrigger"
class="org.springframework.scheduling.quartz.CronTriggerBean">
<property name="jobDetail" ref="schedulerJob" />
<property name="cronExpression" value="0/10 * * * * ?" />
</bean>
<!-- Scheduler -->
<bean class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
<property name="jobDetails">
<list>
<ref bean="schedulerJob" />
</list>
</property>
<property name="triggers">
<list>
<ref bean="cronTrigger" />
</list>
</property>
</bean>
</beans>
The Quartz job:
package com.mkyong.quartz;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.springframework.scheduling.quartz.QuartzJobBean;
public class SchedulerJob extends QuartzJobBean
{
private SchedulerTask schedulerTask;
public void setSchedulerTask(SchedulerTask schedulerTask) {
this.schedulerTask = schedulerTask;
}
protected void executeInternal(JobExecutionContext context)
throws JobExecutionException {
schedulerTask.printSchedulerMessage();
}
}
The task to be executed:
package com.mkyong.quartz;
public class SchedulerTask {
public void printSchedulerMessage() {
System.out.println("Struts 2 + Spring + Quartz ......");
}
}
I want to inject another DTO class that deals with the database into the task class, in order to do some database work in the task. How can I do that?
In your solution you are using the Spring @Autowired annotation in a class that is not instantiated by Spring. Your solution will still work if you remove the @Autowired annotation, because Quartz is setting the property, not Spring.
Quartz will try to set every key within the JobDataMap as a property. E.g. since you have a key "myDao" Quartz will look for a method called "setMyDao" and pass the key's value into that method.
If you want Spring to inject spring beans into your jobs, create a SpringBeanJobFactory and set this into your SchedulerFactoryBean with the jobFactory property within your spring context.
SpringBeanJobFactory javadoc:
Applies scheduler context, job data map and trigger data map entries
as bean property values
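A minimal sketch of that wiring, in the same XML style as the question (note that the plain SpringBeanJobFactory only applies scheduler-context and job-data-map entries as properties; the custom subclass shown in a later answer adds full @Autowired support):
<bean class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
    <property name="jobFactory">
        <bean class="org.springframework.scheduling.quartz.SpringBeanJobFactory" />
    </property>
    <property name="jobDetails">
        <list>
            <ref bean="schedulerJob" />
        </list>
    </property>
    <property name="triggers">
        <list>
            <ref bean="cronTrigger" />
        </list>
    </property>
</bean>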
Not sure if this is what you want, but you can pass some configuration values to the Quartz job. I believe in your case you could take advantage of the jobDataAsMap property you already set up, e.g.:
<property name="jobDataAsMap">
<map>
<entry key="schedulerTask" value-ref="schedulerTask" />
<entry key="param1" value="com.custom.package.ClassName"/>
</map>
</property>
Then you should be able to access it in your actual Java code in a manual way:
protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
schedulerTask.printSchedulerMessage();
System.out.println(context.getJobDetail().getJobDataMap().getString("param1"));
}
Or, using the magic Spring approach, have the param1 property defined with a getter/setter. You could try defining it with the java.lang.Class type and have the conversion done automatically (Spring would do it for you):
private Class<?> param1;
// getter & setter
protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
schedulerTask.printSchedulerMessage();
System.out.println("Class injected" + getParam1().getName());
}
I haven't tested it though.
ApplicationContext springContext =
WebApplicationContextUtils.getWebApplicationContext(
ContextLoaderListener.getCurrentWebApplicationContext().getServletContext()
);
Bean bean = (Bean) springContext.getBean("beanName");
bean.method();
As mentioned in "inject bean reference into a Quartz job in Spring?", you can use Spring's SpringBeanAutowiringSupport.processInjectionBasedOnCurrentContext(this);
@Named
public class SampleJob implements Job {

    @Inject
    private AService aService;

    @Override
    public void execute(JobExecutionContext context)
            throws JobExecutionException {
        // Do injection with Spring
        SpringBeanAutowiringSupport.processInjectionBasedOnCurrentContext(this);
        aService.doIt();
    }
}
As mentioned, it may not work on some Spring versions, but I have tested it on 4.2.1.RELEASE and it worked fine.
This is my solution:
public class MySpringBeanJobFactory extends
        org.springframework.scheduling.quartz.SpringBeanJobFactory implements
        ApplicationContextAware {

    private ApplicationContext ctx;

    @Override
    public void setApplicationContext(ApplicationContext applicationContext)
            throws BeansException {
        this.ctx = applicationContext;
    }

    @Override
    protected Object createJobInstance(TriggerFiredBundle bundle)
            throws Exception {
        Object jobInstance = super.createJobInstance(bundle);
        ctx.getAutowireCapableBeanFactory().autowireBean(jobInstance);
        return jobInstance;
    }
}
Then configure the MySpringBeanJobFactory class in the XML:
<bean class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
<property name="jobFactory">
<bean class="com.xxxx.MySpringBeanJobFactory" />
</property>
<property name="configLocation" value="classpath:quartz.properties" />
<property name="triggers">
<list>
<ref bean="cronTrigger"/>
</list>
</property>
</bean>
Good luck ! :)
