Database connections not being closed with jpaFlowExecutionListener - spring

I'm using Spring Web Flow to build an application. I am making use of the Flow Managed Persistence Context so the entity manager is 'kept open' during the execution of my flow and I can access lazy loaded properties (similar to OpenEntityManagerInViewFilter or OpenSessionInViewFilter for Spring MVC). When I use this, the number of active database connections increases every time I submit a form; if I don't use the FMPC, I have no problems with the number of open connections.
I'm working with the following setup.
TransactionManager:
@Bean
@Autowired
public JpaTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
    return new JpaTransactionManager(entityManagerFactory);
}
DataSource:
@Bean
public DataSource dataSource() {
    final BasicDataSource dataSource = new BasicDataSource();
    dataSource.setDriverClassName(environment.getRequiredProperty(PROPERTY_DATABASE_DRIVER));
    dataSource.setUrl(environment.getRequiredProperty(PROPERTY_DATABASE_URL));
    dataSource.setUsername(environment.getProperty(PROPERTY_DATABASE_USERNAME, ""));
    dataSource.setPassword(environment.getProperty(PROPERTY_DATABASE_PASSWORD, ""));
    return dataSource;
}
EntityManagerFactory:
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
    final LocalContainerEntityManagerFactoryBean factoryBean = new LocalContainerEntityManagerFactoryBean();
    factoryBean.setDataSource(dataSource());
    factoryBean.setPackagesToScan(environment.getRequiredProperty(PROPERTY_ENTITYMANAGER_PACKAGES_TO_SCAN));
    final JpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter() {
        {
            setDatabase(Database.valueOf(environment.getRequiredProperty(PROPERTY_DATABASE_TYPE)));
            setDatabasePlatform(environment.getRequiredProperty(PROPERTY_HIBERNATE_DIALECT));
        }
    };
    factoryBean.setJpaVendorAdapter(vendorAdapter);
    final Properties jpaProperties = new Properties();
    jpaProperties.put(PROPERTY_HIBERNATE_FORMAT_SQL, environment.getRequiredProperty(PROPERTY_HIBERNATE_FORMAT_SQL));
    jpaProperties.put(PROPERTY_HIBERNATE_NAMING_STRATEGY, environment.getRequiredProperty(PROPERTY_HIBERNATE_NAMING_STRATEGY));
    jpaProperties.put(PROPERTY_HIBERNATE_SHOW_SQL, environment.getRequiredProperty(PROPERTY_HIBERNATE_SHOW_SQL));
    jpaProperties.put(PROPERTY_HIBERNATE_HB2DDL_SQL, environment.getRequiredProperty(PROPERTY_HIBERNATE_HB2DDL_SQL));
    factoryBean.setJpaProperties(jpaProperties);
    return factoryBean;
}
JpaFlowExecutionListener:
@Bean
@Autowired
public JpaFlowExecutionListener jpaFlowExecutionListener(EntityManagerFactory entityManagerFactory, JpaTransactionManager transactionManager) {
    return new JpaFlowExecutionListener(entityManagerFactory, transactionManager);
}
The BasicDataSource has maxActive set to 8 by default, and when I reach 8 active connections the page just hangs. Why are the connections not being closed after the request is complete? I have used the Chrome debugging tools (the network pane) to make sure there are no AJAX requests running or anything; my page submit (an HTTP POST) triggers a 301 redirect, which then gives me a new HTTP GET that results in a status 200, so all good.
When going from one page to the next, a service layer is called, but as you can see from my beans I am using the JpaTransactionManager, and the SWF documentation says the following:
Note: All data access except for the final commit will, by default, be non-transactional. However, a flow may call into a transactional service layer to fetch objects during the conversation in the context of a read-only system transaction if the underlying JPA Transaction Manager supports this. Spring's JPA TransactionManager does support this when working with a Hibernate JPA provider, for example. In that case, Spring will handle setting the FlushMode to MANUAL to ensure any in-progress changes to managed persistent entities are not flushed, while reads of new objects occur transactionally.
For the sake of completeness, my spring-web-flow config:
<beans xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:webflow="http://www.springframework.org/schema/webflow-config"
xmlns="http://www.springframework.org/schema/beans"
xsi:schemaLocation="
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/webflow-config
http://www.springframework.org/schema/webflow-config/spring-webflow-config.xsd">
<!-- Flow executor, responsible for creating and executing flows -->
<webflow:flow-executor id="flowExecutor" flow-registry="flowRegistry">
<webflow:flow-execution-listeners>
<webflow:listener ref="jpaFlowExecutionListener"/>
</webflow:flow-execution-listeners>
</webflow:flow-executor>
<!-- Flow registry, responsible for loading all flows so executor can execute them -->
<webflow:flow-registry id="flowRegistry" base-path="/WEB-INF/webflow/flows" flow-builder-services="flowBuilderServices">
<webflow:flow-location-pattern value="/**/*-flow.xml"/>
</webflow:flow-registry>
<!-- Flow builder services -->
<webflow:flow-builder-services id="flowBuilderServices" view-factory-creator="mvcViewFactoryCreator"/>
<!-- MvcViewFactoryCreator -->
<bean id="mvcViewFactoryCreator" class="org.springframework.webflow.mvc.builder.MvcViewFactoryCreator">
<property name="viewResolvers">
<list>
<ref bean="viewResolver"/>
</list>
</property>
</bean>
<!-- Flow handler adapter, responsible for answering request for a flow -->
<bean class="org.springframework.webflow.mvc.servlet.FlowHandlerAdapter">
<property name="flowExecutor" ref="flowExecutor"/>
</bean>
<!-- Flow handler mapping, lets Spring MVC's DispatcherServlet know to send flow requests to SWF -->
<bean class="org.springframework.webflow.mvc.servlet.FlowHandlerMapping">
<property name="flowRegistry" ref="flowRegistry"/>
<property name="order" value="0"/>
<property name="interceptors">
<list>
<ref bean="localeChangeInterceptor" />
</list>
</property>
</bean>
</beans>
My flow has <persistence-context /> defined at the top.
I have the following end-state (which restarts the flow); even when I invoke it and the URL params change to e2s1, the number of active connections is not reset:
<end-state id="restart" commit="true" view="redirect:/main"/>

So it seems that the default value of the Hibernate property hibernate.connection.release_mode is on_close. Since the EntityManager is kept open during the whole flow, it is never closed, and a new connection is fetched from the pool for every request within the flow.
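For reference, a minimal sketch of how the property could be set in the entityManagerFactory bean shown above (the key is the standard Hibernate property name):
// In entityManagerFactory(), alongside the other jpaProperties entries:
jpaProperties.put("hibernate.connection.release_mode", "after_transaction");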
Changing the property to after_transaction solves this issue. However, in the case of fetching lazily loaded collections it still doesn't work: each lazy property fetches a new connection from the pool. To solve this, I extended the JpaFlowExecutionListener as follows:
public class AvoidLeakJpaFlowExecutionListener extends JpaFlowExecutionListener {

    public AvoidLeakJpaFlowExecutionListener(EntityManagerFactory entityManagerFactory, PlatformTransactionManager transactionManager) {
        super(entityManagerFactory, transactionManager);
    }

    @Override
    public void paused(RequestContext context) {
        super.paused(context);
        EntityManager entityManager = (EntityManager) context.getFlowScope().get(PERSISTENCE_CONTEXT_ATTRIBUTE);
        if (entityManager instanceof HibernateEntityManager) {
            HibernateEntityManager hibernateEntityManager = (HibernateEntityManager) entityManager;
            hibernateEntityManager.getSession().disconnect();
        }
    }
}
This approach solves the lazily loaded collections problem, but it will still leak connections when lazy-initialized entities are loaded through WebFlow's persistence context during a transition to a subflow that does not have <persistence-context/> configured, as described in this bug report (where I also found this solution).

Related

How to use Spring AbstractRoutingDataSource with dynamic datasources?

I am working on a project using Spring, Spring Data JPA, Spring Security, Primefaces...
I was following this tutorial about dynamic datasource routing with Spring.
In this tutorial, you can only achieve dynamic datasource switching between pre-defined datasources.
Here is a snippet of my code:
springContext-jpa.xml
<bean id="dsCgWeb1" class="org.apache.commons.dbcp.BasicDataSource">
<property name="driverClassName" value="${jdbc.driverClassName.Cargest_web}"></property>
<property name="url" value="${jdbc.url.Cargest_web}"></property>
<property name="username" value="${jdbc.username.Cargest_web}"></property>
<property name="password" value="${jdbc.password.Cargest_web}"></property>
</bean>
<bean id="dsCgWeb2" class="org.apache.commons.dbcp.BasicDataSource">
// same properties, different values ..
</bean>
<!-- Generic Datasource [Default : dsCargestWeb1] -->
<bean id="dsCgWeb" class="com.cargest.custom.CargestRoutingDataSource">
<property name="targetDataSources">
<map>
<entry key="1" value-ref="dsCgWeb1" />
<entry key="2" value-ref="dsCgWeb2" />
</map>
</property>
<property name="defaultTargetDataSource" ref="dsCgWeb1" />
</bean>
What I want to do is make the targetDataSources map dynamic, as well as its elements.
In other words, I want to fetch a certain database table, use the properties stored in that table to create my datasources, and then put them in a map like targetDataSources.
Is there a way to do this?
Nothing in AbstractRoutingDataSource forces you to use a static map of DataSources. It is up to you to construct a bean implementing Map<Object, Object>, where the key is what you use to select the DataSource and the value is a DataSource or (by default) a String referencing a JNDI-defined data source. You can even modify it dynamically since, as the map is stored in memory, AbstractRoutingDataSource does no caching.
I have no full example code, but here is what I can imagine. In a web application, you have one database per client, all with the same structure (a strange design, but let's say it's just for the example). At login time, the application creates the datasource for the client and stores it in a map indexed by session id. The map is a bean in the root context named dataSources:
@Autowired
@Qualifier("dataSources")
Map<String, DataSource> sources;

// I assume url, user and password have been found from the connected user.
// I use DriverManagerDataSource for the example because it is simple to set up.
DataSource dataSource = new DriverManagerDataSource(url, user, password);
sources.put(request.getSession().getId(), dataSource);
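As a side note, here is a minimal sketch of how the dataSources bean itself might be declared (assuming Java configuration; a ConcurrentHashMap keeps concurrent session creation and destruction safe):
@Bean(name = "dataSources")
public Map<String, DataSource> dataSources() {
    // Shared, thread-safe map of per-session DataSources, consumed by the routing datasource below.
    return new ConcurrentHashMap<String, DataSource>();
}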
You also need a session listener to clean up dataSources in its destroy method:
@Autowired
@Qualifier("dataSources")
Map<String, DataSource> sources;

public void sessionDestroyed(HttpSessionEvent se) {
    // Eventually clean up the DataSource if appropriate (nothing to do for DriverManagerDataSource).
    sources.remove(se.getSession().getId());
}
The routing datasource could look like this:
public class SessionRoutingDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        HttpServletRequest request =
                ((ServletRequestAttributes) RequestContextHolder.getRequestAttributes()).getRequest();
        return request.getSession().getId();
    }

    @Autowired
    @Qualifier("dataSources")
    @SuppressWarnings("unchecked")
    public void setDataSources(Map<String, DataSource> dataSources) {
        // The cast keeps the shared map instance so entries added or removed later remain visible.
        setTargetDataSources((Map<Object, Object>) (Map<?, ?>) dataSources);
    }
}
I have not tested any of this because it would be a lot of work to set up the different databases, but I think it should be OK. In the real world there would not be a different data source per session but one per user, with a count of sessions per user; as I said, this is an oversimplified example.
The datasource used by a given thread might change from time to time.
Pay attention to concurrency: applications might run into concurrency issues in a concurrent environment.
Thread-bound AbstractRoutingDataSource sample: this can be achieved with AbstractRoutingDataSource by keeping the lookup key in a thread-local variable. Here is a working example you can refer to:
Multi-tenancy: Managing multiple datasources with Spring Data JPA
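A minimal sketch of the thread-local approach (the TenantContext holder and its names are hypothetical, not taken from the linked example):
public class TenantContext {

    private static final ThreadLocal<String> CURRENT = new ThreadLocal<String>();

    public static void setTenant(String tenantId) { CURRENT.set(tenantId); }
    public static String getTenant() { return CURRENT.get(); }
    public static void clear() { CURRENT.remove(); }
}

public class ThreadLocalRoutingDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        // Key set for the current thread, e.g. by a servlet filter or interceptor;
        // null falls back to the defaultTargetDataSource.
        return TenantContext.getTenant();
    }
}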

Is it possible to use SpringData-JPA with a hibernate4.LocalSessionFactoryBean?

I am already using Hibernate 4 directly with a LocalSessionFactoryBean and a SessionFactory in my code.
I would now like to include Spring-Data-JPA in my code.
But Spring Data needs an EntityManagerFactory to work, which can be configured through a LocalContainerEntityManagerFactoryBean. Can the beans LocalSessionFactoryBean and LocalContainerEntityManagerFactoryBean coexist in one Spring project?
(Or can one be adapted by the other?)
What is the best practice?
Although they can coexist, it will be problematic, especially if you want them to participate in the same transaction. However, if you switch your logic around and configure a LocalContainerEntityManagerFactoryBean instead of a LocalSessionFactoryBean, you can use the HibernateJpaSessionFactoryBean to get access to the underlying SessionFactory.
<bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<!-- Your properties here -->
</bean>
<bean id="sessionFactory" class="org.springframework.orm.jpa.vendor.HibernateJpaSessionFactoryBean">
<property name="entityManagerFactory" ref="entityManagerFactory" />
</bean>
Now you have both and can participate in the same transaction.
This solution is also documented in the Spring Data JPA reference guide in the FAQ section.
You can also get hold of the Hibernate Session from an injected EntityManager via getDelegate(), for example:
@Autowired
private EntityManager entityManager;

public List<SpBooking> list() {
    @SuppressWarnings("unchecked")
    List<SpBooking> listUser = (List<SpBooking>) ((Session) entityManager.getDelegate())
            .createCriteria(SpBooking.class)
            .list();
    for (SpBooking i : listUser)
        System.out.println("------------------" + i.getBookingId());
    return listUser;
}
And as of JPA 2.1, EntityManagerFactory.unwrap(java.lang.Class) provides a nice approach, documented here: https://docs.spring.io/spring/docs/current/javadoc-api/org/springframework/orm/jpa/vendor/HibernateJpaSessionFactoryBean.html
@Bean
public SessionFactory sessionFactory(@Qualifier("entityManagerFactory") EntityManagerFactory emf) {
    return emf.unwrap(SessionFactory.class);
}

Spring configuration for multiple Activemq remote brokers

How do I configure multiple remote ActiveMQ brokers (with different IP addresses) in a Spring context? Below is the configuration for one remote broker. I am using Camel to create routes that produce and consume messages to and from different queues on multiple remote brokers. Based on the following routes, how does the system know which remote broker each queue belongs to?
from("direct:start").to("activemq:queue:outgoingRequests")
from("activemq:queue:incomingOrders").to("log:Events?showAll=true").to("bean:jmsService")
Spring context for one broker:
org.camel.routes
<bean id="jmsConnectionFactory" class="org.apache.activemq.ActiveMQConnectionFactory">
<property name="brokerURL" value="tcp://10.1.11.97:61616" />
</bean>
<bean id="pooledConnectionFactory"
class="org.apache.activemq.pool.PooledConnectionFactory" init-
method="start" destroy-method="stop">
<property name="maxConnections" value="8" />
<property name="connectionFactory" ref="jmsConnectionFactory" />
</bean>
<bean id="jmsConfig" class="org.apache.camel.component.jms.JmsConfiguration">
<property name="connectionFactory" ref="pooledConnectionFactory"/>
<property name="concurrentConsumers" value="10"/>
</bean>
<bean id="activemq" class="org.apache.activemq.camel.component.ActiveMQComponent">
<property name="configuration" ref="jmsConfig"/>
</bean>
Just add more components with different names:
<bean id="activemq" class="org.apache.activemq.camel.component.ActiveMQComponent">
<property name="configuration" ref="jmsConfig"/>
</bean>
<bean id="activemq2" class="org.apache.activemq.camel.component.ActiveMQComponent">
<property name="configuration" ref="myOtherJmsConfig"/>
</bean>
Then simply use the names:
<from uri="activemq:queue:MY.QUEUE"/><!-- get from "1st" ActiveMQ -->
<to uri="activemq2:queue:MY.QUEUE"/> <!-- put to same queue name on other ActiveMQ -->
Actually, you can call them whatever you want, like "EuropeanMarketBroker" or whatever fits.
I have been trying to achieve this, with the difference that my Spring configuration is not in XML. It is helpful to know that you can achieve the same outcome using Spring annotations in a few ways.
The key to achieving this is registering the component with the desired name.
For example:
camelContext.addComponent("activemq2", jmsComponentInstance);
There are two ways of achieving this: creating two beans with qualifiers that distinguish them from each other and then wiring those beans and registering them as components, or alternatively (and preferably) creating the bean and registering the component all at once. Below are examples of both:
1 - Create Bean and Register elsewhere
@Configuration
public class ClassA {

    @Bean
    @Qualifier("activemq2")
    public JmsComponent createJmsComponent() {
        return JmsComponent.jmsComponentAutoAcknowledge(..); // initialise component from externalised configs
    }
}

@Component
public class ClassB {

    @Autowired
    private CamelContext camelContext;

    @Autowired
    @Qualifier("activemq2")
    private JmsComponent jmsComponent;

    public void someMethod() {
        camelContext.addComponent("activemq2", jmsComponent);
    }
}
2 - Create Bean and Register in one place within your @Configuration bean.
@Bean
@Autowired
public JmsComponent createJmsComponent(CamelContext camelContext) {
    JmsComponent component = JmsComponent.jmsComponentAutoAcknowledge(..); // initialise component from externalised configs
    camelContext.addComponent("activemq2", component); // add component to the Camel context
    return component; // return component instance
}
In addition to the two answers, here is my working solution with the latest Spring Boot, using dedicated properties for each broker:
First, I define a ConnectionFactory bean for each broker:
// gatewayRouterProperties is a Java record mapped to the application.yml property file.
// One ConnectionFactory for the on-premise broker:
@Bean
public ConnectionFactory jmsConnectionFactoryOnPrem() {
    ActiveMQConnectionFactory activeMQConnectionFactory = new ActiveMQConnectionFactory();
    activeMQConnectionFactory.setBrokerURL(gatewayRouterProperties.activeMq().brokerOnPrem().url());
    activeMQConnectionFactory.setUserName(gatewayRouterProperties.activeMq().brokerOnPrem().user());
    activeMQConnectionFactory.setPassword(gatewayRouterProperties.activeMq().brokerOnPrem().pass());
    return activeMQConnectionFactory;
}

// Another ConnectionFactory for the cloud AWS broker:
@Bean
public ConnectionFactory jmsConnectionFactoryAws() {
    ActiveMQConnectionFactory activeMQConnectionFactory = new ActiveMQConnectionFactory();
    activeMQConnectionFactory.setBrokerURL(gatewayRouterProperties.activeMq().brokerAws().url());
    activeMQConnectionFactory.setUserName(gatewayRouterProperties.activeMq().brokerAws().user());
    activeMQConnectionFactory.setPassword(gatewayRouterProperties.activeMq().brokerAws().pass());
    return activeMQConnectionFactory;
}
And then I just define the two ActiveMQComponent beans (the same as Peter's answer, but using annotations):
#Bean(name = "activemq")
public ActiveMQComponent createActiveMQComponentOnPrem() {
ActiveMQConfiguration amqConfig = new ActiveMQConfiguration();
amqConfig.setConnectionFactory(jmsConnectionFactoryOnPrem());
return new ActiveMQComponent(amqConfig);
}
#Bean(name = "activemq2")
public ActiveMQComponent createActiveMQComponentAws() {
ActiveMQConfiguration amqConfig = new ActiveMQConfiguration();
amqConfig.setConnectionFactory(jmsConnectionFactoryAws());
return new ActiveMQComponent(amqConfig);
}
Note that I am using the bean name attribute, so there is no need to add the component to the CamelContext manually.
Then, in my Camel route, I just use my ActiveMQ component beans like this:
// 'activemq' component => AMQ on-prem
// 'activemq2' component => AMQ AWS
from("activemq:queue:QUEUE.TO.SYNC.TO.AWS")
.routeId("gw-router-route-on-prem-to-aws")
.autoStartup("{{autostart-enabled}}")
.to("activemq2:queue:QUEUE.FROM.ON.PREM")
;

Injection of autowired dependencies failed while using @Transactional

I am testing my DAO, but it doesn't work. The following error occurs:
Tests in error:
testAccountOperations(com.tsekhan.rssreader.dao.HibernateControllerTest): Error creating bean with name 'com.tsekhan.rssreader.dao.HibernateControllerTest': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: com.tsekhan.rssreader.dao.HibernateController com.tsekhan.rssreader.dao.HibernateControllerTest.hibernateController; nested exception is java.lang.IllegalArgumentException: Can not set com.tsekhan.rssreader.dao.HibernateController field com.tsekhan.rssreader.dao.HibernateControllerTest.hibernateController to $Proxy25
My DAO:
@Service
@Scope("singleton")
public class HibernateController extends HibernateDaoSupport {

    @Autowired
    public SessionFactory sessionFactory;

    @Transactional
    public void addAcount(Account account) {
        sessionFactory.getCurrentSession().saveOrUpdate(account);
    }
}
My test for this DAO:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:/applicationContext.xml")
public class HibernateControllerTest {

    @Autowired
    HibernateController hibernateController;

    private Set<Channel> getTestChannelList(String channelLink) {
        Channel testChannel = new Channel();
        testChannel.setSourceLink(channelLink);
        Set<Channel> testChannelList = new HashSet<Channel>();
        testChannelList.add(testChannel);
        return testChannelList;
    }

    private Account getTestAccount(String accountLogin, String channelLink) {
        Account testAccount = new Account();
        testAccount.setAccountLogin(accountLogin);
        testAccount.setChannelList(getTestChannelList(channelLink));
        return testAccount;
    }

    @Test
    public void testAccountOperations() {
        hibernateController.addAcount(getTestAccount("test_login", "test_link"));
    }
}
My applicationContext.xml:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:tx="http://www.springframework.org/schema/tx"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.1.xsd
http://www.springframework.org/schema/tx
http://www.springframework.org/schema/tx/spring-tx-3.1.xsd
http://www.springframework.org/schema/context
http://www.springframework.org/schema/context/spring-context-3.1.xsd"
default-autowire="byName">
<!-- Enabling spring-transaction annotations -->
<tx:annotation-driven transaction-manager="transactionManager"/>
<!-- Enabling annotation-driven configurating -->
<context:annotation-config />
<!-- Creation of transaction manager -->
<bean id="transactionManager" scope="singleton"
class="org.springframework.orm.hibernate3.HibernateTransactionManager">
<property name="sessionFactory" ref="sessionFactory"/>
</bean>
<bean id="sessionFactory" scope="singleton"
class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean">
<property name="configLocation" value="classpath:/hibernate.cfg.xml"/>
<property name="configurationClass">
<value>org.hibernate.cfg.AnnotationConfiguration</value>
</property>
</bean>
<!--
A Spring interceptor that takes care of Hibernate session lifecycle.
-->
<bean id="hibernateInterceptor"
class="org.springframework.orm.hibernate3.HibernateInterceptor">
<property name="sessionFactory">
<ref bean="sessionFactory"/>
</property>
</bean>
<bean name="employeeDAO" scope="prototype"
class="com.tsekhan.rssreader.dao.HibernateController" />
<!-- Searching for hibernate POJO files in package com.tsekhan.rssreader.web -->
<context:component-scan base-package="com.tsekhan.rssreader.web" />
<bean class="org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping" />
<bean class="org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter" />
</beans>
Note that if I comment out @Transactional in the DAO, the bean is created correctly. What is happening?
First of all, it is really bad to give a name ending in Controller to a DAO; it is very confusing, since a Controller and a DAO have altogether different purposes.
When you add @Transactional to a service or DAO class, Spring needs to create a proxy of that class to make it work in a transaction. The proxy is a kind of wrapper: before a method of the proxied class executes, Spring starts the transaction, and after the execution, if no exception was thrown, it completes the transaction. This is done in Spring via AOP and annotations. To describe it in code:
public class OriginalDaoImpl extends DaoSupport implements OriginalDao {
    public void save(Object o) {
        manager.save(o);
    }
}

public class ProxyDaoImpl implements OriginalDao {
    private OriginalDao originalDaoImpl; // instance of OriginalDaoImpl

    public void save(Object o) {
        try {
            transaction.start();
            originalDaoImpl.save(o);
            transaction.commit();
        } catch (Exception e) {
            transaction.rollback();
        } finally {
            // clean-up code
        }
    }
}
As you can see, this is not an exact implementation but foundation code showing how the transaction magically works for you. The key point is the interface OriginalDao, which makes the injection easy, as OriginalDaoImpl and ProxyDaoImpl both implement the same interface; hence they can be swapped, i.e. the proxy takes the place of the original. This dynamic proxy can be created in Java with a JDK dynamic proxy. Now the question is: what if your class does not implement an interface? Then it gets harder for the replacement to happen.
As far as I know, a library such as CGLIB helps in such a scenario: it generates a dynamic subclass of the class in question and, in the overridden method, performs the magic described above, calling super.save(o) to delegate to the original code.
Now to the problem of injection. You have two options:
1. Create an interface and make your DAO implement it; Spring will then default to a JDK proxy, as it does now (see the sketch below).
2. Add the proxy-target-class="true" attribute to <tx:annotation-driven transaction-manager="transactionManager"/> so that CGLIB class proxies are used instead.
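A minimal sketch of option 1 (the AccountDao and HibernateAccountDao names are hypothetical, not from the original question):
public interface AccountDao {
    void addAcount(Account account);
}

@Service
public class HibernateAccountDao implements AccountDao {

    @Autowired
    private SessionFactory sessionFactory;

    @Transactional
    public void addAcount(Account account) {
        sessionFactory.getCurrentSession().saveOrUpdate(account);
    }
}

// In the test, autowire by the interface so the JDK proxy can be injected:
@Autowired
AccountDao accountDao;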
As far as the exception is concerned, it is thrown because the injected bean is expected to be of type HibernateController, but the JDK proxy is not.
For your reference, see the links below.
10.5.6 Using #Transactional
Spring AOP Doc
Hope this helps!
If you are using Spring MVC, make sure to scan only the controller classes in the servlet context file. Otherwise the beans are scanned twice, and the transaction support from the application context is not applied.

Ability to switch Persistence Unit dynamically within the application (JPA)

My application's data access layer is built using Spring and EclipseLink, and I am currently trying to implement the following feature: the ability to switch the current/active persistence unit dynamically for a user. I tried various options and finally ended up doing the following.
In persistence.xml, declare multiple PUs. Create a class with as many EntityManagerFactory attributes as there are PUs defined. This will act as a factory and return the appropriate EntityManager based on my logic:
public class MyEntityManagerFactory {

    @PersistenceUnit(unitName = "PU_1")
    private EntityManagerFactory emf1;

    @PersistenceUnit(unitName = "PU_2")
    private EntityManagerFactory emf2;

    public EntityManager getEntityManager(int releaseId) {
        // Logic goes here to return the appropriate EntityManager
    }
}
My spring-beans XML looks like this:
<!-- First persistence unit -->
<bean class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean" id="emFactory1">
<property name="persistenceUnitName" value="PU_1" />
</bean>
<bean class="org.springframework.orm.jpa.JpaTransactionManager" id="transactionManager1">
<property name="entityManagerFactory" ref="emFactory1"/>
</bean>
<tx:annotation-driven transaction-manager="transactionManager1"/>
The above section is repeated for the second PU (with names like emFactory2, transactionManager2 etc).
I am a JPA newbie and I know that this is not the best solution. I appreciate any assistance in implementing this requirement in a better/elegant way!
Thanks!
First of all, thanks to user332768 and bert. I tried using AbstractRoutingDataSource as mentioned in the link provided by bert, but got lost trying to hook up my JPA layer (EclipseLink). I reverted to my older approach with some modifications. The solution looks cleaner (IMHO) and is working fine (switching databases at runtime and also writing to multiple databases in the same transaction).
public class MyEntityManagerFactoryImpl implements MyEntityManagerFactory, ApplicationContextAware {

    private HashMap<String, EntityManagerFactory> emFactoryMap;

    public EntityManager getEntityManager(String releaseId) {
        return SharedEntityManagerCreator.createSharedEntityManager(emFactoryMap.get(releaseId));
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        Map<String, LocalContainerEntityManagerFactoryBean> emMap =
                applicationContext.getBeansOfType(LocalContainerEntityManagerFactoryBean.class);
        Set<String> keys = emMap.keySet();
        EntityManagerFactory entityManagerFactory = null;
        String releaseId = null;
        emFactoryMap = new HashMap<String, EntityManagerFactory>();
        for (String key : keys) {
            releaseId = key.split("_")[1];
            entityManagerFactory = emMap.get(key).getObject();
            emFactoryMap.put(releaseId, entityManagerFactory);
        }
    }
}
I now inject my DAOs with an instance (singleton) of MyEntityManagerFactoryImpl. The DAO then simply calls getEntityManager with the required release and gets the correct EntityManager for that database. (Note that I am now using application-managed EntityManagers, and hence I have to close them explicitly in my DAO; see the sketch below.)
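A minimal sketch of how such a DAO might use the factory (the DAO and entity names are hypothetical; only getEntityManager comes from the code above):
public class MyEntityDao {

    @Autowired
    private MyEntityManagerFactory myEntityManagerFactory;

    public MyEntity find(String releaseId, Long id) {
        EntityManager em = myEntityManagerFactory.getEntityManager(releaseId);
        try {
            return em.find(MyEntity.class, id);
        } finally {
            // Close the EntityManager explicitly, as described above.
            em.close();
        }
    }
}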
I also moved to a JTA transaction manager (to manage transactions across multiple databases).
This is how my Spring XML looks now:
...
<bean class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean" id="em_Rel1">
<property name="persistenceUnitName" value="PU1" />
</bean>
<bean class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean" id="em_Rel2">
<property name="persistenceUnitName" value="PU2" />
</bean>
<bean class="org.springframework.transaction.jta.JtaTransactionManager" id="jtaTransactionManager">
</bean>
<tx:annotation-driven transaction-manager="jtaTransactionManager"/>
....
Cheers! (comments are welcome)
I am not sure if this is a clean method. Instead of declaring the EntityManagerFactory multiple times, we can use the Spring application context to get the EntityManagerFactory beans declared in the Spring application.xml:
hm = applicationContext.getBeansOfType(org.springframework.orm.jpa.LocalEntityManagerFactoryBean.class);
EntityManagerFactory emf = ((org.springframework.orm.jpa.LocalEntityManagerFactoryBean) hm.get("&emf1")).getNativeEntityManagerFactory();
EntityManagerFactory emf2 = ((org.springframework.orm.jpa.LocalEntityManagerFactoryBean) hm.get("&emf2")).getNativeEntityManagerFactory();
This is something I need to do in the future too; for this I have bookmarked Spring Dynamic DataSource Routing:
http://blog.springsource.com/2007/01/23/dynamic-datasource-routing/
As far as I understand, this uses one PU, which gets assigned different DataSources. Perhaps it is helpful.
