I need to implement JTA transaction management using Spring JDBC in Liferay. I have two databases connected in Liferay via JNDI. In my project I perform JDBC CRUD operations through a Spring JDBC DAO, and for Liferay I go through Liferay's built-in service API. On any exception, the rollback works fine for the other database, but it does not work for Liferay.
Below is my code sample:
In portal-ext.properties I have defined:
transaction.management.impl= <JTATransactionmanager>
In my project's context.xml for Tomcat I have defined the user transaction as described on Liferay's site:
context.xml:

<Resource name="UserTransaction" auth="Container" type="javax.transaction.UserTransaction" />
<Transaction factory="org.objectweb.jotm.UserTransactionFactory" jotm.timeout="600" />
<Resource auth="Container" type="javax.sql.DataSource" factory="org.objectweb.jotm.datasource.DataSourceFactory" driverClassName="<postgresqldriver>" name="jdbc/LiferayPool" username="root" password="" url="jdbc:mysql://localhost/lportal?useUnicode=true&amp;characterEncoding=UTF-8&amp;useFastDateParsing=false" />
<Resource auth="Container" type="javax.sql.DataSource" factory="org.objectweb.jotm.datasource.DataSourceFactory" driverClassName="<postgresqldriver>" name="jdbc/test" username="root" password="" url="jdbc:mysql://localhost/test?useUnicode=true&amp;characterEncoding=UTF-8&amp;useFastDateParsing=false" />
In the bean.xml of my project:
<tx:annotation-driven transaction-manager="txManager"/>
<beans:bean id="dbDataSource" class="org.springframework.jndi.JndiObjectFactoryBean">
<beans:property name="jndiName" value="java:comp/env/jdbc/test"/>
</beans:bean>
<bean id="txManager" class="org.springframework.transaction.jta.JtaTransactionManager" />
In my class:

@Transactional(rollbackFor = MyException.class)
public void test()
First of all, take a look at this discussion.
What you shouldn't forget is that Liferay rolls back on both PortalException and SystemException... but it rolls back only the beans injected inside the current transaction context.
This means that you shouldn't use XLocalServiceUtil, but the injected xLocalService bean.
To get it, you either declare the reference through service.xml or, inside your serviceImpl class, add and refer to:
@BeanReference(type = XArticleLocalService.class)
protected XArticleLocalService xArticleLocalService;
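For example, a rough sketch of how the injected reference might be used inside a *LocalServiceImpl, assuming the usual Service Builder-generated classes for the placeholder XArticle entity (the entity methods and base class are illustrative; the imports assume Liferay 6.x packages):

import com.liferay.portal.kernel.bean.BeanReference;
import com.liferay.portal.kernel.exception.PortalException;
import com.liferay.portal.kernel.exception.SystemException;

public class XArticleLocalServiceImpl extends XArticleLocalServiceBaseImpl {

    @BeanReference(type = XArticleLocalService.class)
    protected XArticleLocalService xArticleLocalService;

    public void removeArticleTransactionally(long articleId)
            throws PortalException, SystemException {

        // Goes through the injected, transaction-aware bean, so the call is
        // rolled back together with the surrounding transaction on failure.
        xArticleLocalService.deleteXArticle(articleId);

        // Calling XArticleLocalServiceUtil.deleteXArticle(articleId) here instead
        // would bypass the current transaction context.
    }
}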
I hope this can help you.
I believe you are using XXXServiceUtil.java. Whenever you use a ServiceUtil, any method you call executes under a different transaction manager (regardless of the current thread's transaction state), because it lives in a completely different class loader. Remember that in Liferay every portlet/plugin has its own class loader.
I am very new to Spring Integration and would like to use a queue channel which is backed by a JdbcChannelMessageStore.
As our project uses Spring Boot with Spring Data JPA, I would like to have an integration-context.xml configuration where the existing database connection is reused. However, I am struggling to make it work.
Unfortunately I cannot find any example projects where JdbcChannelMessageStore is used. Could anyone provide some good example implementations for this?
Thanks a lot in advance.
P.S.: Here is my last integration-context.xml version:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:int="http://www.springframework.org/schema/integration"
xsi:schemaLocation="http://www.springframework.org/schema/integration https://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/beans https://www.springframework.org/schema/beans/spring-beans.xsd">
<int:channel id="outgoingChannel">
<int:queue message-store="outgoingMessageChannelStore"/>
</int:channel>
<bean id="dp"
class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="url" value="${spring.datasource.password}" />
<property name="username" value="${spring.datasource.username}" />
<property name="password" value="${spring.datasource.password}" />
</bean>
<bean id="outgoingMessageChannelStore" class="org.springframework.integration.jdbc.store.JdbcChannelMessageStore">
<property name="dataSource" ref="dp"/>
<property name="channelMessageStoreQueryProvider" ref="jdbcChannelMessageStoreQueryProvider"/>
<property name="region" value="TX_TIMEOUT"/>
<property name="usingIdCache" value="true"/>
</bean>
<bean id="jdbcChannelMessageStoreQueryProvider" class="org.springframework.integration.jdbc.store.channel.OracleChannelMessageStoreQueryProvider" />
<int:transaction-synchronization-factory id="jdbcChannelMessageStoreFactory">
<int:after-commit expression="#jdbcChannelMessageStore.removeFromIdCache(headers.id.toString())" />
<int:after-rollback expression="#jdbcChannelMessageStore.removeFromIdCache(headers.id.toString())" />
</int:transaction-synchronization-factory>
</beans>
With this I am getting the following Exception at startup:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactory' defined in class path resource [org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaConfiguration.class]: Invocation of init method failed; nested exception is org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment]
...
Caused by: org.hibernate.HibernateException: Access to DialectResolutionInfo cannot be null when 'hibernate.dialect' not set
Well, since you say that you also use Spring Boot, you are probably missing the fact that it auto-configures a DataSource bean for us. Having that dp bean in your config makes Boot back off from that auto-configuration, and your bean is then used everywhere a DataSource is needed, like that Hibernate auto-configuration.
What you really need is exactly the opposite: reuse the auto-configured DataSource for this Spring Integration configuration, assuming, of course, that your JdbcChannelMessageStore is going to rely on the same database as the mentioned Spring Data JPA.
So, what you need is simply to remove that dp bean definition and use the dataSource bean name for the <property name="dataSource"> in the outgoingMessageChannelStore bean definition.
Some remarks:
We don't need usingIdCache with Oracle, and therefore there is no need for that jdbcChannelMessageStoreFactory to deal with the cache. This is even covered in the JavaDocs:
* <p>If using the provided
* {@link org.springframework.integration.jdbc.store.channel.OracleChannelMessageStoreQueryProvider},
* don't set {@link #usingIdCache}
* to true, as the Oracle query will ignore locked rows.</p>
Try to configure Spring Integration with Java and annotation configuration (or even the Java DSL). That way you won't be tied to a bean name (like that dataSource): you can simply declare a bean method argument of the plain DataSource type and the Spring container will inject the auto-configured bean for you.
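A minimal Java-config sketch of the same setup, assuming Spring Boot's auto-configured DataSource and reusing the bean and channel names from the XML above (the region value and query provider are carried over from the question):

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.channel.QueueChannel;
import org.springframework.integration.jdbc.store.JdbcChannelMessageStore;
import org.springframework.integration.jdbc.store.channel.OracleChannelMessageStoreQueryProvider;
import org.springframework.integration.store.MessageGroupQueue;

@Configuration
public class IntegrationConfig {

    // The DataSource argument is injected by the container; with Spring Boot it is
    // the auto-configured data source built from the spring.datasource.* properties.
    @Bean
    public JdbcChannelMessageStore outgoingMessageChannelStore(DataSource dataSource) {
        JdbcChannelMessageStore store = new JdbcChannelMessageStore(dataSource);
        store.setChannelMessageStoreQueryProvider(new OracleChannelMessageStoreQueryProvider());
        store.setRegion("TX_TIMEOUT");
        // no usingIdCache: it is not needed with the Oracle query provider
        return store;
    }

    // Queue channel backed by the JDBC message store; the group id is the channel name.
    @Bean
    public QueueChannel outgoingChannel(JdbcChannelMessageStore outgoingMessageChannelStore) {
        return new QueueChannel(new MessageGroupQueue(outgoingMessageChannelStore, "outgoingChannel"));
    }
}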
I am fairly new to Spring Batch. I have it working embedded inside my Spring MVC web application. It uses a DB2 database as the data store for job information. At present my batch-db2.properties reads as...
batch.jdbc.driver=com.ibm.db2.jcc.DB2Driver
batch.jdbc.url=<<my database url>>
batch.jdbc.user=<<my database user>>
batch.jdbc.password=<<my database password>>
batch.schema=<<my database schema>>
batch.jndi.name=jdbc/<<my jndi url>>
So I've set both the jdbc properties as well as the jndi property. The jobs are running fine, but my question is: what type of connection is my Spring Batch installation using?
If both are set, does it use JDBC or JNDI? Also, can someone point me to the Spring Batch documentation page that gives more information about these settings? I could not find it.
Here is my data source configuration....
<beans profile="server">
<bean id="ePosDataSource"
class="org.springframework.jndi.JndiObjectFactoryBean"
p:jndiName="java:comp/env/jdbc/reportingManagerDataSource"
p:lookupOnStartup="false"
p:cache="true"
p:proxyInterface="javax.sql.DataSource"/>
<jee:jndi-lookup id="ePosDataSource"
jndi-name="jdbc/reportingManagerDataSource"
cache="true"
resource-ref="true"
lookup-on-startup="false"
proxy-interface="javax.sql.DataSource"/>
<bean id="custDbDataSource"
class="org.springframework.jndi.JndiObjectFactoryBean"
p:jndiName="java:comp/env/jdbc/reportingManagerCustDbDataSource"
p:lookupOnStartup="false"
p:cache="true"
p:proxyInterface="javax.sql.DataSource" />
<jee:jndi-lookup id="custDbDataSource"
jndi-name="jdbc/reportingManagerCustDbDataSource"
cache="true"
resource-ref="true"
lookup-on-startup="false"
proxy-interface="javax.sql.DataSource"/>
</beans>
thanks!
I am trying to learn various technologies and want to do some UI work using HTML5 with jQuery.
In the UI I am making a call to a RESTful web service (Jersey) which generates JSON output.
In my web service I had stubbed the data that was supposed to come from the database (MySQL).
Now I want to learn the Spring JDBC template (and not use plain JDBC).
So my question is about using Spring (only Spring JDBC, not Spring MVC) in a web application which hosts my RESTful web service.
I want to use the Spring JDBC template, so I have written a Spring XML file where I create the necessary configuration:
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="....."
xsi:schemaLocation="....">
<bean id="examDAO" class="com.examscripts.mockexam.repository.ExamRepositoryImpl">
<property name="ds" ref="ds" />
</bean>
<bean id="ds" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value="com.mysql.jdbc.Driver" />
<property name="url" value="jdbc:mysql://localhost/mockexam" />
<property name="username" value="xxx" />
<property name="password" value="yyy" />
</bean>
</beans>
To load these in I added the following in web.xml:
<context-param>
<param-name>contextConfigLocation</param-name>
<param-value>classpath:spring.xml</param-value>
</context-param>
<listener>
<listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>
On Tomcat startup I can see that the file is loaded:
Loading XML bean definitions from class path resource ....
Now my question is: in my service layer, where I need to communicate with the DAO layer (JDBC), do I have to create the actual bean using something like this:
ClassPathXmlApplicationContext ctx = new ClassPathXmlApplicationContext("spring.xml");
// instantiate our spring dao object from the application context
ExamRepositoryImpl impl= (ExamRepositoryImpl)ctx.getBean("examDAO");
I am completely new to Spring, and googling always seems to show Spring MVC web apps or standalone Spring examples. My case is a web app without Spring MVC but using the Spring JDBC template.
Any pointers?
Loading XML bean definitions from class path resource ....
You're on the right track. Spring is loading your XML file (when you deploy the app) and instantiating the beans that you declared. Crank up the logging for org.springframework to get more details.
Now my question is: in my service layer, where I need to communicate with the DAO layer (JDBC), do I have to create the actual bean using something like this:
No, definitely not. The entire purpose of using a dependency injection framework like Spring is to avoid writing code like this.
Declare a setter in your service class that takes the DAO as an argument.
public void setExamDAO(ExamDao examDao) {
this.examDao = examDao;
}
Add a new XML bean definition for your service class in your XML file and create a new <property> that wires in the DAO:
<bean id="yourservice" class="com.foo.YourServiceClass">
<property name="examDao" ref="examDAO"/>
</bean>
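For completeness, a rough sketch of what the DAO behind the examDAO bean might look like, assuming ExamRepositoryImpl exposes the ds property from the XML above and delegates to JdbcTemplate (the query, table, and column names are made up for illustration):

import javax.sql.DataSource;

import org.springframework.jdbc.core.JdbcTemplate;

public class ExamRepositoryImpl {

    private JdbcTemplate jdbcTemplate;

    // Matches <property name="ds" ref="ds"/> in the XML configuration.
    public void setDs(DataSource ds) {
        this.jdbcTemplate = new JdbcTemplate(ds);
    }

    // Illustrative query; the exam table and title column are assumptions.
    public String findExamTitle(long examId) {
        return jdbcTemplate.queryForObject(
                "SELECT title FROM exam WHERE id = ?", String.class, examId);
    }
}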
We have developed a data persistence framework using MyBatis. The framework uses the plain MyBatis API. (We were prohibited from using mybatis-spring; do not ask why.)
Now we have to integrate this persistence framework with another framework developed by other teams. That framework heavily uses Spring transactions for everything. Our persistence framework's DAOs will be used by this framework within its own API, which means the Spring-managed transactions will be propagated to the MyBatis DAOs. It is expected that our MyBatis-based persistence framework should participate in Spring-managed transactions without any issues.
There are two options for us to make this work:
(1) Change our persistence framework to use the mybatis-spring module: change the DAOs to use mappers injected directly by Spring and Spring's SqlSessionFactoryBean. I built a small example simulating both frameworks and everything works without any issue. The problem with this approach is that it requires changing almost all the DAOs to use Spring-injected mappers and extensively re-testing the framework. We simply do not have the time, given our delivery timeline.
(2) Use mybatis-spring and define the SqlSessionFactory via Spring, setting the data source and transaction manager used by the other framework. Something like:
<bean id="smpDataSource" class="oracle.jdbc.pool.OracleDataSource" destroy-method="close">
<property name="connectionCachingEnabled" value="true" />
<property name="URL"> <value>${db.thin.url}</value></property>
<property name="user"> <value>${db.user}</value></property>
<property name="password"><value>${db.password}</value>
</property>
</bean>
<bean id="dbTransactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource" ref="smpDataSource" />
</bean>
<bean id="sqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
<property name="dataSource" ref="smpDataSource" />
<property name="typeAliasesPackage" value="spike.smp51.domain" />
<property name="mapperLocations" value="classpath*:spike/smp51/mappers/*.xml" </bean>
Then in the application code a MyBatis DAO gets the SqlSessionFactory from Spring, like this:
public static SqlSessionFactory getSqlSessionFactory() throws Exception {
    // ctx is the Spring ApplicationContext that loads the configuration above
    DefaultSqlSessionFactory sessionFactory = (DefaultSqlSessionFactory) ctx.getBean("sqlSessionFactory");
    return sessionFactory;
}
All DAOs already use a SqlSessionFactory to open and close sessions; we would just replace the MyBatis-created SqlSessionFactory with the Spring-created one. That way we will have only a few lines of changes.
This approach is outlined here
http://mybatis.github.io/spring/using-api.html
The mybatis documentation warns about this approach – specifically that it will not participate in spring transactions.
When I tried the second approach, our framework was able to participate in Spring transactions, which is strange. Is the MyBatis documentation incorrect, then? I verified it extensively by creating various transaction boundaries using Spring transactions + AOP, and the MyBatis DAOs participate in the Spring-managed transactions every time. Since this second approach will save us 90% of the development time, we would really like to use it, but we are worried since MyBatis warns against it. Has anyone tried this approach? Any feedback is greatly appreciated.
Did you ever get an answer on this?
I'm wondering whether the documentation is talking about org.apache.ibatis.session.SqlSessionFactory from the MyBatis API, while the SqlSessionFactory you're using is the one built by the mybatis-spring library's org.mybatis.spring.SqlSessionFactoryBean.
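One way to check what this comment describes, sketched under the assumption that ctx is the same Spring ApplicationContext used above (the class names in the comments are what mybatis-spring typically produces, not guaranteed):

import org.apache.ibatis.session.SqlSessionFactory;
import org.springframework.context.ApplicationContext;

public final class SqlSessionFactoryInspector {

    private SqlSessionFactoryInspector() {
    }

    public static void inspect(ApplicationContext ctx) {
        SqlSessionFactory factory = ctx.getBean("sqlSessionFactory", SqlSessionFactory.class);

        // Typically org.apache.ibatis.session.defaults.DefaultSqlSessionFactory,
        // i.e. still the plain MyBatis API type.
        System.out.println(factory.getClass().getName());

        // When built by SqlSessionFactoryBean with no explicit transactionFactory,
        // this is usually org.mybatis.spring.transaction.SpringManagedTransactionFactory,
        // which obtains connections in a Spring-transaction-aware way.
        System.out.println(factory.getConfiguration()
                .getEnvironment()
                .getTransactionFactory()
                .getClass()
                .getName());
    }
}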
I have multiple data sources, and one database is configured with JPA. I am using WebSphere 7. I want all these data sources to participate in a global transaction. I am using the Spring configuration below, but the transactions are not behaving as a global transaction: if one database fails, the other database still gets committed, which is not what I expect from a single global transaction. Can you please help me see where I am going wrong?
I have two data sources: one configured below with id="US_ICFS_DATASORCE" and another used via JPA.
<jee:jndi-lookup id="entityManagerFactory" jndi-name="persistence/persistenceUnit"/>
<bean id="pabpp" class="org.springframework.orm.jpa.support.PersistenceAnnotationBeanPostProcessor"/>
<bean id="transactionManager" class="org.springframework.transaction.jta.JtaTransactionManager" />
<!-- Needed for #Transactional annotation -->
<tx:annotation-driven/>
<jee:jndi-lookup id="US_ICFS_DATASORCE"
jndi-name="jdbc/financing_tools_docgen_txtmgr"
cache="true"
resource-ref="true"
proxy-interface="javax.sql.DataSource" />
I have also added the code below to web.xml:
<persistence-unit-ref>
<persistence-unit-ref-name>persistence/persistenceUnit</persistence-unit-ref-name>
<persistence-unit-name>persistenceUnit</persistence-unit-name>
</persistence-unit-ref>
<persistence-context-ref>
<persistence-context-ref-name>persistence/persistenceUnit</persistence-context-ref-name>
<persistence-unit-name>persistenceUnit</persistence-unit-name>
</persistence-context-ref>
Below is the code where I am using the transaction:
@Transactional
public TemplateMapping addTemplateMapping(User user, TemplateMapping templateMapping)
        throws TemplateMappingServiceException { .... }
On WebSphere you should use this bean to hook into the WebSphere transaction manager:
<bean id="transactionManager"
class="org.springframework.transaction.jta.WebSphereUowTransactionManager"/>
See also this article
EDIT:
In order to use 2-phase commit (i.e. ensuring consistency across multiple resources), you will need to use XA data sources. See this article for details.
First of all, the data sources that participate in the global transaction must be of type javax.sql.XADataSource.
You also have to set the transaction type in your persistence unit to JTA (not RESOURCE_LOCAL).
And you need to inform your JPA implementation that you want to use global transactions.
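Putting these points together, a minimal Java-config sketch (the JNDI name is taken from the question; the data source must be defined as XA-capable in the WebSphere console, and the persistence unit still needs transaction-type="JTA" in persistence.xml):

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.lookup.JndiDataSourceLookup;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import org.springframework.transaction.jta.WebSphereUowTransactionManager;

@Configuration
@EnableTransactionManagement
public class GlobalTxConfig {

    // Container-managed data source looked up from WebSphere JNDI;
    // it has to be configured as an XA data source on the server side.
    @Bean
    public DataSource usIcfsDataSource() {
        return new JndiDataSourceLookup().getDataSource("jdbc/financing_tools_docgen_txtmgr");
    }

    // Delegates @Transactional demarcation to WebSphere's unit-of-work manager,
    // so all enlisted XA resources commit or roll back together.
    @Bean
    public PlatformTransactionManager transactionManager() {
        return new WebSphereUowTransactionManager();
    }
}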