Connection pool replacement for an already implemented Spring JdbcTemplate project

I am doing a mid-size project with Spring JDBC and MS SQL Server, and the project is almost 50% done. Every request does a lot of inserts and updates, especially against tables with many columns and large datasets; it is performing very slowly and sometimes shows "connection closed".
Now I am thinking of integrating c3p0 or a similar connection pool, but I can't change any of the DAO code I have already written.
I implemented a DAOHelper class with a JdbcTemplate field, inject the JdbcTemplate dependency in applicationContext.xml, and autowire the DAO class into the controller. All DAO classes extend this DAOHelper and use its jdbcTemplate for JDBC operations.
<bean id="ds" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver"/>
<property name="url" value="jdbc:sqlserver://192.168.1.101:1433;databaseName=OrderManager"/>
<property name="username" value="sa"/>
<property name="password" value="520759"/>
</bean>
<bean id="JdbcDataSource" class="org.springframework.jdbc.core.JdbcTemplate">
<property name="dataSource" ref="ds"/>
</bean>
<bean id="OrderDAO" class="com.ordermanager.order.dao.OrderDAO" >
<property name="jdbcTemplate" ref="JdbcDataSource"/>
<property name="transactionManager" ref="transactionManager"/>
</bean>
@Controller
public class OrderController {
    @Autowired
    OrderDAO orderDAO;

    @RequestMapping(value = "/addNewItem", method = RequestMethod.GET)
    public ModelAndView addItem(@RequestParam("ParamData") JSONObject paramJson) {
        ApplicationContext ctx = new FileSystemXmlApplicationContext(ConstantContainer.Application_Context_File_Path);
        OrderDAO orderDAO = (OrderDAO) ctx.getBean("OrderDAO");
        return new ModelAndView("MakeResponse", "responseValue", orderDAO.addItem(paramJson));
    }
}
public class DAOHelper {
    private JdbcTemplate jdbcTemplate;
    private PlatformTransactionManager transactionManager;

    public PlatformTransactionManager getTransactionManager() {
        return transactionManager;
    }

    public void setTransactionManager(PlatformTransactionManager txManager) {
        this.transactionManager = txManager;
    }

    public JdbcTemplate getJdbcTemplate() /* I am using this method for all JDBC tasks */ {
        return jdbcTemplate;
    }

    public void setJdbcTemplate(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }
}
Now, with minimal code changes, how can I integrate c3p0 or any good connection pooling library with my already written code?

Just replace the ds bean in your XML config with the following, and consider adding other c3p0 properties to suit your needs. Make sure the c3p0 jar is on your classpath.
<bean id="ds" class="com.mchange.v2.c3p0.ComboPooledDataSource"
destroy-method="close">
<property name="driverClass" value="com.microsoft.sqlserver.jdbc.SQLServerDriver" />
<property name="jdbcUrl" value="jdbc:sqlserver://192.168.1.101:1433;databaseName=OrderManager" />
<property name="user" value="sa" />
<property name="password" value="520789" />
</bean>
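Note that DriverManagerDataSource opens a new physical connection for every request and does no pooling, which fits the slowness and "connection closed" symptoms described. Beyond the bean swap, a few commonly tuned c3p0 properties, placed inside the same ComboPooledDataSource bean, might look like this (the values are illustrative, not recommendations):
<!-- illustrative c3p0 pool-sizing properties; tune for your own workload -->
<property name="minPoolSize" value="5" />
<property name="maxPoolSize" value="50" />
<property name="acquireIncrement" value="5" />
<property name="maxIdleTime" value="600" />
<property name="idleConnectionTestPeriod" value="300" />
<property name="preferredTestQuery" value="SELECT 1" />
Because the JdbcTemplate bean only depends on the DataSource interface, none of the DAOHelper or DAO code needs to change.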

Related

Spring Batch doesn't start a transaction when using a custom writer

I am working on a Spring Batch job which reads from a CSV file and writes into a database. I am using FlatFileItemReader for reading the file and implemented an ItemWriter which uses JPA to insert data into the database. But the batch fails with "no transaction is in progress".
Here is my job configuration:
<bean id="datasource" class="oracle.jdbc.pool.OracleDataSource">
<property name="user" value="xxx" />
<property name="password" value="xx" />
<property name="URL" value="xxx" />
</bean>
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
<bean id="entityManagerFactory" name="model"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="dataSource" ref="datasource"/>
<property name="jpaVendorAdapter">
<bean
class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="generateDdl" value="false"/>
<property name="showSql" value="true"/>
<property name="database">
<util:constant
static-field="org.springframework.orm.jpa.vendor.Database.ORACLE"/>
</property>
<property name="databasePlatform" value="org.hibernate.dialect.Oracle12cDialect"/>
</bean>
</property>
</bean>
<batch:job id="testJob">
<batch:step id="step">
<batch:tasklet>
<batch:chunk reader="cvsFileItemReader" writer="databaseWriter"
commit-interval="10">
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
And below is my writer
public class DatabaseWriter implements ItemWriter<Report> {

    private EntityManagerFactory entityManagerFactory;

    @Autowired
    public DatabaseWriter(EntityManagerFactory entityManagerFactory) {
        this.entityManagerFactory = entityManagerFactory;
    }

    @Override
    public void write(List<? extends Report> list) throws Exception {
        EntityManager entityManager = entityManagerFactory.createEntityManager();
        for (Report report : list) {
            entityManager.persist(report);
        }
        entityManager.flush();
    }
}
It only works if I start the transaction explicitly, like below:
public class DatabaseWriter implements ItemWriter<Report> {

    private EntityManagerFactory entityManagerFactory;

    @Autowired
    public DatabaseWriter(EntityManagerFactory entityManagerFactory) {
        this.entityManagerFactory = entityManagerFactory;
    }

    @Override
    public void write(List<? extends Report> list) throws Exception {
        EntityManager entityManager = entityManagerFactory.createEntityManager();
        entityManager.getTransaction().begin();
        for (Report report : list) {
            entityManager.persist(report);
        }
        entityManager.getTransaction().commit();
    }
}
Is it really necessary to manage the transaction explicitly, and doesn't Spring Batch control it out of the box?
EDIT
I fixed it by changing the writer like this:
public class DatabaseWriter implements ItemWriter<Report> {

    @PersistenceContext
    private EntityManager entityManager;

    @Override
    public void write(List<? extends Report> list) {
        for (Report report : list) {
            entityManager.persist(report);
        }
    }
}
Changing the EntityManagerFactory to an EntityManager injected with @PersistenceContext fixed the problem, but I am struggling to understand the reason for this behaviour.
Create a DAO, move your JPA code there, and autowire the DAO in your writer.
Also make sure you inject the entity manager as shown below:
@PersistenceContext
private EntityManager em;
But I am struggling to understand the reason for this behaviour
The first approach does not work because you create a transaction manually while your code is actually running within the scope of a transaction driven by Spring Batch. So there will be two transactions with different contexts.
You should keep in mind that a tasklet is executed in a transaction driven by Spring Batch (including the item writer for a chunk-oriented tasklet). So any code running in that scope should conform to the transaction definition of the tasklet. In your case, the EntityManager injected in the writer is driven by the JpaTransactionManager you defined, which is also used by Spring Batch for the tasklet's transaction.
As a side note, you can use the JpaItemWriter provided by Spring Batch instead of writing a custom writer. The code is almost identical.
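For reference, a minimal XML definition of that writer, assuming it replaces the databaseWriter bean referenced by the chunk element and reuses the entityManagerFactory defined above, could look like this:
<bean id="databaseWriter" class="org.springframework.batch.item.database.JpaItemWriter">
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
Since JpaItemWriter participates in the transaction that Spring Batch opens for each chunk, no explicit begin/commit is needed.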

log4j2 JDBC appender with Spring

The Log4j2 JDBC appender can be set up using a pooled connection factory that is defined by a class and method (see the log4j2 Appenders documentation):
<ConnectionFactory class="net.example.db.ConnectionFactory" method="getDatabaseConnection" />
Using Spring, I already have a data source defined that provides a pooled connection:
<bean id="myDataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
<property name="driverClassName" value="${db_driver_class}" />
<property name="url" value="${db_jdbc_url}" />
<property name="username" value="${db_username}" />
<property name="password" value="${db_password}" />
<property name="initialSize" value="10" />
<property name="maxActive" value="100" />
<property name="maxIdle" value="50" />
<property name="minIdle" value="10" />
<property name="validationQuery" value="select 1" />
<property name="testOnBorrow" value="true" />
I would like to use the Spring connection pool for the JDBC appender. Any idea how this can be done?
thanks
Raz
I managed to create a 3-step solution:
1. Define a bean in the Spring context that provides access to a data source.
2. Build an implementation of the bean that provides the desired connection.
3. Build a static wrapper that can be accessed by the log4j JDBC appender.
1st step - bean declaration:
<bean id="springConnection" class="com.dal.entities.SpringConnection" scope="singleton">
<property name="dataSource" ref="myDataSource" />
</bean>
The 2nd step - bean implementation - is also simple:
class SpringConnection {
private DataSource dataSource;
public void setDataSource(DataSource dataSource) {
this.dataSource = dataSource;
}
Connection getConnection() throws Exception {
return dataSource.getConnection();
}
}
The 3rd part - a wrapper with a static method - is a bit more complex:
public class SpringAccessFactory {
private final SpringConnection springCon;
private static ApplicationContext context;
private interface Singleton {
final SpringAccessFactory INSTANCE = new SpringAccessFactory();
}
private SpringAccessFactory() {
this.springCon = context.getBean(SpringConnection.class);
}
public static Connection getConnection() throws Exception {
return Singleton.INSTANCE.springCon.getConnection();
}
public static void setContext( ApplicationContext context) {
SpringAccessFactory.context = context;
}
}
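With the wrapper in place, the appender's ConnectionFactory element (as quoted in the question) can point at the static method. Assuming SpringAccessFactory lives in the same com.dal.entities package as SpringConnection, that would be roughly:
<ConnectionFactory class="com.dal.entities.SpringAccessFactory" method="getConnection" />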
There are, however, two issues I have found so far:
1. You need to initialize the Spring context and pass it into the wrapper (SpringAccessFactory.setContext) before you start using the logger.
2. Initializing the Spring context early in the program may trigger @PostConstruct methods (if any exist) before you plan to run them.

How to configure multiple MyBatis datasources in Spring Boot?

With MyBatis-Spring-Boot-Starter we can easily integrate MyBatis with Spring Boot, and it works perfectly for one data source. However, we'd now like to add an extra data source to our project, and unfortunately that does not seem easy.
In the official MyBatis documentation, I see the following:
MyBatis-Spring-Boot-Starter will:
Autodetect an existing DataSource.
Will create and register an instance of a SqlSessionFactoryBean passing that DataSource as an input.
Will create and register an instance of a SqlSessionTemplate got out of the SqlSessionFactoryBean.
It looks like MyBatis-Spring-Boot-Starter supports only one data source at the moment. So the question is: how do I configure multiple MyBatis data sources in Spring Boot?
You outlined the 3 beans that are needed for MyBatis+Spring integration. These are created automatically for a single data source.
If you need two data sources, you have to create the 3 beans for each data source explicitly. So you'll be creating 6 beans (2 of type DataSource, 2 of type SqlSessionFactoryBean, and 2 of type SqlSessionTemplate).
To bind a DAO to a particular data source, use the sqlSessionTemplateRef or sqlSessionFactoryRef attribute of the @MapperScan annotation.
Also, I don't recommend going down the XML hell. I have used this approach in production with two data sources on various projects, without any ugly XML configs and with the SQL queries written as annotations.
The shame is that the MyBatis documentation is not great and most examples out there are in XML.
Add something like this to your Spring servlet XML:
<bean id="db2dataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
<property name="driverClassName"><value>${db2.database.driver}</value></property>
<property name="url"><value>${db2.database.url}</value></property>
<property name="username"><value>${db2.database.username}</value></property>
<property name="password"><value>${db2.database.password}</value></property>
<property name="maxActive"><value>${db2.database.maxactiveconnections}</value></property>
<property name="maxIdle"><value>${db2.database.idleconnections}</value></property>
<property name="initialSize"><value>${db2.database.initialSize}</value></property>
</bean>
<bean id="db2SqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
<property name="dataSource" ref="db2dataSource" />
<property name="configLocation" value="/WEB-INF/mybatis-config.xml"/>
</bean>
<bean id="db2Dao" class="org.mybatis.spring.mapper.MapperFactoryBean">
<property name="sqlSessionFactory" ref="db2SqlSessionFactory"/>
<property name="mapperInterface" value="com.dao.db2Dao" />
</bean>
<bean id="oracledataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
<property name="driverClassName"><value>${oracle.database.driver}</value></property>
<property name="url"><value>${oracle.database.url}</value></property>
<property name="username"><value>${oracle.database.username}</value></property>
<property name="password"><value>${oracle.database.password}</value></property>
<property name="maxActive"><value>${oracle.database.maxactiveconnections}</value></property>
<property name="maxIdle"><value>${oracle.database.idleconnections}</value></property>
<property name="initialSize"><value>${oracle.database.initialSize}</value></property>
</bean>
<bean id="oracleSqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
<property name="dataSource" ref="oracledataSource" />
<property name="configLocation" value="/WEB-INF/mybatis-config.xml"/>
</bean>
<bean id="oracleoardDao" class="org.mybatis.spring.mapper.MapperFactoryBean">
<property name="sqlSessionFactory" ref="oracleSqlSessionFactory"/>
<property name="mapperInterface" value="com.lodige.clcs.dao.oracleoardDao" />
</bean>
Maybe this is what you need
@Configuration
@MapperScan(basePackages = "com.neo.mapper.test1", sqlSessionTemplateRef = "test1SqlSessionTemplate")
public class DataSource1Config {

    @Bean(name = "test1DataSource")
    @ConfigurationProperties(prefix = "spring.datasource.test1")
    @Primary
    public DataSource testDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "test1SqlSessionFactory")
    @Primary
    public SqlSessionFactory testSqlSessionFactory(@Qualifier("test1DataSource") DataSource dataSource) throws Exception {
        SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
        bean.setDataSource(dataSource);
        return bean.getObject();
    }

    @Bean(name = "test1TransactionManager")
    @Primary
    public DataSourceTransactionManager testTransactionManager(@Qualifier("test1DataSource") DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean(name = "test1SqlSessionTemplate")
    @Primary
    public SqlSessionTemplate testSqlSessionTemplate(@Qualifier("test1SqlSessionFactory") SqlSessionFactory sqlSessionFactory) throws Exception {
        return new SqlSessionTemplate(sqlSessionFactory);
    }
}
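The second data source then needs a mirror-image configuration class without the @Primary markers. A sketch, with placeholder package, property prefix, and bean names, and imports omitted as in the snippet above:
@Configuration
@MapperScan(basePackages = "com.neo.mapper.test2", sqlSessionTemplateRef = "test2SqlSessionTemplate")
public class DataSource2Config {

    // second pool, bound to its own (placeholder) properties prefix
    @Bean(name = "test2DataSource")
    @ConfigurationProperties(prefix = "spring.datasource.test2")
    public DataSource test2DataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "test2SqlSessionFactory")
    public SqlSessionFactory test2SqlSessionFactory(@Qualifier("test2DataSource") DataSource dataSource) throws Exception {
        SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
        bean.setDataSource(dataSource);
        return bean.getObject();
    }

    @Bean(name = "test2TransactionManager")
    public DataSourceTransactionManager test2TransactionManager(@Qualifier("test2DataSource") DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean(name = "test2SqlSessionTemplate")
    public SqlSessionTemplate test2SqlSessionTemplate(@Qualifier("test2SqlSessionFactory") SqlSessionFactory sqlSessionFactory) throws Exception {
        return new SqlSessionTemplate(sqlSessionFactory);
    }
}
Mappers under com.neo.mapper.test2 will then use the second data source, while those under com.neo.mapper.test1 keep using the primary one.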

Bean property 'myDataSource' is not writable or has an invalid setter method

Getting the following error with this configuration in Spring beans:
<bean name="/login.htm" class="com.virtusa.web.EmployeeController">
<property name="service" ref="service"></property>
</bean>
<bean id="service" class="com.virtusa.service.ServiceIMP">
<property name="dao" ref="EmployeeDAO"></property>
</bean>
<bean id="EmployeeDAO" class="com.virtusa.dao.EmployeeDAO">
<property name="myDataSource" ref="myDataSource" />
</bean>
The data source setter in my DAO is as follows:
private DataSource myDataSource;
private JdbcTemplate jdbcTemplate;
public void setDataSource(DataSource myDataSource) {
this.myDataSource = myDataSource;
this.jdbcTemplate = new JdbcTemplate(myDataSource);
}
Since you have property name="myDataSource", your setter needs to be named setMyDataSource() rather than setDataSource().
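Applied to the DAO from the question, a sketch of the fix (only the setter name changes; alternatively, the property name in the bean definition could be changed to dataSource to match the existing setter):
// setter renamed to match the "myDataSource" property in the bean definition
private DataSource myDataSource;
private JdbcTemplate jdbcTemplate;

public void setMyDataSource(DataSource myDataSource) {
    this.myDataSource = myDataSource;
    this.jdbcTemplate = new JdbcTemplate(myDataSource);
}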

Datasource initialization at server start up

We have an application in which we use Spring for IoC. We have the dataSource bean configured in applicationContext.xml, and it is referenced in other bean definitions.
The dataSource bean definition looks like this:
<bean id="dbDataSource" class="org.apache.commons.dbcp.BasicDataSource"
destroy-method="close">
<property name="driverClassName" value="oracle.jdbc.driver.OracleDriver" />
<property name="url"
value="jdbc:oracle:oci:#TESTDB" />
<property name="username" value="TESTUSER" />
<property name="password" value="TESTPWD" />
<property name="initialSize" value="50" />
<property name="maxActive" value="40" />
<property name="maxIdle" value="10" />
<property name="minIdle" value="10" />
<property name="maxWait" value="-1" />
</bean>
<bean id="serviceDAO" class="com.test.impl.ServiceDAOImpl">
<property name="dataSource" ref="dbDataSource" />
</bean>
ServiceDAOImpl looks as follows:
public class ServiceDAOImpl implements ServiceDAO {
private JdbcTemplate jdbcTemplate;
public void setDataSource(DataSource dataSource) {
this.jdbcTemplate = new JdbcTemplate(dataSource);
}
@SuppressWarnings({ "rawtypes", "unchecked" })
public ValueObj readValue(String key) {
String query = "SELECT * FROM SERVICE_LOOKUP WHERE KEY=?";
/**
* Implement the RowMapper callback interface
*/
return (ValueObj) jdbcTemplate.queryForObject(query,
new Object[] { key }, new RowMapper() {
public Object mapRow(ResultSet resultSet, int rowNum)
throws SQLException {
return new ValueObj(resultSet.getString("KEY"),
resultSet.getString("VALUE"));
}
});
}
public ServiceDAOImpl() {
}
}
Now, at server startup the injection happens fine, and when we use the dataSource in ServiceDAOImpl the connection works fine. But the very first time a database call is made, it takes around 3 minutes to get the response back. I think this is because the pool is created during that first call, and we have set the parameter initialSize = 50 in applicationContext.xml.
So, to avoid this, we need a way for the pool to be created during application startup itself so it can be used directly.
Please suggest, and let me know if any clarification is required.
Regards
Saroj
There's a work-around for this. You could force the jdbcTemplate to use the DB connection at startup; see the link here for a detailed explanation.
<bean id="jdbcTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
<constructor-arg index="0" ref="dataSource"/>
<constructor-arg index="1" value="false"/>
</bean>
The second constructor-arg is the lazyInit flag.
Aravind A's solution is the preferred one, but in case you don't want to define an extra bean, you can point Spring to your DAO's init method:
<bean id="serviceDAO" class="com.test.impl.ServiceDAOImpl" init-method="init">
<property name="dataSource" ref="dbDataSource" />
</bean>
and then define ServiceDAOImpl.init(), which runs some SQL like SELECT 1 FROM SERVICE_LOOKUP LIMIT 1 or, even better, a no-op like SELECT 1:
public class ServiceDAOImpl implements ServiceDAO {
public void init() {
String query = "SELECT 1 FROM SERVICE_LOOKUP LIMIT 1";
int i = jdbcTemplate.queryForInt(query);
}
}
