Stored procedure isn't getting executed - JDBC

I really need your help. I have JDBC code that calls an Oracle stored procedure. This is the class that contains the code:
public class DAOImpl implements DAO {

    private String sql = "{call MAIN.SP_CALC(?,?,?,?,?)}";
    private Connection conn = null;

    public DAOImpl(Connection conn) {
        this.conn = conn;
    }

    @Override
    public synchronized void executeSP(String year, String month, Long id) throws SQLException {
        try (CallableStatement cs = conn.prepareCall(sql)) {
            cs.setObject(1, id);
            cs.setObject(2, year);
            cs.setObject(3, month);
            cs.setObject(4, 0);
            cs.registerOutParameter(5, Types.INTEGER);
            cs.execute();
        }
    }
}
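One detail worth noting: the OUT parameter registered at index 5 is never read back. A variant of executeSP that reads and logs it, purely as a diagnostic (what that value means depends on MAIN.SP_CALC, which isn't shown here):

@Override
public synchronized void executeSP(String year, String month, Long id) throws SQLException {
    try (CallableStatement cs = conn.prepareCall(sql)) {
        cs.setObject(1, id);
        cs.setObject(2, year);
        cs.setObject(3, month);
        cs.setObject(4, 0);
        cs.registerOutParameter(5, Types.INTEGER);
        cs.execute();
        // Diagnostic only: log whatever the procedure reports in its OUT parameter
        int outValue = cs.getInt(5);
        System.out.println("SP_CALC OUT parameter: " + outValue);
    }
}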
I have two clients for this class: a Java SE tester class and a stateless EJB. The Java SE main() method has this code:
try {
    DriverManager.registerDriver(new OracleDriver());
    Connection conn = DriverManager.getConnection(DB_URL, USER, PASS);
    DAOImpl dao = new DAOImpl(conn);
    dao.executeSP("2015", "02", 99561010L);
} catch (SQLException ex) {
    Logger.getLogger(SPTester.class.getName()).log(Level.SEVERE, null, ex);
}
The stateless EJB has this other code:
try {
    conn = ds.getConnection();
    DAO dao = new DAOImpl(conn);
    dao.executeSP("2015", "02", 99561010L);
} catch (SQLException ex) {
    throw new SQLException(ex);
} finally {
    conn.close();
}
ds is an instance variable injected into the EJB as a DataSource; it comes from a JDBC connection pool declared in GlassFish with type oracle.jdbc.xa.client.OracleXADataSource. The DB_URL, USER and PASS variables in the Java SE program have the same values as the ones configured for the JDBC connection pool.
As you'll notice, the only difference is the Connection object. Both the Java SE program and the EJB use the ojdbc6.jar driver, and both run on Java 1.7.0_u2, which is why I can use try-with-resources.
The problem is that only the Java SE version works! Both return after a few seconds of processing, but only the Java SE program actually executes the procedure. I have tried many things: CMT and BMT EJBs, wrapped and unwrapped types for id, synchronized and unsynchronized methods, etc. I need this code working in the EJB :(
What's wrong with my EJB?
Thanks in advance.

This is the CMT version of the EJB
@Stateless
@LocalBean
@Interceptors({PropagateSessionContextInterceptor.class})
public class CalcBean extends BaseContextSessionBean {

    @Resource(mappedName = "jdbc/maestro")
    protected DataSource ds;

    public Connection conn = null;

    public void executeSP(String year, String month, Long id) throws SQLException {
        try {
            conn = ds.getConnection();
            DAO dao = new DAOImpl(conn);
            dao.executeSP("2015", "02", 99561010L);
        } catch (SQLException ex) {
            throw new SQLException(ex);
        } finally {
            conn.close();
        }
    }
}
This is the BMT version of the EJB
@Stateless
@LocalBean
@Interceptors({PropagateSessionContextInterceptor.class})
@TransactionManagement(TransactionManagementType.BEAN)
public class CalcBean extends BaseContextSessionBean {

    @Resource(mappedName = "jdbc/maestro")
    protected DataSource ds;

    public Connection conn = null;

    @Resource
    UserTransaction tx;

    public void executeSP(String year, String month, Long id) throws SQLException {
        try {
            try {
                tx.begin();
                conn = ds.getConnection();
                DAO dao = new DAOImpl(conn);
                dao.executeSP("2015", "02", 99561010L);
                tx.commit();
            } catch (SQLException ex) {
                throw new SQLException(ex);
            } finally {
                conn.close();
            }
        } catch (Exception e) {
            throw new EJBException(e);
        }
    }
}
public class BaseContextSessionBean {

    @Resource
    public SessionContext ctx;
}
public class PropagateSessionContextInterceptor {

    @AroundInvoke
    public Object myInterceptor(InvocationContext ctx) throws Exception {
        BaseContextSessionBean sb = (BaseContextSessionBean) ctx.getTarget();
        ControlListener.setSessionContext(sb.ctx);
        Object o = ctx.proceed();
        return o;
    }
}
public interface DAO {
    public void executeSP(String year, String month, Long id) throws SQLException;
}
I don't know the reason for BaseContextSessionBean and PropagateSessionContextInterceptor. Originally the EJB used a useless JDBC framework (which relies on those classes), so I replaced it with the DAOImpl class. I use the DAO interface to decouple the implementation, and DAOImpl is what gets executed in the EJB (DAOImpl implements DAO; I forgot to show that earlier, so it is included in the code above).
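A stripped-down CMT variant, without the interceptor and the base class, could help rule those out. This is only a sketch of that idea, not a confirmed fix:

@Stateless
public class CalcBeanPlain {

    @Resource(mappedName = "jdbc/maestro")
    private DataSource ds;

    // Container-managed transaction: the container starts the XA transaction
    // before this method and commits it when the method returns normally.
    public void executeSP(String year, String month, Long id) throws SQLException {
        try (Connection conn = ds.getConnection()) {
            new DAOImpl(conn).executeSP(year, month, id);
        }
    }
}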

Related

How to mock bean HikariDataSource correctly?

I wrote an integration test using Mockito, but it only works when a connection to the database can be established. The test just checks that some endpoints are accessible and is not related to the data access layer, so I don't actually need a database for it yet.
The reason the test fails when the database is down is that HikariDataSource checks the connection to the database when Spring instantiates the context. Mocking doesn't return a Connection, which makes the application fail to start. The solution I found so far is to use an HSQL in-memory database, but to me that looks like a workaround. Is there another solution that provides some fake data?
I'm not sure this is an elegant solution, but I need tests like this to work:
mockMvc.perform(
        post("/some").contentType(MediaType.APPLICATION_JSON_UTF8)
                .content(objectMapper.writeValueAsString(someDto))
                .header(HttpHeaders.AUTHORIZATION, AUTH_HEADER)
                .accept(MediaType.APPLICATION_JSON_UTF8)
).andExpect(status().is(201));
After debugging and searching, I found a solution that allows starting the container without an in-memory database.
@TestConfiguration
@ComponentScan(basePackages = "com.test")
@ActiveProfiles("test")
public class TestConfig {

    // Other beans...

    @Bean
    public DataSource getDatasource() {
        return new MockDataSource();
    }
}
class MockDataSource implements DataSource {
@Override
public Connection getConnection() throws SQLException {
return createMockConnection();
}
@Override
public Connection getConnection(String username, String password) throws SQLException {
return getConnection();
}
@Override
public PrintWriter getLogWriter() throws SQLException {
return null;
}
@Override
public void setLogWriter(PrintWriter out) throws SQLException {
}
@Override
public void setLoginTimeout(int seconds) throws SQLException {
}
@Override
public int getLoginTimeout() throws SQLException {
return 0;
}
public Logger getParentLogger() throws SQLFeatureNotSupportedException {
return null;
}
@Override
public <T> T unwrap(Class<T> iface) throws SQLException {
return null;
}
@Override
public boolean isWrapperFor(Class<?> iface) throws SQLException {
return false;
}
public static Connection createMockConnection() throws SQLException {
// Setup mock connection
final Connection mockConnection = mock(Connection.class);
// Autocommit is always true by default
when(mockConnection.getAutoCommit()).thenReturn(true);
// Handle Connection.createStatement()
Statement statement = mock(Statement.class);
when(mockConnection.createStatement()).thenReturn(statement);
when(mockConnection.createStatement(anyInt(), anyInt())).thenReturn(statement);
when(mockConnection.createStatement(anyInt(), anyInt(), anyInt())).thenReturn(statement);
when(mockConnection.isValid(anyInt())).thenReturn(true);
// Handle Connection.prepareStatement()
PreparedStatement mockPreparedStatement = mock(PreparedStatement.class);
when(mockConnection.prepareStatement(anyString())).thenReturn(mockPreparedStatement);
when(mockConnection.prepareStatement(anyString(), anyInt())).thenReturn(mockPreparedStatement);
when(mockConnection.prepareStatement(anyString(), (int[]) any())).thenReturn(mockPreparedStatement);
when(mockConnection.prepareStatement(anyString(), (String[]) any())).thenReturn(mockPreparedStatement);
when(mockConnection.prepareStatement(anyString(), anyInt(), anyInt())).thenReturn(mockPreparedStatement);
when(mockConnection.prepareStatement(anyString(), anyInt(), anyInt(), anyInt())).thenReturn(mockPreparedStatement);
doAnswer((Answer<Void>) invocation -> null).doNothing().when(mockPreparedStatement).setInt(anyInt(), anyInt());
ResultSet mockResultSet = mock(ResultSet.class);
when(mockPreparedStatement.executeQuery()).thenReturn(mockResultSet);
when(mockResultSet.getString(anyInt())).thenReturn("aString");
when(mockResultSet.next()).thenReturn(true);
// Handle Connection.prepareCall()
CallableStatement mockCallableStatement = mock(CallableStatement.class);
when(mockConnection.prepareCall(anyString())).thenReturn(mockCallableStatement);
when(mockConnection.prepareCall(anyString(), anyInt(), anyInt())).thenReturn(mockCallableStatement);
when(mockConnection.prepareCall(anyString(), anyInt(), anyInt(), anyInt())).thenReturn(mockCallableStatement);
ResultSet mockResultSetTypeInfo = mock(ResultSet.class);
DatabaseMetaData mockDataBaseMetadata = mock(DatabaseMetaData.class);
when(mockDataBaseMetadata.getDatabaseProductName()).thenReturn("PostgreSQL");
when(mockDataBaseMetadata.getDatabaseMajorVersion()).thenReturn(8);
when(mockDataBaseMetadata.getDatabaseMinorVersion()).thenReturn(2);
when(mockDataBaseMetadata.getConnection()).thenReturn(mockConnection);
when(mockDataBaseMetadata.getTypeInfo()).thenReturn(mockResultSetTypeInfo);
when(mockConnection.getMetaData()).thenReturn(mockDataBaseMetadata);
// Handle Connection.close()
doAnswer((Answer<Void>) invocation -> null).doThrow(new SQLException("Connection is already closed")).when(mockConnection).close();
// Handle Connection.commit()
doAnswer((Answer<Void>) invocation -> null).doThrow(new SQLException("Transaction already committed")).when(mockConnection).commit();
// Handle Connection.rollback()
doAnswer((Answer<Void>) invocation -> null).doThrow(new SQLException("Transaction already rolled back")).when(mockConnection).rollback();
return mockConnection;
}
}
Mocking the DataSource allows the container to start and the POST call to the controller to be made with MockMvc.
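With this in place, a web-layer test can import the configuration so the mocked DataSource replaces the real one. A sketch of such a test (the test class name and request body are illustrative; "/some" and AUTH_HEADER come from the snippet above):

@SpringBootTest
@AutoConfigureMockMvc
@ActiveProfiles("test")
@Import(TestConfig.class)
class SomeControllerIT {

    @Autowired
    private MockMvc mockMvc;

    @Test
    void postSucceedsWithoutARealDatabase() throws Exception {
        // The endpoint is served normally; only the DataSource behind it is mocked
        mockMvc.perform(
                post("/some").contentType(MediaType.APPLICATION_JSON_UTF8)
                        .content("{}")
                        .header(HttpHeaders.AUTHORIZATION, AUTH_HEADER)
                        .accept(MediaType.APPLICATION_JSON_UTF8)
        ).andExpect(status().is(201));
    }
}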

Spring Boot + Hibernate + Oracle schema multitenancy

I'm trying to get a schema-based multitenancy solution working, similar to this example but with Oracle instead of Postgres.
For example, I have three schemas: FOO, BAR, and BAZ. BAR and BAZ each have a table called MESSAGES. FOO has been granted SELECT on both BAR.MESSAGES and BAZ.MESSAGES. So if I connect as FOO and then execute
SELECT * FROM BAR.MESSAGES;
then I get a result as expected. But if I leave out the schema name (e.g. SELECT * FROM MESSAGES), then I get ORA-00942: table or view does not exist (the connection is using the wrong schema).
Here's my Dao / repository:
@Repository
public interface MessageDao extends CrudRepository<Message, Long> {
}
The controller:
@GetMapping("/findAll")
public List<Message> findAll() {
    TenantContext.setCurrentTenant("BAR");
    var result = messageDao.findAll();
    return result;
}
The Config:
@Configuration
public class MessageConfig {

    @Bean
    public JpaVendorAdapter jpaVendorAdapter() {
        return new HibernateJpaVendorAdapter();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource,
            MultiTenantConnectionProvider multiTenantConnectionProvider,
            CurrentTenantIdentifierResolver tenantIdentifierResolver) {
        LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
        em.setDataSource(dataSource);
        em.setPackagesToScan(Message.class.getPackageName());
        em.setJpaVendorAdapter(this.jpaVendorAdapter());
        Map<String, Object> jpaProperties = new HashMap<>();
        jpaProperties.put(Environment.MULTI_TENANT, MultiTenancyStrategy.SCHEMA);
        jpaProperties.put(Environment.MULTI_TENANT_CONNECTION_PROVIDER, multiTenantConnectionProvider);
        jpaProperties.put(Environment.MULTI_TENANT_IDENTIFIER_RESOLVER, tenantIdentifierResolver);
        jpaProperties.put(Environment.FORMAT_SQL, true);
        em.setJpaPropertyMap(jpaProperties);
        return em;
    }
}
The MultitenantConnectionProvider:
@Component
public class MultiTenantConnectionProviderImpl implements MultiTenantConnectionProvider {
@Autowired
private DataSource dataSource;
@Override
public Connection getAnyConnection() throws SQLException {
return dataSource.getConnection();
}
@Override
public void releaseAnyConnection(Connection connection) throws SQLException {
connection.close();
}
@Override
public Connection getConnection(String currentTenantIdentifier) throws SQLException {
String tenantIdentifier = TenantContext.getCurrentTenant();
final Connection connection = getAnyConnection();
try (Statement statement = connection.createStatement()) {
statement.execute("ALTER SESSION SET CURRENT_SCHEMA = BAR");
} catch (SQLException e) {
throw new HibernateException("Problem setting schema to " + tenantIdentifier, e);
}
return connection;
}
@Override
public void releaseConnection(String tenantIdentifier, Connection connection) throws SQLException {
try (Statement statement = connection.createStatement()) {
statement.execute("ALTER SESSION SET CURRENT_SCHEMA = FOO");
} catch (SQLException e) {
throw new HibernateException("Problem setting schema to " + tenantIdentifier, e);
}
connection.close();
}
@SuppressWarnings("rawtypes")
@Override
public boolean isUnwrappableAs(Class unwrapType) {
return false;
}
@Override
public <T> T unwrap(Class<T> unwrapType) {
return null;
}
@Override
public boolean supportsAggressiveRelease() {
return true;
}
}
And the TenantIdentifierResolver (though not really relevant because I'm hard-coding the tenants right now in the ConnectionProviderImpl above):
@Component
public class TenantIdentifierResolver implements CurrentTenantIdentifierResolver {

    @Override
    public String resolveCurrentTenantIdentifier() {
        String tenantId = TenantContext.getCurrentTenant();
        if (tenantId != null) {
            return tenantId;
        }
        return "BAR";
    }

    @Override
    public boolean validateExistingCurrentSessions() {
        return true;
    }
}
Any ideas as to why the underlying Connection isn't switching schemas as expected?
UPDATE 1
Maybe it's something to do with the underlying Oracle connection. There is a property on OracleConnection named CONNECTION_PROPERTY_CREATE_DESCRIPTOR_USE_CURRENT_SCHEMA_FOR_SCHEMA_NAME. The documentation says:
The user also has an option to append the CURRENT_USER value to the
ADT name to obtain fully qualified name by setting this property to
true. Note that it takes a network round trip to fetch the
CURRENT_SCHEMA value.
But the problem remains even if I set this to true (-Doracle.jdbc.createDescriptorUseCurrentSchemaForSchemaName=true). This may be because the "username" on the Connection is still "FOO", even after altering the session to set the schema to "BAR" (currentSchema on the Connection is "BAR"). But that would mean that the OracleConnection documentation is incorrect, wouldn't it?
UPDATE 2
I did not include the fact that we are using Spring Data JPA here as well. Maybe that has something to do with the problem?
I have found that if I include the schema name hard-coded in the entity then it works (e.g. #Table(schema="BAR")), but having a hard-coded value there is not an acceptable solution.
It might also work if we rewrite the queries as native #Query and then include {h-schema} in the SQL, but in Hibernate this is the default schema, not the 'current' (dynamic) schema, so it's not quite right either.
It turns out that setting the current tenant on the first line of the Controller like that (TenantContext.setCurrentTenant("BAR")) is "too late" (Spring has already created a transaction?). I changed the implementation to use a servlet filter to set the tenant id from a header to a request attribute, and then fetch that attribute in the TenantIdentifierResolver, instead of using the TenantContext. Now it works as it should, without any of the stuff I mentioned in the updates.
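In outline, the filter-based approach looks like this (the header name "X-Tenant-ID" and the attribute name "TENANT_ID" are just placeholders):

@Component
public class TenantFilter extends OncePerRequestFilter {

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
                                    FilterChain filterChain) throws ServletException, IOException {
        // Copy the tenant id from a request header into a request attribute
        String tenant = request.getHeader("X-Tenant-ID");
        if (tenant != null) {
            request.setAttribute("TENANT_ID", tenant);
        }
        filterChain.doFilter(request, response);
    }
}

@Component
public class TenantIdentifierResolver implements CurrentTenantIdentifierResolver {

    @Override
    public String resolveCurrentTenantIdentifier() {
        // Read the attribute set by the filter for the current request
        RequestAttributes attrs = RequestContextHolder.getRequestAttributes();
        if (attrs != null) {
            Object tenant = attrs.getAttribute("TENANT_ID", RequestAttributes.SCOPE_REQUEST);
            if (tenant != null) {
                return tenant.toString();
            }
        }
        return "BAR"; // fallback, mirroring the original resolver
    }

    @Override
    public boolean validateExistingCurrentSessions() {
        return true;
    }
}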

Change Multitenancy sessions manually

I need to create a multitenancy application with the ability to switch between schemas inside my Java code (not based on a user request).
I've read articles:
https://fizzylogic.nl/2016/01/24/make-your-spring-boot-application-multi-tenant-aware-in-2-steps/
http://www.greggbolinger.com/tenant-per-schema-with-spring-boot/
The solution works fine when the schema is passed in the REST request.
However, I need to implement the following logic:
public void compare(String originalSchema, String secondSchema) {
    TenantContext.setCurrentTenant(originalSchema);
    List<MyObject> originalData = myRepository.findData();
    TenantContext.setCurrentTenant(secondSchema);
    List<MyObject> migratedData = myRepository.findData();
}
The point is that the connection is not switched when I manually set up the TenantContext. MultiTenantConnectionProviderImpl.getConnection is invoked only on the first call to my repository.
@Component
public class MultiTenantConnectionProviderImpl implements MultiTenantConnectionProvider {

    @Override
    public Connection getConnection(String tenantIdentifier) throws SQLException {
        final Connection connection = getAnyConnection();
        try {
            connection.createStatement().execute("ALTER SESSION SET CURRENT_SCHEMA = " + tenantIdentifier);
        } catch (SQLException e) {
            throw new HibernateException(
                    "Could not alter JDBC connection to specified schema [" + tenantIdentifier + "]", e);
        }
        return connection;
    }
}
Is it possible to force switching sessions?
Found a hard-coded solution.
@Service
public class DatabaseSessionManager {

    @PersistenceUnit
    private EntityManagerFactory entityManagerFactory;

    public void bindSession() {
        if (!TransactionSynchronizationManager.hasResource(entityManagerFactory)) {
            EntityManager entityManager = entityManagerFactory.createEntityManager();
            TransactionSynchronizationManager.bindResource(entityManagerFactory, new EntityManagerHolder(entityManager));
        }
    }

    public void unbindSession() {
        EntityManagerHolder emHolder = (EntityManagerHolder) TransactionSynchronizationManager
                .unbindResource(entityManagerFactory);
        EntityManagerFactoryUtils.closeEntityManager(emHolder.getEntityManager());
    }
}
Each block that loads data in a new tenant context should execute the following:
databaseSessionManager.unbindSession();
TenantContext.setCurrentTenant(schema);
databaseSessionManager.bindSession();
//execute selects
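Applied to the compare() method from the question, that gives something like the following (databaseSessionManager is an injected DatabaseSessionManager; the sketch assumes an EntityManager is already bound when the method is entered, e.g. by open-in-view, which is why it starts with unbindSession()):

public void compare(String originalSchema, String secondSchema) {
    databaseSessionManager.unbindSession();
    TenantContext.setCurrentTenant(originalSchema);
    databaseSessionManager.bindSession();
    List<MyObject> originalData = myRepository.findData();

    databaseSessionManager.unbindSession();
    TenantContext.setCurrentTenant(secondSchema);
    databaseSessionManager.bindSession();
    List<MyObject> migratedData = myRepository.findData();

    // ... compare originalData and migratedData ...
}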
Well, you need something like this:
public interface Service {
    List<MyObject> myObjects();
}

@Service
@Transactional(propagation = Propagation.REQUIRES_NEW)
public class ServiceImpl implements Service {

    @Autowired
    private MyRepository myRepository;

    @Override
    public List<MyObject> myObjects() {
        return myRepository.findData();
    }
}

@Service
public class AnotherService {

    @Autowired
    private Service service;

    public void compare(String originalSchema, String secondSchema) {
        TenantContext.setCurrentTenant(originalSchema);
        List<MyObject> originalData = service.myObjects();
        TenantContext.setCurrentTenant(secondSchema);
        List<MyObject> migratedData = service.myObjects();
    }
}
Try using
spring.jpa.open-in-view=false
in your application.properties file.
More info on this
What is this spring.jpa.open-in-view=true property in Spring Boot?
Hope this helps..

Define hive-jdbc JNDI data source on WebSphere

I am trying to set up a JNDI data source on WebSphere 8.5.5.11 using hive-jdbc.jar.
When using the WebSphere console form to create a new JDBC provider, there is a field for the implementation class name. WebSphere requires that the class implement javax.sql.XADataSource or javax.sql.ConnectionPoolDataSource. However, the hive-jdbc driver implements neither of those; it implements only javax.sql.DataSource.
For this reason it doesn't work; WebSphere reports an error when trying to save the form.
Any idea what I can do about this?
You can write a trivial implementation of javax.sql.ConnectionPoolDataSource that delegates to the javax.sql.DataSource implementation. Here is an example:
package example.datasource;
import java.sql.*;
import javax.sql.*;
public class HiveConnectionPoolDataSource extends org.apache.hive.jdbc.HiveDataSource implements ConnectionPoolDataSource {
public PooledConnection getPooledConnection() throws SQLException {
return new HivePooledConnection(null, null);
}
public PooledConnection getPooledConnection(String user, String password) throws SQLException {
return new HivePooledConnection(user, password);
}
public boolean isWrapperFor(Class<?> iface) throws SQLException {
return ConnectionPoolDataSource.class.equals(iface) || super.isWrapperFor(iface);
}
public <T> T unwrap(Class<T> iface) throws SQLException {
return ConnectionPoolDataSource.class.equals(iface) ? (T) this : super.unwrap(iface);
}
class HivePooledConnection implements PooledConnection {
private Connection con;
private final String user;
private final String password;
HivePooledConnection(String user, String password) {
this.user = user;
this.password = password;
}
public void addConnectionEventListener(ConnectionEventListener listener) {}
public void addStatementEventListener(StatementEventListener listener) {}
public void close() throws SQLException {
if (con != null) {
con.close();
con = null;
}
}
public Connection getConnection() throws SQLException {
if (con == null || con.isClosed()) {
con = user == null
? HiveConnectionPoolDataSource.this.getConnection()
: HiveConnectionPoolDataSource.this.getConnection(user, password);
return con;
} else
throw new IllegalStateException();
}
public void removeConnectionEventListener(ConnectionEventListener listener) {}
public void removeStatementEventListener(StatementEventListener listener) {}
}
}
Package your compiled class in a JAR alongside the JDBC driver JAR(s), and configure your custom JDBC provider in WebSphere Application Server to point at this JAR as though it were a part of the JDBC driver. Specify the implementation class name as example.datasource.HiveConnectionPoolDataSource or whatever package/name you chose for your own implementation. You should then be able to use the JDBC driver.
Also adding a link to the WebSphere Application Server request for enhancements page if anyone wants to request that support for javax.sql.DataSource be added.

Transactions with Guice and JDBC - Solution discussion

In my application I need to use plain JDBC together with Guice. However, Guice doesn't provide any built-in support for managing transactions; guice-persist only provides support based on JPA, which I cannot use.
So I tried to implement a simple solution to manage transactions with Guice and JDBC. Here is the first version:
Use a TransactionHolder to store the transaction per thread:
public class JdbcTransactionHolder {

    private static ThreadLocal<JdbcTransaction> currentTransaction = new ThreadLocal<JdbcTransaction>();

    public static void setCurrentTransaction(JdbcTransaction transaction) {
        currentTransaction.set(transaction);
    }

    public static JdbcTransaction getCurrentTransaction() {
        return currentTransaction.get();
    }

    public static void removeCurrentTransaction() {
        currentTransaction.remove();
    }
}
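The JdbcTransaction class is not shown above; judging from how it is used, a minimal sketch would look roughly like this (the real class presumably also implements javax.transaction.Transaction, which is omitted here for brevity):

public class JdbcTransaction {

    private final Connection connection;

    public JdbcTransaction(Connection connection) {
        this.connection = connection;
    }

    public Connection getConnection() {
        return connection;
    }

    public void commit() throws SQLException {
        connection.commit();
    }

    public void rollback() throws SQLException {
        connection.rollback();
    }
}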
Implement a transaction manager for JDBC; for now only the begin(), getTransaction(), commit() and rollback() methods:
public class JdbcTransactionManager implements TransactionManager {
@Inject
private DataSource dataSource;
@Override
public void begin() throws NotSupportedException, SystemException {
logger.debug("Start the transaction");
try {
JdbcTransaction tran = JdbcTransactionHolder.getCurrentTransaction();
Connection conn = null;
if(tran == null) {
conn = dataSource.getConnection();
}
else {
conn = tran.getConnection();
}
// We have to put the connection in the holder so that we can get later
// from the holder and use it in the same thread
logger.debug("Save the transaction for thread: {}.", Thread.currentThread());
JdbcTransactionHolder.setCurrentTransaction(new JdbcTransaction(conn));
} catch (Exception e) {
throw new RuntimeException(e);
}
}
@Override
public void commit() throws RollbackException, HeuristicMixedException,
HeuristicRollbackException, SecurityException,
IllegalStateException, SystemException {
logger.debug("Commit the transaction");
try {
logger.debug("Get the connection for thread: {}.", Thread.currentThread());
Transaction transaction = JdbcTransactionHolder.getCurrentTransaction();
transaction.commit();
}
catch(Exception e) {
throw new RuntimeException(e);
}
finally {
JdbcTransactionHolder.removeCurrentTransaction();
}
}
@Override
public Transaction getTransaction() throws SystemException {
logger.debug("Get transaction.");
final JdbcTransaction tran = JdbcTransactionHolder.getCurrentTransaction();
if(tran == null) {
throw new DBException("No transaction is available. TransactionManager.begin() is probably not yet called.");
}
return tran;
}
@Override
public void rollback() throws IllegalStateException, SecurityException,
SystemException {
logger.debug("Rollback the transaction");
try {
logger.debug("Get the transaction for thread: {}.", Thread.currentThread());
Transaction transaction = JdbcTransactionHolder.getCurrentTransaction();
transaction.rollback();
}
catch(Exception e) {
throw new RuntimeException(e);
}
finally {
JdbcTransactionHolder.removeCurrentTransaction();
}
}
}
Implement a wrapper for DataSource that can get the current connection from the transaction holder if a transaction has been started:
public class JdbcDataSource implements DataSource {
private final static org.slf4j.Logger logger = LoggerFactory.getLogger(JdbcDataSource.class);
private DataSource dataSource;
public JdbcDataSource(DataSource dataSource) {
this.dataSource = dataSource;
}
@Override
public PrintWriter getLogWriter() throws SQLException {
return dataSource.getLogWriter();
}
@Override
public int getLoginTimeout() throws SQLException {
return dataSource.getLoginTimeout();
}
@Override
public Logger getParentLogger() throws SQLFeatureNotSupportedException {
return dataSource.getParentLogger();
}
@Override
public void setLogWriter(PrintWriter out) throws SQLException {
this.dataSource.setLogWriter(out);
}
@Override
public void setLoginTimeout(int seconds) throws SQLException {
this.dataSource.setLoginTimeout(seconds);
}
@Override
public boolean isWrapperFor(Class<?> arg0) throws SQLException {
return this.dataSource.isWrapperFor(arg0);
}
@Override
public <T> T unwrap(Class<T> iface) throws SQLException {
return this.dataSource.unwrap(iface);
}
@Override
public Connection getConnection() throws SQLException {
JdbcTransaction transaction = JdbcTransactionHolder.getCurrentTransaction();
if(transaction != null) {
// we get the connection from the transaction
logger.debug("Transaction exists for the thread: {}.", Thread.currentThread());
return transaction.getConnection();
}
Connection conn = this.dataSource.getConnection();
conn.setAutoCommit(false);
return conn;
}
@Override
public Connection getConnection(String username, String password)
throws SQLException {
JdbcTransaction transaction = JdbcTransactionHolder.getCurrentTransaction();
if(transaction != null) {
// we get the connection from the transaction
logger.debug("Transaction exists for the thread: {}.", Thread.currentThread());
return transaction.getConnection();
}
return this.dataSource.getConnection(username, password);
}
}
Then create a DataSourceProvider so that we can inject the DataSource into any POJO using Guice:
public class DataSourceProvider implements Provider<DataSource> {
private static final Logger logger = LoggerFactory.getLogger(DataSourceProvider.class);
private DataSource dataSource;
public DataSourceProvider() {
JdbcConfig config = getConfig();
ComboPooledDataSource pooledDataSource = new ComboPooledDataSource();
try {
pooledDataSource.setDriverClass(config.getDriver());
} catch (Exception e) {
throw new RuntimeException(e);
}
pooledDataSource.setJdbcUrl(config.getUrl());
pooledDataSource.setUser(config.getUsername());
pooledDataSource.setPassword(config.getPassword() );
pooledDataSource.setMinPoolSize(config.getMinPoolSize());
pooledDataSource.setAcquireIncrement(5);
pooledDataSource.setMaxPoolSize(config.getMaxPoolSize());
pooledDataSource.setMaxStatements(config.getMaxStatementSize());
pooledDataSource.setAutoCommitOnClose(false);
this.dataSource = new JdbcDataSource(pooledDataSource);
}
private JdbcConfig getConfig() {
JdbcConfig config = new JdbcConfig();
Properties prop = new Properties();
try {
//load a properties file from class path, inside static method
prop.load(JdbcConfig.class.getResourceAsStream("/database.properties"));
//get the property value and print it out
config.setDriver(prop.getProperty("driver"));
config.setUrl(prop.getProperty("url"));
config.setUsername(prop.getProperty("username"));
config.setPassword(prop.getProperty("password"));
String maxPoolSize = prop.getProperty("maxPoolSize");
if(maxPoolSize != null) {
config.setMaxPoolSize(Integer.parseInt(maxPoolSize));
}
String maxStatementSize = prop.getProperty("maxStatementSize");
if(maxStatementSize != null) {
config.setMaxStatementSize(Integer.parseInt(maxStatementSize));
}
String minPoolSize = prop.getProperty("minPoolSize");
if(minPoolSize != null) {
config.setMinPoolSize(Integer.parseInt(minPoolSize));
}
}
catch (Exception ex) {
logger.error("Failed to load the config file!", ex);
throw new DBException("Cannot read the config file: database.properties. Please make sure the file is present in classpath.", ex);
}
return config;
}
@Override
public DataSource get() {
return dataSource;
}
}
And then implement a TransactionalMethodInterceptor to manage the transaction for methods annotated with @Transactional:
public class TransactionalMethodInterceptor implements MethodInterceptor {
private final static Logger logger = LoggerFactory.getLogger(TransactionalMethodInterceptor.class);
@Inject
private JdbcTransactionManager transactionManager;
@Override
public Object invoke(MethodInvocation method) throws Throwable {
try {
// Start the transaction
transactionManager.begin();
logger.debug("Start to invoke the method: " + method);
Object result = method.proceed();
logger.debug("Finish invoking the method: " + method);
transactionManager.commit();
return result;
} catch (Exception e) {
logger.error("Failed to commit transaction!", e);
try {
transactionManager.rollback();
}
catch(Exception ex) {
logger.warn("Cannot roll back transaction!", ex);
}
throw e;
}
}
}
Finally, the module code that puts it all together so that Guice can inject the instances:
bind(DataSource.class).toProvider(DataSourceProvider.class).in(Scopes.SINGLETON);
bind(TransactionManager.class).to(JdbcTransactionManager.class);
TransactionalMethodInterceptor transactionalMethodInterceptor = new TransactionalMethodInterceptor();
requestInjection(transactionalMethodInterceptor);
bindInterceptor(Matchers.any(), Matchers.annotatedWith(Transactional.class), transactionalMethodInterceptor);
bind(TestDao.class).to(JdbcTestDao.class);
bind(TestService.class).to(TestServiceImpl.class);
I use c3p0 for the DataSource pool, and this works just fine in my tests.
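For reference, a DAO and service wired through these bindings could look like the sketch below. The names TestDao, JdbcTestDao, TestService and TestServiceImpl come from the bindings above; the method names and SQL are made up for illustration:

public class JdbcTestDao implements TestDao {

    @Inject
    private DataSource dataSource; // resolves to the JdbcDataSource wrapper

    @Override
    public void insertName(String name) throws SQLException {
        // Inside a @Transactional call the wrapper hands back the connection
        // held by JdbcTransactionHolder, so this statement joins that transaction.
        Connection conn = dataSource.getConnection();
        try (PreparedStatement ps = conn.prepareStatement("INSERT INTO test_names (name) VALUES (?)")) {
            ps.setString(1, name);
            ps.executeUpdate();
        }
        // The connection is owned by the transaction, so it is not closed here;
        // commit/rollback happen in TransactionalMethodInterceptor.
    }
}

public class TestServiceImpl implements TestService {

    @Inject
    private TestDao testDao;

    @Override
    @Transactional
    public void createName(String name) throws SQLException {
        testDao.insertName(name);
    }
}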
I found another related question: Guice, JDBC and managing database connections.
But so far I haven't found any similar approach, except something in the Spring Framework, and even the Spring implementation seems quite complex.
I would like to ask if anyone has any suggestions about this solution.
Thanks.
