@Transactional timeout is not working in a Spring Boot application

I am using Spring Boot and JdbcTemplate in my application. I am trying to set a timeout for a select query, but it is not working:
the query takes longer than the configured timeout, yet no timeout exception is thrown.
@Service
@Slf4j
public class SchedulerService
{
@Autowired
UserService userExportService;
@Autowired
private UserExportDao userExportDao;
@Value("${queryTest}")
private String queryFetchByExportFlagCustom;
@Scheduled(fixedDelay=10000)
public void triggerUserExport() {
List<UserExportCustom> userList;
try {
userList = userExportDao.findByExportFlag(0, queryFetchByExportFlagCustom);
userExportService.exportUsers(userList, schedulerCount);
} catch (Exception e) {
e.printStackTrace();
}
}
}
@Repository
@Slf4j
public class UserExportDao extends JdbcDaoImpl<UserExportCustom, Long>
{
@Autowired
BeanPropertyRowMapper<UserExportCustom> userExportCustomRowMapper;
@Transactional(readOnly = true, timeout = 1)
public List<UserExportCustom> findByExportFlag(Integer exportFlag, String query)
{
List<UserExportCustom> userExportCustomList = null;
try
{
SqlParameterSource namedParameters = new MapSqlParameterSource().addValue("exportFlag", exportFlag, Types.INTEGER);
userExportCustomList = namedParameterJdbcTemplate.query(query, namedParameters,userExportCustomRowMapper);
}
catch (Exception e)
{
e.printStackTrace();
log.error("Error in findByExportFlag: \n" + e);
}
return userExportCustomList;
}
}
public class JdbcDaoImpl<T, ID> implements JdbcDao<T, ID> {
@Autowired
protected JdbcTemplate jdbcTemplate;
@Autowired
protected NamedParameterJdbcTemplate namedParameterJdbcTemplate;
@Override
public List<T> findAll() {
throw new IllegalStateException();
}
@Override
public T save(T api) {
throw new IllegalStateException();
}
@Override
public T update(T api) {
throw new IllegalStateException();
}
@Override
public T saveOrUpdate(T api) {
throw new IllegalStateException();
}
@Override
public T findOne(String unique) {
throw new IllegalStateException();
}
@Override
public T findOneById(ID id) {
throw new IllegalStateException();
}
@Override
public void delete(ID id) {
throw new IllegalStateException();
}
}
If the query takes more than 1 second, it should throw a timeout exception, but it does not.

A try {} catch {} inside the transactional method swallows the exception before the transaction infrastructure ever sees it, so @Transactional cannot do its job; remove the try/catch.
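A minimal sketch of the repository method with the try/catch removed, reusing the classes from the question; any QueryTimeoutException or TransactionTimedOutException can then propagate to the @Scheduled caller, which already catches and logs exceptions:

@Repository
@Slf4j
public class UserExportDao extends JdbcDaoImpl<UserExportCustom, Long> {

    @Autowired
    BeanPropertyRowMapper<UserExportCustom> userExportCustomRowMapper;

    // No try/catch here: let a timeout surface instead of being swallowed and logged.
    @Transactional(readOnly = true, timeout = 1)
    public List<UserExportCustom> findByExportFlag(Integer exportFlag, String query) {
        SqlParameterSource namedParameters =
                new MapSqlParameterSource().addValue("exportFlag", exportFlag, Types.INTEGER);
        return namedParameterJdbcTemplate.query(query, namedParameters, userExportCustomRowMapper);
    }
}

Note that the timeout is only applied to the statement when the query runs inside the transaction started by this proxied method, so the method must still be called from outside the bean, as the scheduler already does here.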

Related

Error creating bean with name 'securityConfig' defined in file

I'm trying to write a unit test for the methods of my project, which uses Spring Security.
When I run the project it works normally, but when I try to unit test the methods it gives me this error.
Description:
Parameter 0 of constructor in ...config.SecurityConfig required a bean of type 'org.springframework.security.core.userdetails.UserDetailsService' that could not be found
Action:
Consider defining a bean of type 'org.springframework.security.core.userdetails.UserDetailsService' in your configuration
This is my SecurityConfig.java code:
@Configuration
@EnableWebSecurity
@RequiredArgsConstructor
public class SecurityConfig extends WebSecurityConfigurerAdapter {
@Autowired
private final UserDetailsService userDetailsService;
private final BCryptPasswordEncoder bCryptPasswordEncoder;
@Override
protected void configure(AuthenticationManagerBuilder auth) throws Exception {
auth.userDetailsService(userDetailsService).passwordEncoder(bCryptPasswordEncoder);
}
@Override
protected void configure(HttpSecurity http) throws Exception {
CustomAuthenticationFilter customAuthenticationFilter = new CustomAuthenticationFilter(authenticationManagerBean());
http.csrf().disable();
http.sessionManagement().sessionCreationPolicy(SessionCreationPolicy.STATELESS);
http.authorizeRequests().antMatchers(GET, "/user/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(POST, "/user/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(PUT, "/user/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(DELETE, "/user/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(GET, "/category/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(POST, "/category/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(PUT, "/category/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(DELETE, "/category/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(GET, "/product/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(POST, "/product/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(PUT, "/product/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(DELETE, "/product/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(GET, "/shoppingcart/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(POST, "/shoppingcart/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(PUT, "/shoppingcart/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(DELETE, "/shoppingcart/**").hasAnyAuthority("ADMIN");
http.authorizeRequests().antMatchers(GET, "/product/**").hasAnyAuthority("USER");
http.authorizeRequests().anyRequest().authenticated();
http.addFilter(customAuthenticationFilter);
http.addFilterBefore(new CustomAuthorizationFilter(), UsernamePasswordAuthenticationFilter.class);
}
@Bean
@Override
public AuthenticationManager authenticationManagerBean() throws Exception{
return super.authenticationManagerBean();
}
}
This is the test I'm trying to write; it basically saves a category in the DB.
@WebMvcTest(controllers = CategoryRest.class)
public class CategoryRestTest extends AbstractUnitRestTest {
@MockBean
private CategoryService categoryService;
@Test
public void saveCategory() throws Exception {
CreateCategoryCmd cmd = new CreateCategoryCmd("Tehnika", "TV, USB", Collections.emptySet());
String jsonInString = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(cmd);
Category category = CategoryBuilder.categoryBelaTehnika();
doReturn(category).when(categoryService).save(any(CreateCategoryCmd.class));
mockMvc.perform(post("/category/save")
.contentType(MediaType.APPLICATION_JSON)
.content(jsonInString)).andDo(print())
.andExpect(status().isOk())
.andExpect(jsonPath("$.name").value(category.getName()));
}
}
And this is my UserDetailsService implementation:
@Service
@Transactional
@RequiredArgsConstructor
@Slf4j
public class UserServiceImpl implements UserService, UserDetailsService {
private final static Logger LOGGER = LoggerFactory.getLogger(UserServiceImpl.class);
private final UserDAO userDAO;
private final PayPalAccountDAO payPalAccountDAO;
private final RoleDao roleDao;
private final ShoppingCartDAO shoppingCartDAO;
private final PasswordEncoder passwordEncoder;
@Override
@Transactional
public User save(CreateUserCmd cmd) throws ServiceException {
User user = UserMapper.INSTANCE.createUserCmdToUser(cmd);
List<Role> roles = new ArrayList<>();
List<Role> role = new ArrayList<>();
Set<Role> ro = new HashSet<>();
roles = roleDao.findAll();
role.add(roles.get(0));
ro.addAll(role);
try {
user.setRoles(ro);
user.setPassword(passwordEncoder.encode(cmd.getPassword()));
user = userDAO.save(user);
} catch (DAOException e) {
LOGGER.error(null, e);
throw new ServiceException(ErrorCode.ERR_GEN_001, "Saving of user failed!", e);
}
return user;
}
@Override
public List<UserResult> findAll() {
return UserMapper.INSTANCE.listUserToListUserResult(userDAO.findAll());
}
@Override
public UserInfo findById(Long id) {
return UserMapper.INSTANCE.userToUserInfo(userDAO.findOne(id));
}
@Override
public void addAccount(PayPalAccount payPalAccount, User user) throws ServiceException{
try{
payPalAccount.setUserID(user);
payPalAccountDAO.save(payPalAccount);
} catch (DAOException e){
LOGGER.error(null, e);
throw new ServiceException(ErrorCode.ERR_GEN_001, "creating account failed");
}
}
@Override
public void addCart(ShoppingCart shoppingCart, User user) throws ServiceException {
try{
shoppingCart.setUser(user);
shoppingCart.setStatus(Status.NEW);
shoppingCart.setPrice(new BigDecimal(0));
shoppingCartDAO.save(shoppingCart);
} catch (DAOException e) {
LOGGER.error(null, e);
throw new ServiceException(ErrorCode.ERR_GEN_001, "creating cart failed ");
}
}
@Override
public void addRole(addRoleCmd cmd) throws ServiceException {
User user;
try{
user = userDAO.findOne(cmd.getId());
if(user == null){
throw new ServiceException(ErrorCode.ERR_GEN_002);
}
UserMapper.INSTANCE.addingRoletoUser(user, cmd);
user.getRoles().addAll(cmd.getRoles().stream()
.map(v ->{
Role rr = roleDao.findOne(v.getId());
rr.getUser().add(user);
return rr;
}).collect(Collectors.toSet()));
userDAO.merge(user);
}catch (DAOException e){
LOGGER.error(null, e);
throw new ServiceException(ErrorCode.ERR_GEN_001, "failed while adding new role", e);
}
}
@Override
public void update(UpdateUserCmd cmd) throws ServiceException {
User user;
try {
// check if entity still exists
user = userDAO.findOne(cmd.getId());
if (user == null) {
throw new ServiceException(ErrorCode.ERR_GEN_002);
}
UserMapper.INSTANCE.updateUserCmdToUser(user, cmd);
PayPalAccount palAccount = cmd.getPayPalAccount();
Set<ShoppingCart> shoppingCarts = cmd.getShoppingCarts();
for (ShoppingCart cart: shoppingCarts) {
addCart(cart, user);
}
user.setAccount(palAccount);
addAccount(palAccount, user);
userDAO.merge(user);
} catch (DAOException e) {
LOGGER.error(null, e);
throw new ServiceException(ErrorCode.ERR_GEN_001, "Update of user failed!", e);
}
}
@Override
public void delete(Long id) throws ServiceException {
User user = userDAO.findOne(id);
if (user != null) {
try {
userDAO.delete(user);
} catch (DAOException e) {
LOGGER.error(null, e);
throw new ServiceException(ErrorCode.ERR_GEN_001, "Delete of user failed!", e);
}
} else {
throw new ServiceException(ErrorCode.ERR_CAT_001, "User does not exist!");
}
}
@Override
public UserDetails loadUserByUsername(String username) throws UsernameNotFoundException {
User user = userDAO.findByUsername(username);
if(user == null){
LOGGER.error("User not found");
throw new UsernameNotFoundException("User not found in the database");
} else{
LOGGER.info("User found in the DB");
}
Collection<SimpleGrantedAuthority> authorities = new HashSet<>();
user.getRoles().forEach(role -> {
authorities.add(new SimpleGrantedAuthority(role.getName()));
});
return new org.springframework.security.core.userdetails.User(user.getUsername(), user.getPassword(), authorities);
}
}
Any suggestions?
The problem here is that, if you read the documentation for @WebMvcTest, it says straight out in the second paragraph:
Using this annotation will disable full auto-configuration and instead apply only configuration relevant to MVC tests (i.e. @Controller, @ControllerAdvice, @JsonComponent, Converter/GenericConverter, Filter, WebMvcConfigurer and HandlerMethodArgumentResolver beans but not @Component, @Service or @Repository beans).
This means it will only load a subset of the application.
The code provided shows
@WebMvcTest(controllers = CategoryRest.class)
which will only load the declared controller plus the bean types listed in the documentation.
The UserDetailsService implementation is annotated with @Service, which means it will NOT be loaded at startup.
If you want to load the full application you need to use @SpringBootTest in conjunction with other annotations, for instance @AutoConfigureMockMvc or @AutoConfigureWebTestClient, depending on which client you want to use.
All of this is properly documented, with easy-to-read instructions, in the testing chapter of the Spring Boot documentation.
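A minimal sketch of what that looks like for the test in the question. CategoryRest, CategoryService, CreateCategoryCmd and CategoryBuilder are taken from the question; the JSON body and the @WithMockUser authority are illustrative additions, not from the thread:

// Sketch: load the full application context instead of the MVC slice, so the
// @Service implementing UserDetailsService exists when SecurityConfig is created.
@SpringBootTest
@AutoConfigureMockMvc
public class CategoryRestIntegrationTest {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    private CategoryService categoryService; // still mocked, as in the original test

    @Test
    @WithMockUser(authorities = "ADMIN") // from spring-security-test; the real security filters now apply
    public void saveCategory() throws Exception {
        doReturn(CategoryBuilder.categoryBelaTehnika())
                .when(categoryService).save(any(CreateCategoryCmd.class));

        mockMvc.perform(post("/category/save")
                .contentType(MediaType.APPLICATION_JSON)
                .content("{\"name\": \"Tehnika\"}")) // illustrative payload
                .andExpect(status().isOk());
    }
}

One alternative is to keep @WebMvcTest and add @MockBean UserDetailsService (and @MockBean BCryptPasswordEncoder) to the test class, so SecurityConfig's constructor dependencies can be satisfied without loading the service layer. Note that with the full context the custom authentication/authorization filters also run, so the request may need extra setup depending on what they do.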

Spring Batch Reader is reading alternate records

I have created a sample Spring Batch application that reads records from a DB and displays them in the writer. However, only the even-numbered (alternate) records are printed.
It is not a database problem, as the behavior is the same with both the H2 and the Oracle database.
There are 100 records in total in my DB.
With JdbcCursorItemReader, only 50 records are read, and only alternate ones, as can be seen from the log snapshot.
With JdbcPagingItemReader, only 5 records are read, and again only alternate ones, as can be seen from the log snapshot.
My code configuration is given below. Why is the reader skipping the odd-numbered records?
@Bean
public ItemWriter<Safety> safetyWriter() {
return items -> {
for (Safety item : items) {
log.info(item.toString());
}
};
}
@Bean
public JdbcCursorItemReader<Safety> cursorItemReader() throws Exception {
JdbcCursorItemReader<Safety> reader = new JdbcCursorItemReader<>();
reader.setSql("select * from safety " );
reader.setDataSource(dataSource);
reader.setRowMapper(new SafetyRowMapper());
reader.setVerifyCursorPosition(false);
reader.afterPropertiesSet();
return reader;
}
@Bean
JdbcPagingItemReader<Safety> safetyPagingItemReader() throws Exception {
JdbcPagingItemReader<Safety> reader = new JdbcPagingItemReader<>();
reader.setDataSource(dataSource);
reader.setFetchSize(10);
reader.setRowMapper(new SafetyRowMapper());
H2PagingQueryProvider queryProvider = new H2PagingQueryProvider();
queryProvider.setSelectClause("*");
queryProvider.setFromClause("safety");
Map<String, Order> sortKeys = new HashMap<>(1);
sortKeys.put("id", Order.ASCENDING);
queryProvider.setSortKeys(sortKeys);
reader.setQueryProvider(queryProvider);
return reader;
}
@Bean
public Step importSafetyDetails() throws Exception {
return stepBuilderFactory.get("importSafetyDetails")
.<Safety, Safety>chunk(chunkSize)
//.reader(cursorItemReader())
.reader(safetyPagingItemReader())
.writer(safetyWriter())
.listener(new StepListener())
.listener(new ChunkListener())
.build();
}
@Bean
public Job job() throws Exception {
return jobBuilderFactory.get("job")
.start(importSafetyDetails())
.build();
}
Domain classes looks like below:
@NoArgsConstructor
@AllArgsConstructor
@Data
public class Safety {
private int id;
}
public class SafetyRowMapper implements RowMapper<Safety> {
@Override
public Safety mapRow(ResultSet resultSet, int i) throws SQLException {
if(resultSet.next()) {
Safety safety = new Safety();
safety.setId(resultSet.getInt("id"));
return safety;
}
return null;
}
}
@SpringBootApplication
@EnableBatchProcessing
public class SpringBatchSamplesApplication {
public static void main(String[] args) {
SpringApplication.run(SpringBatchSamplesApplication.class, args);
}
}
application.yml configuration is as below:
spring:
  application:
    name: spring-batch-samples
  main:
    allow-bean-definition-overriding: true
  datasource:
    url: jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE
    username: sa
    password:
    driver-class-name: org.h2.Driver
    hikari:
      connection-timeout: 20000
      maximum-pool-size: 10
  h2:
    console:
      enabled: true
  batch:
    initialize-schema: never
server:
  port: 9090
The SQL statements are as below:
CREATE TABLE safety (
id int NOT NULL,
CONSTRAINT PK_ID PRIMARY KEY (id)
);
INSERT INTO safety (id) VALUES (1);
...100 records are inserted
The listener classes are as below:
@Slf4j
public class StepListener {
@AfterStep
public ExitStatus afterStep(StepExecution stepExecution) {
log.info("In step {} ,Exit Status: {} ,Read Records: {} ,Committed Records: {} ,Skipped Read Records: {} ,Skipped Write Records: {}",
stepExecution.getStepName(),
stepExecution.getExitStatus().getExitCode(),
stepExecution.getReadCount(),
stepExecution.getCommitCount(),
stepExecution.getReadSkipCount(),
stepExecution.getWriteSkipCount());
return stepExecution.getExitStatus();
}
}
@Slf4j
public class ChunkListener {
@BeforeChunk
public void beforeChunk(ChunkContext context) {
log.info("<< Before the chunk");
}
@AfterChunk
public void afterChunk(ChunkContext context) {
log.info("<< After the chunk");
}
}
I tried to reproduce your problem, but I couldn't. It would be great if you could share more code.
Meanwhile, I created a simple job that reads 100 records from the "safety" table and prints them to the console, and it is working fine.
@SpringBootApplication
@EnableBatchProcessing
public class ReaderWriterProblem implements CommandLineRunner {
@Autowired
DataSource dataSource;
@Autowired
StepBuilderFactory stepBuilderFactory;
@Autowired
JobBuilderFactory jobBuilderFactory;
@Autowired
private JobLauncher jobLauncher;
@Autowired
private ApplicationContext context;
public static void main(String[] args) {
String[] arguments = new String[]{LocalDateTime.now().toString()};
SpringApplication.run(ReaderWriterProblem.class, arguments);
}
@Bean
public ItemWriter<Safety> safetyWriter() {
return new ItemWriter<Safety>() {
@Override
public void write(List<? extends Safety> items) throws Exception {
for (Safety item : items) {
//log.info(item.toString());
System.out.println(item);
}
}
};
}
// @Bean
// public JdbcCursorItemReader<Safety> cursorItemReader() throws Exception {
// JdbcCursorItemReader<Safety> reader = new JdbcCursorItemReader<>();
//
// reader.setSql("select * from safety ");
// reader.setDataSource(dataSource);
// reader.setRowMapper(new SafetyRowMapper());
// reader.setVerifyCursorPosition(false);
// reader.afterPropertiesSet();
//
// return reader;
// }
@Bean
JdbcPagingItemReader<Safety> safetyPagingItemReader() throws Exception {
JdbcPagingItemReader<Safety> reader = new JdbcPagingItemReader<>();
reader.setDataSource(dataSource);
reader.setFetchSize(10);
reader.setRowMapper(new SafetyRowMapper());
PostgresPagingQueryProvider queryProvider = new PostgresPagingQueryProvider();
queryProvider.setSelectClause("*");
queryProvider.setFromClause("safety");
Map<String, Order> sortKeys = new HashMap<>(1);
sortKeys.put("id", Order.ASCENDING);
queryProvider.setSortKeys(sortKeys);
reader.setQueryProvider(queryProvider);
return reader;
}
@Bean
public Step importSafetyDetails() throws Exception {
return stepBuilderFactory.get("importSafetyDetails")
.<Safety, Safety>chunk(5)
//.reader(cursorItemReader())
.reader(safetyPagingItemReader())
.writer(safetyWriter())
.listener(new MyStepListener())
.listener(new MyChunkListener())
.build();
}
@Bean
public Job job() throws Exception {
return jobBuilderFactory.get("job")
.listener(new JobListener())
.start(importSafetyDetails())
.build();
}
@Override
public void run(String... args) throws Exception {
JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
jobParametersBuilder.addString("date", LocalDateTime.now().toString());
try {
Job job = (Job) context.getBean("job");
jobLauncher.run(job, jobParametersBuilder.toJobParameters());
} catch (JobExecutionAlreadyRunningException | JobRestartException | JobInstanceAlreadyCompleteException | JobParametersInvalidException e) {
e.printStackTrace();
}
}
public static class JobListener implements JobExecutionListener {
@Override
public void beforeJob(JobExecution jobExecution) {
System.out.println("Before job");
}
@Override
public void afterJob(JobExecution jobExecution) {
System.out.println("After job");
}
}
private static class SafetyRowMapper implements RowMapper<Safety> {
@Override
public Safety mapRow(ResultSet resultSet, int i) throws SQLException {
Safety safety = new Safety();
safety.setId(resultSet.getLong("ID"));
return safety;
}
}
public static class MyStepListener implements StepExecutionListener {
@Override
public void beforeStep(StepExecution stepExecution) {
System.out.println("Before Step");
}
@Override
public ExitStatus afterStep(StepExecution stepExecution) {
System.out.println("After Step");
return ExitStatus.COMPLETED;
}
}
private static class MyChunkListener implements ChunkListener {
@Override
public void beforeChunk(ChunkContext context) {
System.out.println("Before Chunk");
}
@Override
public void afterChunk(ChunkContext context) {
System.out.println("After Chunk");
}
@Override
public void afterChunkError(ChunkContext context) {
}
}
}
Hope this helps
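One difference worth noting between the two versions: the SafetyRowMapper in the question calls resultSet.next() inside mapRow, while the one in this answer does not. Both JdbcCursorItemReader and the JdbcTemplate used by the paging reader advance the cursor themselves before invoking the mapper, so an extra next() inside mapRow consumes every other row. A minimal sketch of a mapper that only reads the current row, reusing the Safety class from the question:

// A RowMapper should not move the cursor; the caller has already positioned it on the current row.
public class SafetyRowMapper implements RowMapper<Safety> {
    @Override
    public Safety mapRow(ResultSet resultSet, int rowNum) throws SQLException {
        Safety safety = new Safety();
        safety.setId(resultSet.getInt("id"));
        return safety;
    }
}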

Spring Boot Isolation.SERIALIZABLE not working

I need help with this scenario. In theory the SERIALIZABLE isolation level should stop the delete from happening, but in this scenario the row with id 1 still gets deleted. I have tried @EnableTransactionManagement and the REPEATABLE_READ isolation level; neither blocks the delete nor causes the delete to throw an exception.
In summary, I need to stop any delete invocation while the update method is still running.
I am using an H2 in-memory database for this sample.
Thanks
Entity:
public class Something {
@Id
private Integer id;
private String name;
private String desc;
}
Repo:
public interface SomeRepository extends JpaRepository<Something, Integer> {
}
Service:
@Service
public class SomeService {
@Autowired
private SomeRepository someRepository;
public void deleteSomething2(Something something) {
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
e.printStackTrace();
}
someRepository.delete(something);
}
@Transactional(isolation = Isolation.SERIALIZABLE)
public void updateSomething2(Something something) {
Something something1 = someRepository.findById(1).get();
try {
Thread.sleep(10000);
} catch (InterruptedException e) {
e.printStackTrace();
}
something1.setName("namanama");
someRepository.saveAndFlush(something1);
}
Test:
@RunWith(SpringRunner.class)
@SpringBootTest
public class DemoApplicationTests {
@Autowired
private SomeService service;
@Test
public void test() throws Exception {
ExecutorService executorService = Executors.newFixedThreadPool(10);
List<Future> futures = new ArrayList<>();
futures.add(executorService.submit(() -> service.updateSomething2(Something.builder().id(1).name("namaone").build())));
futures.add(executorService.submit(() -> service.deleteSomething2(Something.builder().id(1).build())));
while(futures.stream().filter(f -> f.isDone() == false).count() > 0) {
Thread.sleep(3000);
}
List<Something> all = service.findAll();
System.out.println(all);
}
}
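For reference, a sketch (not from the original thread) of one way to make the delete wait for the in-flight update: take a pessimistic row lock in the update transaction, so the delete blocks, or fails with a lock timeout, at the database level until the update commits. The method name findByIdForUpdate is made up; the Something entity and SomeRepository come from the question:

public interface SomeRepository extends JpaRepository<Something, Integer> {

    // Issues SELECT ... FOR UPDATE: the row lock is held until the surrounding transaction ends.
    @Lock(LockModeType.PESSIMISTIC_WRITE)
    @Query("select s from Something s where s.id = :id")
    Optional<Something> findByIdForUpdate(@Param("id") Integer id);
}

@Service
public class SomeService {

    @Autowired
    private SomeRepository someRepository;

    @Transactional
    public void updateSomething2(Something something) {
        Something something1 = someRepository.findByIdForUpdate(1).orElseThrow();
        // ... long-running work ...
        something1.setName("namanama");
        someRepository.saveAndFlush(something1);
    }

    // The delete still runs in the repository's own transaction; its DELETE statement
    // now has to wait for (or fail against) the row lock held by updateSomething2.
    public void deleteSomething2(Something something) {
        someRepository.delete(something);
    }
}

Isolation levels only constrain the transaction they are declared on, and how (or whether) SERIALIZABLE blocks concurrent writers is database-specific, which is why an explicit lock is the more predictable way to express "no deletes while the update is running".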

Logging requests and responses in Spring

I'm trying to implement a logging system in a Spring Boot application. There are requests coming into the system, and each request has one or more responses.
Requests and responses must be logged into the database in a separate thread, not in the worker thread.
This is my idea:
tables in MySQL - "request" with the required columns, and "response" with request_id as a foreign key
relation between request and response - one to many
A separate thread in LogService is started in @PostConstruct to save the data in the DB.
I'm sure there are better solutions to this problem. Please guide me with some suggestions.
@Service
public class LogServiceImpl implements LogService {
private final BlockingQueue<Object> logQueue = new LinkedBlockingQueue<>();
private volatile boolean done;
// repositories
@Autowired
private RequestRepository requestRepository;
@Autowired
private ResponseRepository responseRepository;
@Async
@Override
public void log(Object obj) {
try {
logQueue.put(obj);
} catch (InterruptedException e) {
throw new RuntimeException(e);
}
}
@PostConstruct
private void saveToDb() {
new Thread(() -> {
while (!done) {
try {
Object object = logQueue.poll(5, TimeUnit.SECONDS);
if (object instanceof Request) {
requestRepository.save((Request) object);
} else if (object instanceof Response) {
responseRepository.save((Response) object);
}
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
return;
}
}
}).start();
}
public void stop() {
done = true;
}
}
class Request{
.....
}
class Response{
......
}
@Service
public class SomeService1 {
@Autowired
private LogService logService;
public void someMeth1(Request request) {
....
logService.log(request);
}
}
@Service
public class SomeService2 {
@Autowired
private LogService logService;
public void someMeth2(Response response) {
....
logService.log(response);
}
}
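As one possible simplification (a sketch, not from the thread): since log() is already annotated with @Async, Spring can run it on its task executor, so the service could save directly to the repositories without the hand-rolled thread and queue, provided @EnableAsync is declared on a configuration class:

// Sketch: rely on @Async's executor thread instead of a custom queue-draining thread.
@Service
public class LogServiceImpl implements LogService {

    @Autowired
    private RequestRepository requestRepository;
    @Autowired
    private ResponseRepository responseRepository;

    @Async
    @Override
    public void log(Object obj) {
        if (obj instanceof Request) {
            requestRepository.save((Request) obj);
        } else if (obj instanceof Response) {
            responseRepository.save((Response) obj);
        }
    }
}

The trade-off is that each log call becomes its own task (and its own save) instead of being batched from a queue, which may or may not matter for the expected request volume.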

Transactions with Guice and JDBC - Solution discussion

In my application, I need to use pure JDBC together with Guice. However, Guice doesn't provide any built-in support to manage transactions. guice-persist only provides support based on JPA, which I cannot use.
So I tried to implement a simple solution to manage transactions with Guice and JDBC. Here is the first version.
Use a JdbcTransactionHolder to store the transaction per thread:
public class JdbcTransactionHolder {
private static ThreadLocal<JdbcTransaction> currentTransaction = new ThreadLocal<JdbcTransaction>();
public static void setCurrentTransaction(JdbcTransaction transaction) {
currentTransaction.set(transaction);
}
public static JdbcTransaction getCurrentTransaction() {
return currentTransaction.get();
}
public static void removeCurrentTransaction() {
currentTransaction.remove();
}
}
Implement a transaction manager for JDBC, for now with only the begin(), getTransaction(), commit() and rollback() methods:
public class JdbcTransactionManager implements TransactionManager {
private final static Logger logger = LoggerFactory.getLogger(JdbcTransactionManager.class);
@Inject
private DataSource dataSource;
@Override
public void begin() throws NotSupportedException, SystemException {
logger.debug("Start the transaction");
try {
JdbcTransaction tran = JdbcTransactionHolder.getCurrentTransaction();
Connection conn = null;
if(tran == null) {
conn = dataSource.getConnection();
}
else {
conn = tran.getConnection();
}
// We have to put the connection in the holder so that we can get later
// from the holder and use it in the same thread
logger.debug("Save the transaction for thread: {}.", Thread.currentThread());
JdbcTransactionHolder.setCurrentTransaction(new JdbcTransaction(conn));
} catch (Exception e) {
throw new RuntimeException(e);
}
}
@Override
public void commit() throws RollbackException, HeuristicMixedException,
HeuristicRollbackException, SecurityException,
IllegalStateException, SystemException {
logger.debug("Commit the transaction");
try {
logger.debug("Get the connection for thread: {}.", Thread.currentThread());
Transaction transaction = JdbcTransactionHolder.getCurrentTransaction();
transaction.commit();
}
catch(Exception e) {
throw new RuntimeException(e);
}
finally {
JdbcTransactionHolder.removeCurrentTransaction();
}
}
@Override
public Transaction getTransaction() throws SystemException {
logger.debug("Get transaction.");
final JdbcTransaction tran = JdbcTransactionHolder.getCurrentTransaction();
if(tran == null) {
throw new DBException("No transaction is available. TransactionManager.begin() has probably not been called yet.");
}
return tran;
}
@Override
public void rollback() throws IllegalStateException, SecurityException,
SystemException {
logger.debug("Rollback the transaction");
try {
logger.debug("Get the transaction for thread: {}.", Thread.currentThread());
Transaction transaction = JdbcTransactionHolder.getCurrentTransaction();
transaction.rollback();
}
catch(Exception e) {
throw new RuntimeException(e);
}
finally {
JdbcTransactionHolder.removeCurrentTransaction();
}
}
}
Implement a wrapper for DataSource that gets the current connection from the transaction holder if a transaction has been started:
public class JdbcDataSource implements DataSource {
private final static org.slf4j.Logger logger = LoggerFactory.getLogger(JdbcDataSource.class);
private DataSource dataSource;
public JdbcDataSource(DataSource dataSource) {
this.dataSource = dataSource;
}
@Override
public PrintWriter getLogWriter() throws SQLException {
return dataSource.getLogWriter();
}
@Override
public int getLoginTimeout() throws SQLException {
return dataSource.getLoginTimeout();
}
@Override
public Logger getParentLogger() throws SQLFeatureNotSupportedException {
return dataSource.getParentLogger();
}
@Override
public void setLogWriter(PrintWriter out) throws SQLException {
this.dataSource.setLogWriter(out);
}
@Override
public void setLoginTimeout(int seconds) throws SQLException {
this.dataSource.setLoginTimeout(seconds);
}
@Override
public boolean isWrapperFor(Class<?> arg0) throws SQLException {
return this.dataSource.isWrapperFor(arg0);
}
@Override
public <T> T unwrap(Class<T> iface) throws SQLException {
return this.dataSource.unwrap(iface);
}
@Override
public Connection getConnection() throws SQLException {
JdbcTransaction transaction = JdbcTransactionHolder.getCurrentTransaction();
if(transaction != null) {
// we get the connection from the transaction
logger.debug("Transaction exists for the thread: {}.", Thread.currentThread());
return transaction.getConnection();
}
Connection conn = this.dataSource.getConnection();
conn.setAutoCommit(false);
return conn;
}
@Override
public Connection getConnection(String username, String password)
throws SQLException {
JdbcTransaction transaction = JdbcTransactionHolder.getCurrentTransaction();
if(transaction != null) {
// we get the connection from the transaction
logger.debug("Transaction exists for the thread: {}.", Thread.currentThread());
return transaction.getConnection();
}
return this.dataSource.getConnection(username, password);
}
}
Then create a DataSourceProvider so that we can inject the DataSource into any POJO using Guice:
public class DataSourceProvider implements Provider<DataSource> {
private static final Logger logger = LoggerFactory.getLogger(DataSourceProvider.class);
private DataSource dataSource;
public DataSourceProvider() {
JdbcConfig config = getConfig();
ComboPooledDataSource pooledDataSource = new ComboPooledDataSource();
try {
pooledDataSource.setDriverClass(config.getDriver());
} catch (Exception e) {
throw new RuntimeException(e);
}
pooledDataSource.setJdbcUrl(config.getUrl());
pooledDataSource.setUser(config.getUsername());
pooledDataSource.setPassword(config.getPassword() );
pooledDataSource.setMinPoolSize(config.getMinPoolSize());
pooledDataSource.setAcquireIncrement(5);
pooledDataSource.setMaxPoolSize(config.getMaxPoolSize());
pooledDataSource.setMaxStatements(config.getMaxStatementSize());
pooledDataSource.setAutoCommitOnClose(false);
this.dataSource = new JdbcDataSource(pooledDataSource);
}
private JdbcConfig getConfig() {
JdbcConfig config = new JdbcConfig();
Properties prop = new Properties();
try {
//load a properties file from class path, inside static method
prop.load(JdbcConfig.class.getResourceAsStream("/database.properties"));
//get the property value and print it out
config.setDriver(prop.getProperty("driver"));
config.setUrl(prop.getProperty("url"));
config.setUsername(prop.getProperty("username"));
config.setPassword(prop.getProperty("password"));
String maxPoolSize = prop.getProperty("maxPoolSize");
if(maxPoolSize != null) {
config.setMaxPoolSize(Integer.parseInt(maxPoolSize));
}
String maxStatementSize = prop.getProperty("maxStatementSize");
if(maxStatementSize != null) {
config.setMaxStatementSize(Integer.parseInt(maxStatementSize));
}
String minPoolSize = prop.getProperty("minPoolSize");
if(minPoolSize != null) {
config.setMinPoolSize(Integer.parseInt(minPoolSize));
}
}
catch (Exception ex) {
logger.error("Failed to load the config file!", ex);
throw new DBException("Cannot read the config file: database.properties. Please make sure the file is present in classpath.", ex);
}
return config;
}
@Override
public DataSource get() {
return dataSource;
}
}
And then implement a TransactionalMethodInterceptor to manage the transaction for methods annotated with @Transactional:
public class TransactionalMethodInterceptor implements MethodInterceptor {
private final static Logger logger = LoggerFactory.getLogger(TransactionalMethodInterceptor.class);
@Inject
private JdbcTransactionManager transactionManager;
@Override
public Object invoke(MethodInvocation method) throws Throwable {
try {
// Start the transaction
transactionManager.begin();
logger.debug("Start to invoke the method: " + method);
Object result = method.proceed();
logger.debug("Finish invoking the method: " + method);
transactionManager.commit();
return result;
} catch (Exception e) {
logger.error("Failed to commit transaction!", e);
try {
transactionManager.rollback();
}
catch(Exception ex) {
logger.warn("Cannot roll back transaction!", ex);
}
throw e;
}
}
}
Finally, the module code that puts it all together so that Guice can inject the instances:
bind(DataSource.class).toProvider(DataSourceProvider.class).in(Scopes.SINGLETON);
bind(TransactionManager.class).to(JdbcTransactionManager.class);
TransactionalMethodInterceptor transactionalMethodInterceptor = new TransactionalMethodInterceptor();
requestInjection(transactionalMethodInterceptor);
bindInterceptor(Matchers.any(), Matchers.annotatedWith(Transactional.class), transactionalMethodInterceptor);
bind(TestDao.class).to(JdbcTestDao.class);
bind(TestService.class).to(TestServiceImpl.class);
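For illustration, a minimal sketch of how a DAO bound above might use this setup. The method name, table and SQL are made up; TestDao/JdbcTestDao come from the bindings, and @Transactional is whichever annotation the interceptor is bound to:

public class JdbcTestDao implements TestDao {

    @Inject
    private DataSource dataSource; // resolves to the JdbcDataSource wrapper from the provider above

    @Transactional // intercepted: begin() before the call, commit()/rollback() after
    public void insertTest(String name) throws SQLException {
        // Inside the transaction, getConnection() returns the thread-bound connection
        // held by JdbcTransactionHolder, so the later commit covers this statement.
        try (PreparedStatement ps = dataSource.getConnection()
                .prepareStatement("insert into test (name) values (?)")) {
            ps.setString(1, name);
            ps.executeUpdate();
        }
    }
}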
I use c3p0 for the DataSource pool, and it works just fine in my test.
I found another related question: Guice, JDBC and managing database connections
But so far I haven't found a similar approach anywhere, except something in the Spring Framework, and even the implementation in Spring seems quite complex.
I would like to ask if anyone has any suggestions for this solution.
Thanks.
