@Transactional annotation with Spring and getting the current connection - spring

I have a method which runs an UPDATE query and then a SELECT query. I annotated the method with @Transactional for the use case below.
For concurrent executions - if two users are updating the same table, I need the update and the select to run as a single unit.
Without @Transactional, I am using JdbcTemplate and trying to get the current connection, set auto-commit to false, and commit at the end of the method.
Issue 1:
The update is committed immediately after the statement is executed.
Issue 2:
With JdbcTemplate, I am unable to get the current connection used for the transaction.
I have tried the two approaches below to get the current connection, but each appears to return a new connection from the pool:
1. Connection conn = DataSourceUtils.getConnection(template.getDataSource());
2. Connection con = template.getDataSource().getConnection();
The application is deployed on WebLogic Server and uses Java configuration; I created beans for the JdbcTemplate, the DataSource and the transaction manager, and wired them with autowiring.
@Transactional
public Integer getNextId(String tablename) {
    Integer nextId = null;
    int swId = template.update("update " + tablename + " set swId = swId + 1");
    //int swId1 = template.update("update " + tablename + " set swId = swId + 1");
    if (swId == 1) {
        nextId = template.queryForObject("select swId from " + tablename, int.class);
    }
    return nextId;
}
#Scope("prototype")
#Bean
public DataSource dataSource() throws NamingException {
javax.naming.InitialContext ctx = new javax.naming.InitialContext();
DataSource dataSource = (javax.sql.DataSource) ctx.lookup(dsnName);
return dataSource;
}
#Scope("prototype")
#Bean
public DataSourceTransactionManager dataSourceTransactionManager(DataSource dataSource) {
DataSourceTransactionManager dataSourceTransactionManager = new DataSourceTransactionManager();
dataSourceTransactionManager.setDataSource(dataSource);
return dataSourceTransactionManager;
}
#Scope("prototype")
#Bean
public JdbcTemplate template(DataSource dataSource) {
JdbcTemplate template = new JdbcTemplate(dataSource);
return template;
}
Expected results:
The commit should happen only after all the statements in the method have executed.
The JdbcTemplate should use the active connection of the current transaction.
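For reference, a minimal sketch (not the original code, and assuming the DataSourceTransactionManager bean shown above is injected as txManager) of demarcating the same unit of work programmatically with Spring's TransactionTemplate (org.springframework.transaction.support). Inside the callback, JdbcTemplate obtains the transaction-bound connection via DataSourceUtils, so the update and the select share one connection and commit together at the end:
public Integer getNextId(String tablename) {
    // TransactionTemplate wraps the callback in a single transaction on the transaction manager.
    TransactionTemplate txTemplate = new TransactionTemplate(txManager);
    return txTemplate.execute(status -> {
        template.update("update " + tablename + " set swId = swId + 1");
        return template.queryForObject("select swId from " + tablename, Integer.class);
    });
}
Whether this or @Transactional is used, the commit happens only when the whole unit completes, which matches the expected result above.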

Related

MyBatis + Spring Boot: @Update is not working

I am trying to do an update using the @Update annotation. The query runs without any exceptions, but the method returns 0 every time (0 rows updated) and no update happens in the database. The same query works fine from the SQL Developer tool.
Using an Oracle DB.
@Update(
    "UPDATE extra.EMMT SET CASE_STATUS = #{updateBean.CASE_STATUS}, CASE_STATUS_TimeStmp = #{updateBean.CASE_STATUS_TimeStmp} WHERE T_TimeStmp >= #{updateBean.LAST_T_TimeStmp} AND T_TimeStmp <= #{updateBean.T_TimeStmp} AND T_NO = #{updateBean.T_NO} AND EM_NO = #{updateBean.EM_NO}"
)
public long update(@Param("updateBean") EMMT updateBean);
"EMMT updateBean" is class has same members as the columns the table EMMT.
and I also tried creating two different sessions one for update and other is for insert, but didn't help much.
Session config:
@Bean(name = "updatesession")
public SqlSessionFactory sqlSessionFactoryupdate() throws Exception {
    SqlSessionFactoryBean factoryBean = new SqlSessionFactoryBean();
    factoryBean.setDataSource(dataSource);
    SqlSessionFactory sqlSessionFactory = factoryBean.getObject();
    sqlSessionFactory.getConfiguration().setJdbcTypeForNull(JdbcType.NULL);
    sqlSessionFactory.getConfiguration().setDefaultStatementTimeout(15);
    sqlSessionFactory.getConfiguration().addMappers("com.xyz.myapp.mapper");
    return sqlSessionFactory;
}
Using mybatis-spring-boot-starter 2.2.0:
<dependency>
    <groupId>org.mybatis.spring.boot</groupId>
    <artifactId>mybatis-spring-boot-starter</artifactId>
    <version>2.2.0</version>
</dependency>
Any help would be appreciated.
Thanks.
EXAMPLE:
The configuration for the SqlSession is given above.
Bean class:
class justForUpdate {
    String CASE_STATUS;
    String EM_NO;
    Timestamp T_TimeStmp;
    Long T_NO;
    Timestamp CASE_STATUS_TimeStmp;
    Timestamp LAST_T_TimeStmp;
}
Service class:
class updateservice {
    @Autowired
    private SqlSessionFactory sessions;

    public void work() {
        // obj of justForUpdate = auth
        // or can pass a list of justForUpdate objs
        try (SqlSession session = sessions.openSession(true)) {
            Update_mapper upd = session.getMapper(Update_mapper.class);
            long val = upd.update(auth);
            System.out.print(">>>>>>>>> " + val);
        }
    }
}
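Note that sessions.openSession(true) opens the session with auto-commit enabled, so each mapper call commits on its own. That is fine for this single update, but if several statements should commit as one unit, a sketch of the explicit-commit variant (same names as the service class above) would look like this:
try (SqlSession session = sessions.openSession()) { // auto-commit off by default
    Update_mapper upd = session.getMapper(Update_mapper.class);
    long val = upd.update(auth);
    session.commit(); // commit only after all the work in the session has succeeded
}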
Update_mapper:
@Mapper
public interface Update_mapper {
    @Update(
        "UPDATE extra.EMMT SET CASE_STATUS = #{updateBean.CASE_STATUS}, CASE_STATUS_TimeStmp = #{updateBean.CASE_STATUS_TimeStmp} WHERE T_TimeStmp >= #{updateBean.LAST_T_TimeStmp} AND T_TimeStmp <= #{updateBean.T_TimeStmp} AND T_NO = #{updateBean.T_NO} AND EM_NO = #{updateBean.EM_NO}"
    )
    public long update(@Param("updateBean") EMMT updateBean);
}
For anyone else who runs into the same problem:
The quick fix was wrapping the column names in TRIM() in the query.
Some columns were defined as fixed-length 40-byte fields, so the stored values were padded with trailing spaces and the equality comparisons against the untrimmed parameters never matched.
@Update(
    "UPDATE extra.EMMT SET CASE_STATUS = #{updateBean.CASE_STATUS}, CASE_STATUS_TimeStmp = #{updateBean.CASE_STATUS_TimeStmp} WHERE T_TimeStmp >= #{updateBean.LAST_T_TimeStmp} AND T_TimeStmp <= #{updateBean.T_TimeStmp} AND TRIM(T_NO) = #{updateBean.T_NO} AND TRIM(EM_NO) = #{updateBean.EM_NO}")
public long update(@Param("updateBean") EMMT updateBean);

Is there a way to make a transactional action in a @Scheduled method?

My application is multi-tenant, and I need to run a transactional job every Sunday at 1:00 AM.
It basically needs to get all resources and create a Usage (an entity per week) based on the current date.
My transactional method:
/**
 * Scheduled to run at 1:00 every Sunday.
 * It should create capacities for all resources that don't have any for the next years.
 */
@Transactional
public void createNewCapacities() throws Exception {
    LocalDate now = LocalDate.now();
    System.out.println("Date is: " + now);
    System.out.println("Start of capacities creation...");
    //List<Resource> resources = resourceService.findAll();
    //for (Resource resource : resources) {
    //    Calendar calendar = calendarService.findCalendarById(resource.getSelectedCalendarId());
    ...
    //}
}
My scheduler:
@Service
public class UsageServiceScheduler {
    @Autowired
    private UsageService usageService;

    @Scheduled(cron = "0 0 1 * * SUN")
    public void callScheduledTask() throws Exception {
        usageService.createNewCapacities();
    }
}
This throws an exception:
org.springframework.transaction.CannotCreateTransactionException: Could not open JPA EntityManager for transaction; nested exception is java.lang.IllegalStateException: Cannot determine target DataSource for lookup key [null].
Is there a way to establish a connection to the database from within the @Scheduled method?
EDIT:
I have both @Transactional and @Scheduled enabled.
TENANT DATA SOURCE PROPERTIES:
@Component
@ConfigurationProperties(prefix = "tenants")
public class TenantDataSourceProperties {
    private Map<Object, Object> dataSources = new LinkedHashMap<>();

    public Map<Object, Object> getDataSources() {
        return dataSources;
    }

    public void setDataSources(Map<String, Map<String, String>> dataSources) {
        dataSources.forEach((key, value) -> this.dataSources.put(key, convert(value)));
    }

    public DataSource convert(Map<String, String> source) {
        return DataSourceBuilder.create()
                .url(source.get("jdbcUrl"))
                .driverClassName(source.get("driverClassName"))
                .username(source.get("username"))
                .password(source.get("password"))
                .build();
    }
}
I want my cron job to run for all existing tenants.
In other words, I want to get all the data sources and run the cron job against each database.
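A sketch only, under assumptions not stated in the post: the multi-tenant routing is an AbstractRoutingDataSource keyed by a thread-local tenant holder (called TenantContext here, a hypothetical class), and the TenantDataSourceProperties bean above is injected into the scheduler. The scheduled thread has no tenant bound, hence the "lookup key [null]" error; binding each known tenant before calling the transactional method avoids it:
@Scheduled(cron = "0 0 1 * * SUN")
public void callScheduledTask() throws Exception {
    for (Object tenantId : tenantDataSourceProperties.getDataSources().keySet()) {
        try {
            TenantContext.setCurrentTenant(tenantId.toString()); // hypothetical thread-local holder
            usageService.createNewCapacities();                  // @Transactional now resolves a real DataSource
        } finally {
            TenantContext.clear();                               // always unbind the tenant afterwards
        }
    }
}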

JdbcPollingChannelAdapter and IntegrationFlow: the update is not rolled back when an exception occurs in the integration flow

My use case: I have a Spring Boot application with a JdbcPollingChannelAdapter that fetches data from a PostgreSQL database, updates the fetched rows, and moves forward with a message flow (built with IntegrationFlowBuilder) that transforms the ResultSet and publishes the results to RabbitMQ.
The JdbcPollingChannelAdapter is configured to fetch data every 60 seconds with a select-for-update query followed by an update query that flags the status from NEW to PUBLISH:
The select query: select * from table where status= 'NEW' order by tms_creation limit 100 for update;
The update query: update table set cod_etat = 'PUBLISH', tms_modification = now() where id in (:id)
Also, no max rows per poll is set, which means the JDBC poller will execute the select as long as data with status NEW is present.
First issue: I stop RabbitMQ and leave my microservice running. The JdbcPollingChannelAdapter fetches the first ResultSet, passes it through the message flow, and performs the update. The message flow processes the ResultSet and sends it through a channel to RabbitMQ (using Spring Cloud Stream). The send fails, but no rollback occurs, which means the rows have already been flagged as published.
I have been looking through the documentation to figure out what I have missed, so any help would be appreciated.
Second issue: I run three instances of my application on PCF and need to handle concurrent access to the rows in the table. The transaction and the select-for-update query in the JdbcPollingChannelAdapter are supposed to take row-level locks for the current transaction (select for update). But what happens is that more than one instance can get the same rows, which the lock was supposed to prevent. This leads to multiple instances handling the same data and publishing it multiple times.
My code is as follows:
@EnableConfigurationProperties(ProprietesSourceJdbc.class)
@Component
public class KafkaGuy {

    private static final Logger LOG = LoggerFactory.getLogger(KafkaGuy.class);

    private ProprietesSourceJdbc proprietesSourceJdbc;
    private DataSource sourceDeDonnees;
    private DemandeSource demandeSource;
    private ObjectMapper objectMapper;
    private JdbcTemplate jdbcTemplate;

    public KafkaGuy(ProprietesSourceJdbc proprietesSourceJdbc, DemandeSource demandeSource, DataSource dataSource, JdbcTemplate jdbcTemplate, ObjectMapper objectMapper) {
        this.proprietesSourceJdbc = proprietesSourceJdbc;
        this.demandeSource = demandeSource;
        this.sourceDeDonnees = dataSource;
        this.objectMapper = objectMapper;
        this.jdbcTemplate = jdbcTemplate;
    }

    @Bean
    public MessageSource<Object> jdbcSourceMessage() {
        JdbcPollingChannelAdapter jdbcSource = new JdbcPollingChannelAdapter(this.sourceDeDonnees, this.proprietesSourceJdbc.getQuery());
        jdbcSource.setUpdateSql(this.proprietesSourceJdbc.getUpdate());
        return jdbcSource;
    }

    @Bean
    public IntegrationFlow fluxDeDonnees() {
        IntegrationFlowBuilder flowBuilder = IntegrationFlows.from(jdbcSourceMessage());
        flowBuilder
                .split()
                .log(LoggingHandler.Level.INFO, message ->
                        message.getHeaders().get("sequenceNumber")
                                + " événements publiés sur le bus de message sur "
                                + message.getHeaders().get("sequenceSize")
                                + " événements lus (lot)")
                .transform(Transformers.toJson())
                .enrichHeaders(h -> h.headerExpression("type", "payload.typ_evenement"))
                .publishSubscribeChannel(publishSubscribeSpec -> publishSubscribeSpec
                        .subscribe(flow -> flow
                                .transform(Transformers.toJson())
                                .transform(kafkaGuyTransformer())
                                .channel(this.demandeSource.demandePreinscriptionOuput()))
                );
        return flowBuilder.get();
    }

    @Bean
    public KafkaGuyTransformer kafkaGuyTransformer() {
        return new KafkaGuyTransformer();
    }

    @Bean(name = PollerMetadata.DEFAULT_POLLER)
    public PollerMetadata defaultPoller() {
        PollerMetadata pollerMetadata = new PollerMetadata();
        PeriodicTrigger trigger = new PeriodicTrigger(this.proprietesSourceJdbc.getTriggerDelay(), TimeUnit.SECONDS);
        pollerMetadata.setTrigger(trigger);
        pollerMetadata.setMaxMessagesPerPoll(proprietesSourceJdbc.getMaxRowsPerPoll());
        return pollerMetadata;
    }

    public class KafkaGuyTransformer implements GenericTransformer<Message, Message> {

        @Override
        public Message transform(Message message) {
            Message<String> msg = null;
            try {
                DemandeRecueDTO dto = objectMapper.readValue(message.getPayload().toString(), DemandeRecueDTO.class);
                msg = MessageBuilder.withPayload(dto.getTxtDonnee())
                        .copyHeaders(message.getHeaders())
                        .build();
            } catch (Exception e) {
                LOG.error(e.getMessage(), e);
            }
            return msg;
        }
    }
}
I am new to Spring Integration, sorry if this is not well explained. Any help is appreciated.
Everything looks good and should work as you have described. The only problem I see is that there is no transaction configured for IntegrationFlows.from(jdbcSourceMessage()).
Consider using PollerMetadata.setAdviceChain() with a TransactionInterceptor.
Another way is to use a PollerSpec with its transactional() option.
This way you won't be using local database transactions, which are committed as soon as the ResultSet processing returns. With a transaction at the application level there is no commit until the thread exits.
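A minimal sketch of the advice-chain approach mentioned above, reusing the defaultPoller() bean from the question; transactionManager is assumed to be a DataSourceTransactionManager for the same DataSource (not shown in the post), and TransactionInterceptor / MatchAlwaysTransactionAttributeSource come from org.springframework.transaction.interceptor:
@Bean(name = PollerMetadata.DEFAULT_POLLER)
public PollerMetadata defaultPoller(PlatformTransactionManager transactionManager) {
    PollerMetadata pollerMetadata = new PollerMetadata();
    pollerMetadata.setTrigger(new PeriodicTrigger(proprietesSourceJdbc.getTriggerDelay(), TimeUnit.SECONDS));
    pollerMetadata.setMaxMessagesPerPoll(proprietesSourceJdbc.getMaxRowsPerPoll());
    // Wrap each poll (the select ... for update, the update and the downstream flow)
    // in one transaction, so a failed publish rolls the status flag back.
    pollerMetadata.setAdviceChain(Collections.singletonList(
            new TransactionInterceptor(transactionManager, new MatchAlwaysTransactionAttributeSource())));
    return pollerMetadata;
}
Once the select and the update run in one transaction per poll, the for update row locks are also held until commit; on PostgreSQL 9.5+, adding skip locked to the select is a common way to keep several instances from re-reading each other's locked rows.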

How to connect to a postgresql database "on the fly" using Spring Boot

I know how to connect to a database in the usual way. What I need is to choose which database to connect to at runtime.
I have a default database (used by the system) and many other options the user can choose from to acquire data. This implies having no entities or JPA mappings of any kind.
In older times I was using this:
try (Connection connection = DriverManager.getConnection(connectionString, user, password);
     PreparedStatement preparedStatement = connection.prepareStatement(nativeQuery)) {
    preparedStatement.setString(1, coordinate);
    try (ResultSet resultSet = preparedStatement.executeQuery()) {
        while (resultSet.next())
            result = resultSet.getString(RESULT_PARAM);
    }
} catch (SQLException ex) {
    CodeUtils.log(QUERY_ERROR_MSG, this);
    CodeUtils.log(ex.getMessage(), this);
}
But I don't know how to port this to Spring Boot.
You can define the database configuration as below:
@Configuration
public class MyDBConfig {

    @Bean("db1Ds")
    @Primary
    @ConfigurationProperties("app.ds.db1")
    public DataSource db1DataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean("db2Ds")
    @ConfigurationProperties("app.ds.db2")
    public DataSource db2DataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean("db1JdbcTemplate")
    @Autowired
    public JdbcTemplate db1JdbcTemplate(@Qualifier("db1Ds") DataSource ds) {
        return new JdbcTemplate(ds);
    }

    @Bean("db2JdbcTemplate")
    @Autowired
    public JdbcTemplate db2JdbcTemplate(@Qualifier("db2Ds") DataSource ds) {
        return new JdbcTemplate(ds);
    }
}
Switch the JdbcTemplate based on what the user selects.
Official doc:
https://docs.spring.io/spring-boot/docs/current/reference/html/howto-data-access.html#howto-two-datasources
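A small usage sketch of that switch (the bean names come from the configuration above; the service class, query, table and column names are hypothetical): pick the template by a user-supplied key and run the same kind of native query the question ran with DriverManager:
@Service
public class CoordinateLookup {

    private final Map<String, JdbcTemplate> templates;

    public CoordinateLookup(@Qualifier("db1JdbcTemplate") JdbcTemplate db1,
                            @Qualifier("db2JdbcTemplate") JdbcTemplate db2) {
        // Register each configured template under the key the user will select.
        this.templates = Map.of("db1", db1, "db2", db2);
    }

    public String lookup(String dbKey, String coordinate) {
        JdbcTemplate template = templates.get(dbKey);                          // choose the database at runtime
        return template.queryForObject(
                "select result_column from some_table where coordinate = ?",   // hypothetical query
                String.class, coordinate);
    }
}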

Spring - jdbcTemplate

I'm just beginning with the Spring framework. I'm also using DBCP pooling and I'm still not sure how to work correctly with JdbcTemplate.
Is it best practice to reuse a created/injected JdbcTemplate instance across multiple DAOs, or is it right to create a JdbcTemplate for each DAO?
I'm currently using the annotation approach:
public class FooDAO {

    private JdbcTemplate jdbcTemplate;

    @Autowired
    public void setDatasource(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }
}
I'm aware of JdbcDaoSupport, but I don't know how to inject the DataSource, because the setDataSource method is marked as final.
But still, I'm not sure whether it is best practice to reuse the created JdbcTemplate or not.
Inject it in and share it. Don't call "new"; that takes control out of the hands of the Spring bean factory.
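A minimal sketch of that shared-template setup (bean and class names are illustrative, not from the original post): one JdbcTemplate bean, injected into every DAO, instead of each DAO constructing its own:
@Configuration
public class JdbcConfig {
    @Bean
    public JdbcTemplate jdbcTemplate(DataSource dataSource) {
        return new JdbcTemplate(dataSource); // thread-safe once configured, safe to share
    }
}

@Repository
public class FooDAO {
    private final JdbcTemplate jdbcTemplate;

    public FooDAO(JdbcTemplate jdbcTemplate) { // constructor injection of the shared instance
        this.jdbcTemplate = jdbcTemplate;
    }
}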
"I'm aware of JdbcDaoSupport, but I don't know how to inject the DataSource, because the setDataSource method is marked as final."
public class JdbcDaoSupportTest extends JdbcDaoSupport {

    public void insert() {
        this.getJdbcTemplate().execute("insert into tb_test1 values(1,'ycl','123')");
        System.out.println("complete...");
    }
}
Spring calls the setter method; it doesn't care whether the method is final or not.
<bean id="jdbcDaoSupportTest" class="com.xxxxx.JdbcDaoSupportTest">
    <property name="dataSource" ref="dataSource" />
</bean>
Then, in your JdbcDaoSupportTest, you can call this.getJdbcTemplate() to get the JdbcTemplate and perform any operation.
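Since the question uses the annotation approach rather than XML, a hedged sketch of the same idea with annotations (class and table names are illustrative): the final setDataSource method can simply be called from an autowired constructor:
@Repository
public class FooDao extends JdbcDaoSupport {

    @Autowired
    public FooDao(DataSource dataSource) {
        setDataSource(dataSource); // the inherited final setter is callable, just not overridable
    }

    public int countFoo() {
        // getJdbcTemplate() returns the template that JdbcDaoSupport created from the DataSource
        return getJdbcTemplate().queryForObject("select count(*) from foo", Integer.class);
    }
}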
try {
    JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
    String sql = "select user.id as id,user.roll_no as createdId,user.name as name,user.type as company,role.role as year "
            + "from user_role join user on user.id=user_role.user_id "
            + "join role on role.id=user_role.role_id "
            + "where (user.status='ACTIVE' or user.status='ADMIN') AND user.username='" + userName + "'";
    UserVo userDetails = jdbcTemplate.queryForObject(sql, new BeanPropertyRowMapper<UserVo>(UserVo.class));
    // or
    Long company = jdbcTemplate.queryForObject(sql, Long.class);
    // or
    List<UserVo> users = jdbcTemplate.query(sql, new BeanPropertyRowMapper<UserVo>(UserVo.class));
    logger.info("Retrieve user details by username");
    return userDetails;
} catch (Exception e) {
    logger.error("error in getting UserDetails using UserName", e);
}
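As a side note on the snippet above, a sketch of the same lookup with a bind parameter, so userName is not concatenated into the SQL text:
String sql = "select user.id as id, user.roll_no as createdId, user.name as name, "
        + "user.type as company, role.role as year "
        + "from user_role join user on user.id = user_role.user_id "
        + "join role on role.id = user_role.role_id "
        + "where (user.status = 'ACTIVE' or user.status = 'ADMIN') and user.username = ?";
// The username is passed as a bound argument, not spliced into the query string.
UserVo userDetails = jdbcTemplate.queryForObject(sql, new BeanPropertyRowMapper<>(UserVo.class), userName);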
