MyBatis + Spring Boot: @Update is not working - Oracle

I am trying to do an update using the @Update annotation. The query runs without any exceptions, but the method returns 0 every time (0 rows updated) and no update happens in the database. The same query works fine from the SQL Developer tool.
I am using an Oracle DB.
@Update(
    "UPDATE extra.EMMT SET CASE_STATUS = #{updateBean.CASE_STATUS}, CASE_STATUS_TimeStmp = #{updateBean.CASE_STATUS_TimeStmp} WHERE T_TimeStmp >= #{updateBean.LAST_T_TimeStmp} AND T_TimeStmp <= #{updateBean.T_TimeStmp} AND T_NO = #{updateBean.T_NO} AND EM_NO = #{updateBean.EM_NO}"
)
public long update(@Param("updateBean") EMMT updateBean);
"EMMT updateBean" is class has same members as the columns the table EMMT.
and I also tried creating two different sessions one for update and other is for insert, but didn't help much.
Session config.
#Bean(name = "updatesession")
public SqlSessionFactory sqlSessionFactoryupdate() throws Exception {
SqlSessionFactoryBean factoryBean = new SqlSessionFactoryBean();
factoryBean.setDataSource(dataSource);
SqlSessionFactory sqlSessionFactory = factoryBean.getObject();
sqlSessionFactory.getConfiguration().setJdbcTypeForNull(JdbcType.NULL);
sqlSessionFactory.getConfiguration().setDefaultStatementTimeout(15);
sqlSessionFactory.getConfiguration().addMappers("com.xyz.myapp.mapper");
return sqlSessionFactory;
}
Using mybatis-spring-boot-starter 2.2.0:
<dependency>
    <groupId>org.mybatis.spring.boot</groupId>
    <artifactId>mybatis-spring-boot-starter</artifactId>
    <version>2.2.0</version>
</dependency>
Any help would be appreciated.
Thanks.
EXAMPLE -
The configuration for the SqlSession is given above.
Bean class:
class justForUpdate {
    String CASE_STATUS;
    String EM_NO;
    Timestamp T_TimeStmp;
    Long T_NO;
    Timestamp CASE_STATUS_TimeStmp;
    Timestamp LAST_T_TimeStmp;
}
Service class:
class updateservice {
    @Autowired
    private SqlSessionFactory sessions;

    public void work() {
        // obj of justForUpdate = auth
        // (or a list of justForUpdate objects can be passed)
        try (SqlSession session = sessions.openSession(true)) {
            Update_mapper upd = session.getMapper(Update_mapper.class);
            long val = upd.update(auth);
            System.out.print(">>>>>>>>> " + val);
        }
    }
}
Update_mapper:
@Mapper
public interface Update_mapper {
    @Update(
        "UPDATE extra.EMMT SET CASE_STATUS = #{updateBean.CASE_STATUS}, CASE_STATUS_TimeStmp = #{updateBean.CASE_STATUS_TimeStmp} WHERE T_TimeStmp >= #{updateBean.LAST_T_TimeStmp} AND T_TimeStmp <= #{updateBean.T_TimeStmp} AND T_NO = #{updateBean.T_NO} AND EM_NO = #{updateBean.EM_NO}"
    )
    public long update(@Param("updateBean") EMMT updateBean);
}

For anyone else who might bump into this same problem:
The quick fix was using TRIM() around the column names in the query.
Some columns were defined as fixed-width 40-byte columns and therefore contained padding spaces.
@Update(
    "UPDATE extra.EMMT SET CASE_STATUS = #{updateBean.CASE_STATUS}, CASE_STATUS_TimeStmp = #{updateBean.CASE_STATUS_TimeStmp} WHERE T_TimeStmp >= #{updateBean.LAST_T_TimeStmp} AND T_TimeStmp <= #{updateBean.T_TimeStmp} AND TRIM(T_NO) = #{updateBean.T_NO} AND TRIM(EM_NO) = #{updateBean.EM_NO}")
public long update(@Param("updateBean") EMMT updateBean);
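A possible alternative (my own suggestion, not part of the original answer): wrapping a column in TRIM() can prevent Oracle from using an index on that column. If the columns are fixed-width CHARs, padding the bind value in Java before calling the mapper keeps the comparison index-friendly. A minimal sketch, assuming a CHAR(40) column and a hypothetical helper in the service:

    // Hypothetical helper: right-pad bind values to the assumed CHAR column width (40)
    // so they compare equal to the blank-padded values Oracle stores in CHAR columns.
    private static String padToCharWidth(String value, int width) {
        StringBuilder sb = new StringBuilder(value);
        while (sb.length() < width) {
            sb.append(' ');
        }
        return sb.toString();
    }

    // Usage before calling the mapper (auth is the bean from the example above):
    // auth.EM_NO = padToCharWidth(auth.EM_NO, 40);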

Related

How do I get Spring to recognize the first capital letter using PostgreSQL?

I have a question: how can I make Spring recognize my table, whose name starts with a capital letter? The table is created in PostgreSQL.
@Entity
@Table(name="System_user")
public class Dashboard {
    @Id
    @Column(name = "username")
    String username;
    // get..
    // set...
}
It generates an error, so I changed it as shown below:
@Table("\"System_user\"")
but it still doesn't recognize it.
As we discussed in the comments, the issue was in the dialect.
The latest dialect for now is org.hibernate.dialect.PostgreSQL95Dialect.
The issue was solved with this dialect:
<prop key="hibernate.dialect">org.hibernate.dialect.PostgreSQLDialect</prop>
Answer to your HQL code:
I would suggest using TypedQuery, as it returns not just a raw list but a list of your objects.
Furthermore, you are using the same session to transfer data. This is bad for the program. Always do: open session - transfer data - close session.
Try this code:
public List<Vw_dh_assay> listAssayReport(double year, String project_id, String hole_id) {
    List<Vw_dh_assay> list = new ArrayList<>();
    Session session;
    try {
        session = this.sessionFactory.openSession();
        TypedQuery<Vw_dh_assay> query = session.createQuery(
                "from Vw_dh_assay where year = :year AND project_id = :project_id AND hole_id = :hole_id",
                Vw_dh_assay.class);
        query.setParameter("year", year);
        query.setParameter("project_id", project_id);
        query.setParameter("hole_id", hole_id);
        list = query.getResultList();
        System.out.println(list);
        session.clear();
        session.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return list;
}
@Antonio112009 this is my DAO code:
@Override
public List<Vw_dh_assay> listAssayReport(double year, String project_id, String hole_id) {
    Session session = this.sessionFactory.getCurrentSession();
    String query = ("SELECT DISTINCT year,project_id,hole_id from "
            + "Vw_dh_assay where year = :year AND project_id = :project_id AND hole_id = :hole_id");
    List result = session.createQuery(query).setParameter("year", year).setParameter("project_id", project_id)
            .setParameter("hole_id", hole_id).list();
    System.out.println(result);
    return result;
}

@Transactional annotation with Spring and getting the current connection

I have a method which has an UPDATE query and a SELECT query. I annotated the method with @Transactional for the use case below.
For concurrent executions - if two users are updating the same table, I need the update and select queries to run as a unit.
When not using @Transactional, I am using JdbcTemplate and trying to get the current connection, set auto-commit to false, and commit at the end of the method.
Issue 1:
The update is getting committed immediately after the statement is executed.
Issue 2:
With JdbcTemplate, I am unable to get the current connection used for the transaction.
I have tried the two ways below to get the current connection, but it seems to be a new connection from the pool:
1. Connection conn = DataSourceUtils.getConnection(template.getDataSource());
2. Connection con = template.getDataSource().getConnection();
The application is deployed on WebLogic Server using Java configuration; I created beans for the JdbcTemplate, the DataSource, and transaction management, and used autowiring.
@Transactional
public Integer getNextId(String tablename) {
    Integer nextId = null;
    int swId = template.update("update " + tablename + " set swId = swId + 1");
    //int swId1 = template.update("update " + tablename + " set swId = swId + 1");
    if (swId == 1) {
        nextId = template.queryForObject("select swId from " + tablename, int.class);
    }
    return nextId;
}
#Scope("prototype")
#Bean
public DataSource dataSource() throws NamingException {
javax.naming.InitialContext ctx = new javax.naming.InitialContext();
DataSource dataSource = (javax.sql.DataSource) ctx.lookup(dsnName);
return dataSource;
}
#Scope("prototype")
#Bean
public DataSourceTransactionManager dataSourceTransactionManager(DataSource dataSource) {
DataSourceTransactionManager dataSourceTransactionManager = new DataSourceTransactionManager();
dataSourceTransactionManager.setDataSource(dataSource);
return dataSourceTransactionManager;
}
#Scope("prototype")
#Bean
public JdbcTemplate template(DataSource dataSource) {
JdbcTemplate template = new JdbcTemplate(dataSource);
return template;
}
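As a side note (my assumption, not stated in the question): proxy-based @Transactional only takes effect when transaction management is enabled and the annotated method is called through the Spring proxy rather than via this. A minimal configuration sketch, with hypothetical class and JNDI names:

    // Hypothetical configuration class; names are illustrative only.
    @Configuration
    @EnableTransactionManagement
    public class AppConfig {

        @Bean
        public DataSource dataSource() throws NamingException {
            return (DataSource) new javax.naming.InitialContext().lookup("jdbc/myDs"); // assumed JNDI name
        }

        @Bean
        public DataSourceTransactionManager transactionManager(DataSource dataSource) {
            return new DataSourceTransactionManager(dataSource);
        }

        @Bean
        public JdbcTemplate template(DataSource dataSource) {
            return new JdbcTemplate(dataSource);
        }
    }

With this in place, a JdbcTemplate built on the same DataSource as the transaction manager uses the transaction-bound connection automatically, so the connection does not need to be fetched by hand.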
Expected results:
The commit should happen after all the statements in the method have executed.
The JdbcTemplate needs to get the active connection used for the current transaction.

IllegalStateException: Cannot convert value of type 'java.sql.Timestamp' to required type 'java.time.LocalDateTime' for property

I'm working on a Spring Boot/JPA/MySQL project. So far everything has worked with DateTime objects when fetching/storing objects with the repository.
The problem now occurs when I use the JdbcTemplate to execute a custom SQL query.
org.springframework.beans.ConversionNotSupportedException: Failed to convert property
value of type 'java.sql.Timestamp' to required type 'java.time.LocalDateTime' for
property 'from_time': no matching editors or conversion strategy found
The idea is to fetch time slots (each has a start time and a duration in minutes) that overlap with a new incoming entry.
To get my objects back I was first using a BeanPropertyRowMapper and then switched to a custom NestedRowMapper.
The resulting conflicting time slots I want to get look like this:
{
    id: 1
    comment: "i worked 60minutes"
    from_time: "2018-06-16 13:00"
    duration_minutes: 60
    task: {
        name: "My task"
        ...
    }
}
This is the method where I run into the issue:
public List<TimeSlot> getOverlappingEntries(TimeSlot timeslot) throws SQLException {
    String sql = "SELECT time_slot.comment, time_slot.from_time,"
            + "DATE_ADD(from_time, INTERVAL duration_minutes MINUTE) AS end_time, "
            + " task.name as `task.name`, task.category as `task.category` "
            + " FROM `time_slot` " + " INNER JOIN task on task.id = time_slot.task_id "
            + " WHERE person_id = ? "
            + " HAVING ? < end_time AND DATE_ADD(? ,INTERVAL ? MINUTE) > from_time;";
    PreparedStatementCreator prepared = (con) -> {
        PreparedStatement prep = con.prepareStatement(sql);
        prep.setObject(1, timeslot.person.id);
        prep.setObject(2, timeslot.from_time);
        prep.setObject(3, timeslot.from_time);
        prep.setObject(4, timeslot.durationMinutes);
        logger.info(prep.toString());
        return prep;
    };
    return this.connector.query(prepared, NestedRowMapper.get(TimeSlot.class));
}
Now I would imagine Spring is capable of converting those objects easily. And anyway there is the simple timestamp.toLocalDateTime() to do so. The problem seems to be more about how to register this as a converter service, or how to fix the Spring Boot configuration to do so.
I already tried a custom converter service, but that didn't help:
@javax.persistence.Converter
public class SqlTimestampToLocalDateTimeConverter implements Converter<Timestamp,
        LocalDateTime>, AttributeConverter<Timestamp, LocalDateTime> {

    @Convert
    @Override
    public LocalDateTime convert(Timestamp source) {
        return source.toLocalDateTime();
    }

    @Override
    public LocalDateTime convertToDatabaseColumn(Timestamp attribute) {
        return attribute.toLocalDateTime();
    }

    @Override
    public Timestamp convertToEntityAttribute(LocalDateTime dbData) {
        return Timestamp.valueOf(dbData);
    }
}
Also, many other answers on the internet mention that this was already implemented in Spring Framework 4.x.
The dependencies in the project look like this (build.gradle):
dependencies {
    compile "org.springframework.boot:spring-boot-starter-thymeleaf:2.0.2.RELEASE"
    compile "org.springframework.boot:spring-boot-starter-web:2.0.2.RELEASE"
    compile "org.springframework.boot:spring-boot-starter-security:2.0.2.RELEASE"
    compile "org.springframework.boot:spring-boot-starter-data-jpa:2.0.2.RELEASE"
    compile "mysql:mysql-connector-java:5.1.46"
    compileOnly "org.springframework.boot:spring-boot-devtools:2.0.2.RELEASE"
    compile 'org.springframework.data:spring-data-rest-webmvc:3.0.7.RELEASE'
    compile 'com.querydsl:querydsl-jpa:4.1.4'
    compile 'com.querydsl:querydsl-apt:4.1.4:jpa'
    testCompile("junit:junit")
    testCompile("org.springframework.boot:spring-boot-starter-test")
    testCompile("org.springframework.security:spring-security-test")
}
Thank you for any hints on how to solve this!
Edit:
I think I see a possible workaround now. I could just fetch the ids of all time slots and then use the repository to fetch the actual objects with their data (including their task objects).
But that definitely doesn't feel like the optimal solution...
This is the NestedRowMapper I use:
import org.springframework.beans.*;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.jdbc.support.JdbcUtils;

import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;

public class NestedRowMapper<T> implements RowMapper<T> {

    private Class<T> mappedClass;

    public static <T> NestedRowMapper<T> get(Class<T> mappedClass) {
        return new NestedRowMapper<>(mappedClass);
    }

    public NestedRowMapper(Class<T> mappedClass) {
        this.mappedClass = mappedClass;
    }

    @Override
    public T mapRow(ResultSet rs, int rowNum) throws SQLException {
        try {
            T mappedObject = this.mappedClass.newInstance();
            BeanWrapper bw = PropertyAccessorFactory.forBeanPropertyAccess(mappedObject);
            bw.setAutoGrowNestedPaths(true);

            ResultSetMetaData meta_data = rs.getMetaData();
            int columnCount = meta_data.getColumnCount();
            for (int index = 1; index <= columnCount; index++) {
                try {
                    String column = JdbcUtils.lookupColumnName(meta_data, index);
                    Object value = JdbcUtils.getResultSetValue(rs, index,
                            Class.forName(meta_data.getColumnClassName(index)));
                    bw.setPropertyValue(column, value);
                } catch (TypeMismatchException | NotWritablePropertyException | ClassNotFoundException e) {
                    e.printStackTrace();
                }
            }
            return mappedObject;
        } catch (InstantiationException | IllegalAccessException e1) {
            throw new RuntimeException(e1);
        }
    }
}
You're on the right lines in that you can define a RowMapper that tells your app what type of object each column needs to be mapped to. I would recommend trying the JdbcTemplate.query method: https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/jdbc/core/JdbcTemplate.html#query-java.lang.String-java.lang.Object:A-org.springframework.jdbc.core.RowMapper-
You will need to define a RowMapper (not necessarily a NestedRowMapper; you could try ParameterizedRowMapper), then pass it into query along with your SQL and the WHERE conditions mapped as args.
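A minimal sketch of that approach (my own illustration; the field and column names are assumptions based on the question, and `jdbcTemplate` stands for the question's `connector`), mapping each row by hand so the Timestamp can be converted explicitly:

    // Hypothetical hand-written RowMapper for the query above; column names are assumed.
    RowMapper<TimeSlot> rowMapper = (rs, rowNum) -> {
        TimeSlot slot = new TimeSlot();
        slot.comment = rs.getString("comment");
        Timestamp ts = rs.getTimestamp("from_time");
        slot.from_time = (ts != null) ? ts.toLocalDateTime() : null; // explicit Timestamp -> LocalDateTime
        return slot;
    };

    List<TimeSlot> slots = jdbcTemplate.query(sql, rowMapper,
            timeslot.person.id, timeslot.from_time, timeslot.from_time, timeslot.durationMinutes);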
I think the best way is to use BeanPropertyRowMapper.newInstance(TimeSlot.class) in your getOverlappingEntries method.
Try this in NestedRowMapper.mapRow:
if (value instanceof Timestamp) value = ((Timestamp) value).toLocalDateTime();
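In context, that line would sit in the column loop of the NestedRowMapper shown above, just before the value is written to the bean (a sketch of the relevant part only):

    Object value = JdbcUtils.getResultSetValue(rs, index,
            Class.forName(meta_data.getColumnClassName(index)));
    // Convert JDBC timestamps to LocalDateTime before the BeanWrapper sets the property
    if (value instanceof Timestamp) {
        value = ((Timestamp) value).toLocalDateTime();
    }
    bw.setPropertyValue(column, value);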

Cannot use Jedis when in Pipeline. Please use Pipeline or reset jedis state

I have trouble executing pipelined commands in Spring Data Redis. I am using StringRedisTemplate, spring-data-redis 1.6.1, Spring Boot 1.3.2, and Jedis (both 2.7.3 and 2.8.0).
The code:
public void saveUserActivityEvents(Event... events) {
    List<Object> results = stringRedisTemplate.executePipelined(
        new RedisCallback<Object>() {
            public Object doInRedis(RedisConnection connection) throws DataAccessException {
                StringRedisConnection stringRedisConn = (StringRedisConnection) connection;
                for (int i = 0; i < events.length; i++) {
                    Event event = events[i];
                    String userId = getUserId(event.getUser());
                    String eventType = event.getEventType();
                    String itemId = event.getItem();
                    Integer amount = event.getAmount() == null ? 0 : Integer.parseInt(event.getAmount());
                    Double timestamp = Double.valueOf(event.getTimestamp());

                    Map<String, String> valueMap = new HashMap<String, String>();
                    valueMap.put("itemId", itemId);
                    valueMap.put("userId", userId);
                    String userItemEventsKey = StrSubstitutor.replace(Constants.KEY_USER_ITEM_EVENTS, valueMap);
                    valueMap.put("userId", userId);
                    String userItemsKey = StrSubstitutor.replace(Constants.KEY_USER_ITEMS, valueMap);

                    stringRedisConn.zAdd(userItemsKey, timestamp, itemId);
                    stringRedisConn.hIncrBy(userItemEventsKey, eventType, amount);
                    long expireInMs = TimeoutUtils.toMillis(getExpiryTimeInDays(event.getUser()), TimeUnit.DAYS);
                    stringRedisConn.pExpire(userItemEventsKey, expireInMs);
                }
                return null;
            }
        });
}
It blows up with the exception from the subject when executing pExpire.
I've also tried the different flavour suggested in the reference guide:
execute(redisCallback, true, true)
The same result. Any idea?
Thanks

Spring, autowire #Value from a database

I am using a properties file to store some configuration properties, which are accessed this way:
@Value("#{configuration.path_file}")
private String pathFile;
Is it possible (with Spring 3) to use the same @Value annotation, but load the properties from a database instead of a file?
Assuming you have a table in your database storing key/value pairs:
Define a new bean "applicationProperties" - pseudo-code follows...
public class ApplicationProperties {

    @Autowired
    private DataSource datasource;

    public String getPropertyValue(String key) {
        // transact on your datasource here to fetch the value for the key
        // SNIPPED
    }
}
Inject this bean wherever it is required in your application. If you already have a DAO/service layer, then you would just make use of that.
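A small usage sketch (my own illustration; the consuming class and the property key are assumptions):

    // Hypothetical consumer of the ApplicationProperties bean defined above.
    @Service
    public class ReportService {

        @Autowired
        private ApplicationProperties applicationProperties;

        public void run() {
            // "report.path_file" is an assumed key stored in the key/value table
            String pathFile = applicationProperties.getPropertyValue("report.path_file");
            // ... use pathFile ...
        }
    }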
Yes, you can keep your @Value annotation and use the database as a source with the help of an EnvironmentPostProcessor.
As of Spring Boot 1.3, we're able to use the EnvironmentPostProcessor to customize the application's Environment before the application context is refreshed.
For example, create a class which implements EnvironmentPostProcessor:
public class ReadDbPropertiesPostProcessor implements EnvironmentPostProcessor {

    private static final String PROPERTY_SOURCE_NAME = "databaseProperties";

    private String[] CONFIGS = {
        "app.version"
        // list your properties here
    };

    @Override
    public void postProcessEnvironment(ConfigurableEnvironment environment, SpringApplication application) {
        Map<String, Object> propertySource = new HashMap<>();
        try {
            // the following db connection properties must be defined in application.properties
            DataSource ds = DataSourceBuilder
                    .create()
                    .username(environment.getProperty("spring.datasource.username"))
                    .password(environment.getProperty("spring.datasource.password"))
                    .url(environment.getProperty("spring.datasource.url"))
                    .driverClassName("com.mysql.jdbc.Driver")
                    .build();
            try (Connection connection = ds.getConnection();
                 // suppose you have a config table storing the property name/value pairs
                 PreparedStatement preparedStatement = connection.prepareStatement("SELECT value FROM config WHERE name = ?")) {
                for (int i = 0; i < CONFIGS.length; i++) {
                    String configName = CONFIGS[i];
                    preparedStatement.setString(1, configName);
                    ResultSet rs = preparedStatement.executeQuery();
                    while (rs.next()) {
                        propertySource.put(configName, rs.getString("value"));
                    }
                    // rs.close();
                    preparedStatement.clearParameters();
                }
            }
            environment.getPropertySources().addFirst(new MapPropertySource(PROPERTY_SOURCE_NAME, propertySource));
        } catch (Throwable e) {
            throw new RuntimeException(e);
        }
    }
}
Finally, don't forget to register the post-processor in spring.factories under META-INF. For an EnvironmentPostProcessor the key is org.springframework.boot.env.EnvironmentPostProcessor, for example (package name assumed):
org.springframework.boot.env.EnvironmentPostProcessor=\
com.example.config.ReadDbPropertiesPostProcessor
Although I have not used Spring 3, I'd assume you can, if you make a bean that reads the properties from the database and exposes them with getters.
