Oracle DB queries through JPA/Hibernate from java.util.Date (Spring)

I am currently working on migrating a Spring Boot application from MariaDB to Oracle DB. The Spring/Java backend uses Hibernate/JPA to generate queries for the MariaDB database, so in theory the migration should be fairly painless: change a dialect and you're done. In practice, it turns out that the Hibernate dialect for Oracle 12c makes some odd assumptions when binding Java types to database types. The backend still uses the old java.util.Date type for all of its dates, which Hibernate insists on mapping to either a Long (even more outdated, as far as I could find) or some BLOB type. BLOBs are great, of course, but it seems far more intuitive to map a Date to a DATE.
Because the column is defined as DATE, I get the following error whenever I try to access the row:
InvalidDataAccessResourceUsageException: could not extract ResultSet
ORA-00932: inconsistent datatypes: expected - got BLOB
I have tried using the JPA Converter feature to manually convert these Date objects to something Hibernate wouldn't mangle, but this resulted in Hibernate expecting a VARBINARY, as this article describes:
https://dzone.com/articles/leaky-abstractions-or-how-bind
@Converter(autoApply = false)
public class DateDATEAttributeConverter implements AttributeConverter<Date, DATE> {

    @Override
    public DATE convertToDatabaseColumn(Date date) {
        return new DATE(); // conversion to be done later
    }

    @Override
    public Date convertToEntityAttribute(DATE date) {
        return new Date(); // conversion to be done later
    }
}
Using this very minimal converter and stepping through the code with a debugger shows that everything seems to be properly attached to the PreparedStatement, but execution is then refused by Hibernate with an
org.hibernate.exception.SQLGrammarException: could not extract ResultSet
error.
Afterwards I decided to try a custom UserType, as in the article above and as described in further detail here:
https://www.baeldung.com/hibernate-custom-types
I currently convert the java.util.Date to an Oracle DATE through this custom type. The @Type annotation is used to make sure the relevant field is converted using this CustomType implementation. Sadly, after implementing all of this, the same error returns. Somewhere under the hood there is still a conversion/binding going on that I haven't managed to influence.
@Override
public Object nullSafeGet(
        ResultSet rs,
        String[] names,
        SessionImplementor session,
        Object owner)
        throws SQLException {
    LOGGER.debug("nullSafeGet: " + names[0]);
    return rs.getTimestamp(names[0]);
}
@Override
public void nullSafeSet(
        PreparedStatement st,
        Object value,
        int index,
        SessionImplementor session)
        throws SQLException {
    if (Objects.isNull(value)) {
        st.setNull(index, Types.DATE);
    } else {
        Date dateUtil = (Date) value;
        LocalDate localDate = dateUtil.toInstant().atZone(ZoneId.systemDefault()).toLocalDate();
        java.sql.Date dateSQL = java.sql.Date.valueOf(localDate);
        DATE date = new DATE(dateSQL);
        LOGGER.debug("nullSafeSet: " + date);
        st.setObject(index, date);
    }
}
Is there any established way around this? I have searched online for quite a while, but I didn't get much further than these two articles, or being told to stop using old types such as Date. Sadly, with a big old project and new deadlines, that is not a realistic option.
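As a sanity check outside of Hibernate, the java.util.Date → java.sql.Date conversion used in nullSafeSet can be exercised on its own (a minimal, self-contained sketch; the class and method names are mine):

```java
import java.time.LocalDate;
import java.time.ZoneId;

public class DateConversionDemo {
    // Convert java.util.Date to java.sql.Date, keeping only the calendar date
    // in the default time zone (the same steps nullSafeSet performs)
    static java.sql.Date toSqlDate(java.util.Date dateUtil) {
        LocalDate localDate = dateUtil.toInstant()
                .atZone(ZoneId.systemDefault())
                .toLocalDate();
        return java.sql.Date.valueOf(localDate);
    }

    public static void main(String[] args) {
        java.util.Date now = new java.util.Date();
        // The round trip preserves the calendar date exactly
        System.out.println(toSqlDate(now).toLocalDate().equals(
                now.toInstant().atZone(ZoneId.systemDefault()).toLocalDate())); // prints true
    }
}
```

If this prints true but the database still receives a BLOB, the problem is in Hibernate's type binding, not in the conversion itself.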

Related

Fetching One to Many associations with JooQ

I'm trying to deserialise a one-to-many association with jOOQ (without code generation), as per this post.
Here are my target classes.
public class Author {
    private Long id;
    private String name;
    private List<Book> books;
}

public class Book {
    private String name;
}
My JooQ query is as follows:
dslContext
    .select(table("authors").asterisk(),
            field(
                select(jsonArrayAgg(
                        jsonObject(
                            jsonEntry("name", field("books.name")))))
                    .from(table("books"))
                    .join(table("authors"))
                    .on(field("books.author_id").eq(field("authors.id")))
                    .where(field("emails.collection_case_id")
                        .eq(field("collection_cases.id")))
            ).as("books"))
    .from(table("authors"))
    .where(trueCondition())
    .fetchInto(Author.class);
The jsonObject() method does not work as expected for me. The generated SQL statement looks something like this:
select authors.*, (select json_agg(json_build_object(?, books.name)) from books join authors ...
The translated Postgres query has not properly replaced the key argument of json_build_object(), and this results in an SQL exception.
PS: I'm using jOOQ 3.14.0 with Postgres 11.5.
While I can't reproduce this issue on my side with various PostgreSQL server and JDBC driver versions, the simple workaround here is to use DSL.inline(String) to prevent jOOQ from generating a bind variable for the json_build_object() function argument:
jsonEntry(inline("name"), field("books.name"))

How to query more than one column, but not all columns, with @Query while still mapping to the domain model in Spring Data JDBC?

My Data model is
@Getter
@Setter
public class Customer {
    @Id private ID id;
    @CreatedDate protected Instant createdAt;
    @LastModifiedDate protected Instant updatedAt;
    @CreatedBy protected String createdBy;
    @LastModifiedBy protected String updatedBy;
    @Version protected Long version;
    private UUID orderId;
    private String offer;
}
My Repository is
public interface CustomerRepository extends CrudRepository<Customer, UUID> {
    @Query("SELECT ID, Offer FROM Customer WHERE orderId = :orderId")
    List<Customer> findCustomerByOrderId(@Param("orderId") UUID orderId);
}
This results in an exception saying 'orderId column not found [42122-190]'. So Spring expects you to always query all the columns. I understand that with JPA we have a strong mapping between the entities and the database schema, but the whole point of Spring Data JDBC is to avoid the tight coupling between the POJO data model and the database schema. Why doesn't the EntityRowMapper just map NULL to the properties that are not part of the query?
Is there a way to tell the RowMapper to ignore properties that are not part of the query? Creating a separate RowMapper for these simple queries seems like a lot of unnecessary work.
I can still work around this by changing the query to
@Query("SELECT ID, Offer, OrderId, null as CreatedAt, null as CreatedBy, null as UpdatedAt, null as UpdatedBy, null as Version FROM Customer WHERE orderId = :orderId")
But this still serializes the entire object, with null values. Am I missing something obvious here?
Note: This is not Spring Data JPA; it's Spring Data JDBC.
Edit
Looking more into it, the exception comes from the H2 database library.
Caused by: org.h2.jdbc.JdbcSQLException: Column "orderid" not found [42122-190]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:345)
at org.h2.message.DbException.get(DbException.java:179)
at org.h2.message.DbException.get(DbException.java:155)
at org.h2.jdbc.JdbcResultSet.getColumnIndex(JdbcResultSet.java:3129)
at org.h2.jdbc.JdbcResultSet.get(JdbcResultSet.java:3217)
at org.h2.jdbc.JdbcResultSet.getObject(JdbcResultSet.java:522)
at com.zaxxer.hikari.pool.HikariProxyResultSet.getObject(HikariProxyResultSet.java)
at org.springframework.data.jdbc.core.EntityRowMapper.readFrom(EntityRowMapper.java:127)
You can't, at least not right now.
There are three solutions to this, two of which you already pointed out:

1. Extend your select statement with , NULL as <column-name> for all the missing columns. I'm not sure if "But this will still serialize the entire object with null values." means that this isn't working for you in some way.

2. Specify a RowMapper.

3. Use a class containing exactly the fields returned by the query. It could even have getters for the other columns if you want an interface implemented by both your normal entity and the partial entity.
You write:
But the whole point of spring data JDBC is to avoid the tight coupling between pojo's data model and database schema.
This is not quite right.
An important goal of Spring Data JDBC is to have no run-time connection between entities and table rows; that would require proxies or similar and bring a lot of complexity.
But the structural mapping between entities and tables is probably going to be stronger (and certainly is right now), since all the mapping variants available in JPA bring complexity, and the main goal of Spring Data JDBC is to be conceptually simpler than JPA.
You also ask:
Why not the EntityRowMapper is just mapping NULL to the properties which are not part of the query?
I'm not sure I actively thought about it when I coded it, but I don't like the idea of defaulting to NULL, because it would make it easy to accidentally not load a column due to a typo in an alias.
But I'm not against alternative solutions; if you have an idea, please create a feature request.
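To make the alias-typo argument concrete, here is a toy sketch (not Spring's actual EntityRowMapper; a Map stands in for the ResultSet, and all names are made up): a strict lookup fails fast on a misspelled column, while a null-defaulting one hides the mistake.

```java
import java.util.HashMap;
import java.util.Map;

public class RowMappingDemo {
    // Strict: a typo in a column name or alias is detected immediately
    static Object strictGet(Map<String, Object> row, String column) {
        if (!row.containsKey(column)) {
            throw new IllegalArgumentException("Column \"" + column + "\" not found");
        }
        return row.get(column);
    }

    // Lenient: a typo silently produces a null property
    static Object lenientGet(Map<String, Object> row, String column) {
        return row.get(column);
    }

    public static void main(String[] args) {
        Map<String, Object> row = new HashMap<>();
        row.put("id", 42L);
        row.put("offer", "premium");

        System.out.println(lenientGet(row, "ofer")); // typo goes unnoticed, prints null
        try {
            strictGet(row, "ofer"); // typo is caught
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```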

Spring JdbcTemplate and NamedParameterJdbcTemplate

Is it advisable to use JdbcTemplate and NamedParameterJdbcTemplate together, with the idea that NamedParameterJdbcTemplate handles inserting/updating while JdbcTemplate takes care of retrieving and deleting? I ask because I can insert objects using NamedParameterJdbcTemplate as simply as shown below:
public long save(Domain obj) {
    String sql = "insert into domain(name,password,salt,dnspod_domain_id,status)" +
            " values(:name,:password,:salt,:dnspodDomainId,:status)";
    KeyHolder keyHolder = new GeneratedKeyHolder();
    namedJdbc.update(sql, new BeanPropertySqlParameterSource(obj), keyHolder);
    return keyHolder.getKey().longValue();
}
If I want to insert objects into a table using JdbcTemplate, I have to write a lot of code manually assigning parameters with a PreparedStatement...
When it comes to retrieving, I can do it with JdbcTemplate as shown below:
List<User> users = jdbcTemplate.query("SELECT * FROM user", BeanPropertyRowMapper.newInstance(User.class));
No need to use a ResultSet along with a RowMapper to retrieve rows.
My concern is whether there are any performance issues in using JdbcTemplate and NamedParameterJdbcTemplate together.
You can use both JdbcTemplate and NamedParameterJdbcTemplate, whichever fits the task. JdbcTemplate is slightly error-prone, since the order of the "?" placeholders in the query has to match the order of the parameters you pass, whether through an array or by setting them directly.
NamedParameterJdbcTemplate, by contrast, lets you assign names to parameters and map values to them by name, so the order in which you set the values doesn't matter.
As the NamedParameterJdbcTemplate API doc says:
This class delegates to a wrapped JdbcTemplate once the substitution from named parameters to JDBC style '?' placeholders is done at execution time.
So internally the API takes some additional time to convert named parameters to '?' placeholders, but this overhead is negligible.
My suggestion: if your query has many parameters, go with NamedParameterJdbcTemplate, since it is safer and less error-prone; otherwise go with JdbcTemplate.
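To make the delegation concrete, here is a deliberately simplified sketch of the substitution step (not Spring's actual implementation, which lives in NamedParameterUtils and also handles quoting, escapes, and collection expansion; all names here are made up):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class NamedParamDemo {
    // Replace ":name"-style parameters with '?' and record the order they appear in
    static String substitute(String sql, List<String> orderedNames) {
        Matcher m = Pattern.compile(":([A-Za-z][A-Za-z0-9]*)").matcher(sql);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            orderedNames.add(m.group(1));
            m.appendReplacement(out, "?");
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        List<String> names = new ArrayList<>();
        String sql = substitute(
                "insert into domain(name, status) values(:name, :status)", names);
        System.out.println(sql);   // insert into domain(name, status) values(?, ?)
        System.out.println(names); // [name, status]
    }
}
```

Once the placeholders and their order are known, the values can be bound positionally, which is exactly what the wrapped JdbcTemplate then does.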

JPA 2.0 and Oracle with TemporalType.TIME

I'm using Oracle 11g and JPA 2.0 (Hibernate in JBoss 6.0.0).
I need to represent a time range in an entity, so I defined these fields:
@Temporal(TemporalType.TIME)
private Date startTime;

@Temporal(TemporalType.TIME)
private Date endTime;
The generated table uses two DATE fields, and this is okay since Oracle doesn't have a type representing just the time part.
When loading the entity from the db, just the time part is loaded (the field contains a java.sql.Time).
I've seen, however, that if I set a complete date+time in the fields, the date part is persisted to the db.
Is there a way to ensure that the date part is never persisted?
You can write setter methods which remove the date component. A quick-and-dirty example:
public void setStartTime(Date startTime) {
    this.startTime = new Time(startTime.getTime() % 86400000L);
}
Though you'd be better off using Joda-Time for your date/time calculations (see this question). I didn't test this to make sure it's correct, but it should show the basic idea:
public void setStartTime(Date startTime) {
    this.startTime = new Time(LocalTime.fromDateFields(startTime).millisOfDay().get());
}
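One caveat worth noting: for dates before 1970 the epoch millis are negative, so the % trick above yields a negative remainder; Math.floorMod avoids that. A minimal, self-contained sketch (the class and method names are mine, and like the original it truncates relative to UTC):

```java
import java.sql.Time;
import java.util.Date;

public class TimeOnlyDemo {
    static final long MILLIS_PER_DAY = 24L * 60 * 60 * 1000;

    // Keep only the time-of-day component of a java.util.Date (UTC-based),
    // handling pre-1970 dates where getTime() is negative
    static Time timeOnly(Date d) {
        return new Time(Math.floorMod(d.getTime(), MILLIS_PER_DAY));
    }

    public static void main(String[] args) {
        // 1970-01-02T00:00:05Z -> only the 5 seconds past midnight survive
        System.out.println(timeOnly(new Date(MILLIS_PER_DAY + 5000)).getTime()); // prints 5000
        // pre-1970 date: floorMod keeps the remainder non-negative
        System.out.println(timeOnly(new Date(-1000)).getTime());                 // prints 86399000
    }
}
```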

Spring JdbcTemplate returns empty result when there should be a valid result

I'm using a SimpleJdbcDaoSupport object to access DB resources. I have a query which is frequently executed against the database to locate a record with a specific key. For some reason, after executing the same query several times, I start to get an empty result even though the record exists in the database.
Any ideas what can cause this behavior?
daoSupport.getJdbcTemplate().query(this.getConsumerTokenQueryStatement(), params, this.rowMapper);

public static class TokenServicesRowMapper implements RowMapper {
    public Object mapRow(ResultSet rs, int rowNum) throws SQLException {
        DefaultLobHandler lobHandler = new DefaultLobHandler();
        return lobHandler.getBlobAsBytes(rs, 1);
    }
}
If this is not related to your code, one possible cause is that another transaction is doing something (like an update) to the row you are searching for, and due to isolation between transactions you cannot see it. One transaction may have changed the row but not yet committed it, while at the same time the other one is searching for it; since it can only see committed rows, it does not see your row.