How to ignore some columns when inserting data - android-room

When you use Android Room's auto-generated insert method, it passes values for all columns. For example, if you have an entity like below:
@Entity
public class Task {
    @PrimaryKey(autoGenerate = true)
    public long id;

    @NonNull
    @ColumnInfo(defaultValue = "Unknown Title")
    public String name;

    @NonNull
    @ColumnInfo(defaultValue = "Does not have any description!")
    public String desc;

    @ColumnInfo(defaultValue = "false")
    public boolean isDone;
}
And you try to insert an empty instance of the Task class with:
taskDao.insert(new Task());
It will run a query like this:
INSERT INTO Task (id, name, desc, isDone) values (null, null, null, 0);
which violates our table constraints, so we get this error:
Caused by: android.database.sqlite.SQLiteConstraintException: NOT NULL constraint failed: task.name (code 1299 SQLITE_CONSTRAINT_NOTNULL)
Whereas if it used this query:
INSERT INTO Task DEFAULT VALUES;
or
INSERT INTO Task (id) VALUES (null);
SQLite creates a row with all default values, without any errors.
But how do we omit the null fields from the insert query so that the default values are used?

Try this:
@Entity
public class Task {
    public String mName = "Unknown Title";
    public String mDescription = "Does not have any description!";
    public boolean mIsDone = false;

    @PrimaryKey(autoGenerate = true) // must be placed at the end!!
    public long id = 0;
}
And create object like this:
Task task = new Task(); // all fields use default values
Task task = new Task(name); // default values are used for the other fields
Task task = new Task(name, description); // the order matters; otherwise you need to explicitly name the field, as in the following (pseudo-code)
Task task = new Task(mIsDone = true, mName = "Sam")
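The usage above assumes matching constructors on Task; a minimal sketch of what they might look like (they are not part of the original answer):
// Hypothetical constructors implied by the usage above.
public Task() { }

public Task(String name) {
    this.mName = name;
}

public Task(String name, String description) {
    this.mName = name;
    this.mDescription = description;
}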
Room "deaultValue" is not for entity use:
https://developer.android.com/reference/androidx/room/ColumnInfo#defaultValue()

Based on what is written in the Android documentation:
The default value you specify here will NOT be used if you simply insert the Entity with @Insert.
That means there is no simple way, but you can do it in a few ways:
Assign Variable Values
Assign the values you defined as defaults in @ColumnInfo to those variables as well, because if you don't update those values, the default assignment will be used, as the documentation says:
If you simply insert the Entity with @Insert ... any value assigned in Java/Kotlin will be used.
@Entity
public class Task {
    @PrimaryKey(autoGenerate = true)
    public long id;

    @NonNull
    @ColumnInfo(defaultValue = "Unknown Title")
    public String name = "Unknown Title";

    @NonNull
    @ColumnInfo(defaultValue = "Does not have any description!")
    public String description = "Does not have any description!";

    @ColumnInfo(defaultValue = "false")
    public boolean isDone; // it doesn't need an assignment:
                           // boolean fields are false by default
}
Special Query
As the documentation recommends:
Use #Query with an INSERT statement and skip this column there in order to use this default value.
#Query("INSERT INTO task DEFAULT VALUES")
long insert();
or
#Query("INSERT INTO task (id) VALUES (null)")
long insert();
As you noted in your question.
Simplified Class (ref)
@Insert(entity = Task.class)
long insert(NewTask task);

@Entity
class NewTask {
    int id = 0;
}
It runs:
INSERT INTO task (id) VALUES (null);
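For context, one possible way to use that partial-entity insert; getById here is a hypothetical DAO query method, not something from the original answer:
// Hypothetical usage: insert a row of defaults, then load it back as a full Task.
long newId = taskDao.insert(new NewTask());   // columns fall back to their declared defaults
Task created = taskDao.getById(newId);        // assumed to be @Query("SELECT * FROM task WHERE id = :id")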

Related

QueryDSL Predicate for use with JPARepository where field is a JSON String converted using an AttributeConverter to a List<Object>

I have a JPA Entity (Terminal) which uses an AttributeConverter to convert a database String into a list of objects (ProgramRegistration). The converter just uses a JSON ObjectMapper to turn the JSON String into POJO objects.
Entity Object
@Entity
@Data
public class Terminal {
    @Id
    private String terminalId;

    @NotEmpty
    @Convert(converter = ProgramRegistrationConverter.class)
    private List<ProgramRegistration> programRegistrations;

    @Data
    public static class ProgramRegistration {
        private String program;
        private boolean online;
    }
}
The Terminal uses the following JPA AttributeConverter to serialize the Objects from and to JSON
JPA AttributeConverter
public class ProgramRegistrationConverter implements AttributeConverter<List<Terminal.ProgramRegistration>, String> {

    // assumed SLF4J logger; the original snippet uses LOG below without declaring it
    private static final Logger LOG = LoggerFactory.getLogger(ProgramRegistrationConverter.class);

    private final ObjectMapper objectMapper;
    private final CollectionType programRegistrationCollectionType;

    public ProgramRegistrationConverter() {
        this.objectMapper = new ObjectMapper().setSerializationInclusion(JsonInclude.Include.NON_EMPTY);
        this.programRegistrationCollectionType =
                objectMapper.getTypeFactory().constructCollectionType(List.class, Terminal.ProgramRegistration.class);
    }

    @Override
    public String convertToDatabaseColumn(List<Terminal.ProgramRegistration> attribute) {
        if (attribute == null) {
            return null;
        }
        String json = null;
        try {
            json = objectMapper.writeValueAsString(attribute);
        } catch (final JsonProcessingException e) {
            LOG.error("JSON writing error", e);
        }
        return json;
    }

    @Override
    public List<Terminal.ProgramRegistration> convertToEntityAttribute(String dbData) {
        if (dbData == null) {
            return Collections.emptyList();
        }
        List<Terminal.ProgramRegistration> list = null;
        try {
            list = objectMapper.readValue(dbData, programRegistrationCollectionType);
        } catch (final IOException e) {
            LOG.error("JSON reading error", e);
        }
        return list;
    }
}
I am using Spring Boot and a JPARepository to fetch a Page of Terminal results from the Database.
To filter the results I am using a BooleanExpression as the Predicate. For all the filter values on the Entity it works well, but the List of objects converted from the JSON string does not allow me to easily write an Expression that will filter the Objects in the list.
REST API that is trying to filter the Entity Objects using QueryDSL
@GetMapping(path = "/filtered/page", produces = MediaType.APPLICATION_JSON_VALUE)
public Page<Terminal> findFilteredWithPage(
        @RequestParam(required = false) String terminalId,
        @RequestParam(required = false) String programName,
        @PageableDefault(size = 20) @SortDefault.SortDefaults({ @SortDefault(sort = "terminalId") }) Pageable p) {

    BooleanBuilder builder = new BooleanBuilder();
    if (StringUtils.isNotBlank(terminalId))
        builder.and(QTerminal.terminal.terminalId.upper()
                .contains(StringUtils.upperCase(terminalId)));

    // TODO: Figure out how to use QueryDsl to get the converted List as a predicate
    // The code below to find the programRegistrations does not allow a call to any(),
    // expects a CollectionExpression or a SubqueryExpression for calls to eqAny() or in()
    if (StringUtils.isNotBlank(programName))
        builder.and(QTerminal.terminal.programRegistrations.any().name()
                .contains(StringUtils.upperCase(programName)));

    return terminalRepository.findAll(builder.getValue(), p);
}
I am wanting to get any Terminals that have a ProgramRegistration object with the program name equal to the parameter passed into the REST service.
I have been trying to get CollectionExpression or SubQueryExpression working without success since they all seem to be wanting to perform a join between two Entity objects. I do not know how to create the path and query so that it can iterate over the programRegistrations checking the "program" field for a match. I do not have a QProgamRegistration object to join with, since it is just a list of POJOs.
How can I get the predicate to match only the Terminals that have programs with the name I am searching for?
This is the line that is not working:
builder.and(QTerminal.terminal.programRegistrations.any().name()
.contains(StringUtils.upperCase(programName)));
AttributeConverters have issues in Querydsl because they have issues in JPQL - the query language of JPA - itself. It is unclear what the underlying query type of the attribute actually is, and whether the parameter should be a basic type of that query type, or should be converted using the conversion. Such a conversion, whilst it appears logical, is not defined in the JPA specification. Thus a basic type of the query type needs to be used instead, which leads to new difficulties, because Querydsl can't know what type that needs to be; it only knows the Java type of the attribute.
A workaround can be to force the field to resolve to a StringPath by annotating it with @QueryType(PropertyType.STRING). Whilst this fixes the issue for some queries, you will run into different issues in other scenarios. For more information, see this thread.
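A minimal sketch of that annotation-based workaround applied to the field from the question (it assumes the Querydsl annotation processor regenerates QTerminal afterwards):
// Sketch only: force Querydsl to generate a StringPath for the converted attribute.
@NotEmpty
@Convert(converter = ProgramRegistrationConverter.class)
@QueryType(PropertyType.STRING)
private List<ProgramRegistration> programRegistrations;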
Although the following desired QueryDsl looks like it should work
QTerminal.terminal.programRegistrations.any().name().contains(programName);
In reality JPA would never be able to convert it into something that would make sense in terms of SQL. The only SQL that JPA could convert it into could be as follows:
SELECT t.terminal_id FROM terminal t where t.terminal_id LIKE '%00%' and t.program_registrations like '%"program":"MY_PROGRAM_NAME"%';
This would work in this use case, but it would be semantically wrong, and therefore it is correct that it does not work: trying to select unstructured data with a structured query language makes no sense.
The only solution is to treat the data as characters for the DB search criteria, and to treat it as a list of objects after the query completes, then filter the rows in Java. Although this makes the paging feature rather useless.
One possible solution is to have a secondary read only String version of the column that is used for the DB search criteria, that is not converted to JSON by the AttributeConverter.
@JsonIgnore
@Column(name = "programRegistrations", insertable = false, updatable = false)
private String programRegistrationsStr;
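With such a shadow field in place, the predicate could match on the raw column text; both the generated path name and the JSON fragment matched below are assumptions, so treat this as a sketch:
// Sketch only: filter on the unconverted JSON string column via the generated StringPath.
if (StringUtils.isNotBlank(programName))
    builder.and(QTerminal.terminal.programRegistrationsStr
            .contains("\"program\":\"" + programName + "\""));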
The real solution is: do not use unstructured data when you want structured queries on that data. Either move the data to a database that supports querying JSON natively, or model the data properly in DDL.
To give a short answer: the parameter used in the predicate on the attribute annotated with @QueryType must also be used in another predicate on an attribute of type String.
It's a well-known issue, described in this thread: https://github.com/querydsl/querydsl/issues/2652
I simply want to share my experience about this bug.
Model
I have an entity like
@Entity
public class JobLog {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private String id;

    @QueryType(PropertyType.STRING)
    private LocalizedString message;
}
Issue
I want to perform some predicate about message. Unfortunately, with this configuration, I can't do this:
predicates.and(jobLog.message.likeIgnoreCase(escapedTextFilter));
because I run into the same issue as everyone else!
Solution
But I found a way to work around it :)
predicates.and(
(jobLog.id.likeIgnoreCase(escapedTextFilter).and(jobLog.id.isNull()))
.or(jobLog.message.likeIgnoreCase(escapedTextFilter)));
Why does this work around the bug?
It's important that escapedTextFilter is the same in both predicates!
Indeed, in this case the constant is converted to SQL in the first predicate (which is of String type), and the second predicate then reuses the converted value.
Downside?
It adds a performance overhead because of the OR in the predicate.
Hope this can help someone :)
I've found one way to solve this problem; my main idea is to use the MySQL function cast(xx as char) to trick Hibernate. Below is my base info. My real code is from work, so I've made up an example.
// StudentRepo.java
public interface StudentRepo extends JpaRepository<Student, Long>, QuerydslPredicateExecutor<Student>, JpaSpecificationExecutor<Student> {
}
// Student.java
@Data
@AllArgsConstructor
@NoArgsConstructor
@EqualsAndHashCode(of = "id")
@Entity
@Builder
@Table(name = "student")
public class Student {
    @Convert(converter = ClassIdsConvert.class)
    private List<String> classIds;

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;
}
// ClassIdsConvert.java
public class ClassIdsConvert implements AttributeConverter<List<String>, String> {
    @Override
    public String convertToDatabaseColumn(List<String> ips) {
        // classid23,classid24,classid25
        return String.join(",", ips);
    }

    @Override
    public List<String> convertToEntityAttribute(String dbData) {
        if (StringUtils.isEmpty(dbData)) {
            return null;
        } else {
            return Stream.of(dbData.split(",")).collect(Collectors.toList());
        }
    }
}
My db is below:
id | classIds    | name | address
---|-------------|------|--------
1  | 2,3,4,11    | join | 北京市
2  | 2,31,14,11  | hell | 福建省
3  | 2,12,22,33  | work | 福建省
4  | 1,4,5,6     | ouy  | 广东省
5  | 11,31,34,22 | yup  | 上海市
-- ----------------------------
-- Table structure for student
-- ----------------------------
DROP TABLE IF EXISTS `student`;
CREATE TABLE `student` (
`id` int(11) NOT NULL,
`classIds` varchar(255) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL DEFAULT NULL,
`name` varchar(255) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL DEFAULT NULL,
`address` varchar(255) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL DEFAULT NULL,
PRIMARY KEY (`id`) USING BTREE
) ENGINE = InnoDB CHARACTER SET = utf8mb4 COLLATE = utf8mb4_general_ci ROW_FORMAT = Dynamic;
SET FOREIGN_KEY_CHECKS = 1;
Using JpaSpecificationExecutor solves the problem:
Specification<Student> specification = (root, query, criteriaBuilder) -> {
    String classId = "classid24";
    String classIdStr = StringUtils.wrap(classId, "%");
    var predicate = criteriaBuilder.like(root.get("classIds").as(String.class), classIdStr);
    return criteriaBuilder.or(predicate);
};
var students = studentRepo.findAll(specification);
log.info(new Gson().toJson(students));
Note the code root.get("classIds").as(String.class).
In my opinion, if I don't add .as(String.class), Hibernate thinks the type of student.classIds is a List and throws the exception below:
org.springframework.dao.InvalidDataAccessApiUsageException: Parameter value [%classid24%] did not match expected type [java.util.List (n/a)]; nested exception is java.lang.IllegalArgumentException: Parameter value [%classid24%] did not match expected type [java.util.List (n/a)]
The SQL below would run correctly in MySQL, but Hibernate can't produce it.
SELECT
student0_.id AS id1_0_,
student0_.class_ids AS class_ids2_0_
FROM
student student0_
WHERE
student0_.class_ids LIKE '%classid24%' ESCAPE '!'
If you add .as(String.class), Hibernate treats the type of student.classIds as a String and doesn't check it at all.
The SQL will be like below, which runs correctly in MySQL and also through JPA.
SELECT
student0_.id AS id1_0_,
student0_.class_ids AS class_ids2_0_
FROM
student student0_
WHERE
cast( student0_.class_ids AS CHAR ) LIKE '%classid24%' ESCAPE '!'
Since the problem can be solved with JpaSpecificationExecutor, I figured it could also be solved with Querydsl. In the end I found the template idea in Querydsl.
String classId = "classid24";
StringTemplate st = Expressions.stringTemplate("cast({0} as string)", qStudent.classIds);
var students = Lists.newArrayList(studentRepo.findAll(st.like(StringUtils.wrap(classId, "%"))));
log.info(new Gson().toJson(students));
Its SQL is like below:
SELECT
student0_.id AS id1_0_,
student0_.class_ids AS class_ids2_0_
FROM
student student0_
WHERE
cast( student0_.class_ids AS CHAR ) LIKE '%classid24%' ESCAPE '!'

JPA CriteriaQuery: Unable to locate appropriate constructor for sum of date difference

I know this question looks similar, but I think my case is different.
This is the exception message:
Exception while fetching data (/periodicReport/userReport) :
org.hibernate.hql.internal.ast.QuerySyntaxException: Unable to locate
appropriate constructor on class
[com.analytics.entity.projections.UserParticipantCountAndDurationImpl].
Expected arguments are: java.util.UUID, long, java.util.Date [select
new
com.analytics.entity.projections.UserParticipantCountAndDurationImpl(generatedAlias0.userID,
count(distinct generatedAlias0.userID), sum(function('TIMEDIFF',
generatedAlias0.endTime, generatedAlias0.startTime))) from
com.analytics.entity.ConferenceParticipant as
generatedAlias0 group by generatedAlias0.userID order by :param0 desc]
I'm expecting the sum to be a Long, and I have specified that in the criteria query. I have no idea where Date comes from.
Here is the snippet for the builder:
Expression<Long> sum = criteriaBuilder.sum(
criteriaBuilder.function(
"TIMEDIFF",
Long.class,
participantRoot.<Date>get("endTime"),
participantRoot.<Date>get("startTime")
)
);
Expression<Long> count = criteriaBuilder.countDistinct(participantRoot.get("userID"));
reportQuery.select(criteriaBuilder.construct(
UserParticipantCountAndDurationImpl.class,
participantRoot.get("userID").as(UUID.class).alias(USER_ID),
count.as(Long.class).alias(PARTICIPANTS),
sum.as(Long.class).alias(PARTICIPANT_DURATION)
));
The class in question:
@Data
@NoArgsConstructor
public class UserParticipantCountAndDurationImpl implements UserParticipantCountAndDuration, Serializable {
    private UUID userID;
    private long participants;
    public long participantDuration;

    public UserParticipantCountAndDurationImpl(
            UUID userID,
            long participants,
            long participantDuration
    ) {
        this.userID = userID;
        this.participants = participants;
        this.participantDuration = participantDuration;
    }
}
And yes, I have tried to change the signature of the constructor. In that case the query runs, but it then throws a ClassCastException since the value is a java.util.Date.
You need an Expression with the name of the unit you want to extract for the TIMEDIFF function:
public static class TimeUnitExpression extends BasicFunctionExpression<String> implements Serializable {

    public TimeUnitExpression(CriteriaBuilderImpl criteriaBuilder, Class<String> javaType,
                              String functionName) {
        super(criteriaBuilder, javaType, functionName);
    }

    @Override
    public String render(RenderingContext renderingContext) {
        return getFunctionName();
    }
}
so your sum expression becomes
Expression<Long> sum = criteriaBuilder.sum(
criteriaBuilder.function(
"TIMEDIFF",
Long.class,
new TimeUnitExpression(null, String.class, "MILLISECOND"),
participantRoot.<Date>get("endTime"),
participantRoot.<Date>get("startTime")
)
);
TIMEDIFF returns a date-time as its result, so it can't be cast to Long or double; the SQL object in the raw result is a DateTime (refer to this).
I could have solved this with a function that extracts minutes from the date-time, but instead I solved it by summing the difference of UNIX_TIMESTAMP of both date-times. Since UNIX_TIMESTAMP is always a long, I can cast it to Long or BigInteger.
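For illustration, a minimal sketch of that UNIX_TIMESTAMP approach with the same criteria builder; the attribute names come from the question, everything else is an assumption (MySQL in particular):
// Sketch only: sum of UNIX_TIMESTAMP(endTime) - UNIX_TIMESTAMP(startTime).
Expression<Long> endSeconds = criteriaBuilder.function(
        "UNIX_TIMESTAMP", Long.class, participantRoot.<Date>get("endTime"));
Expression<Long> startSeconds = criteriaBuilder.function(
        "UNIX_TIMESTAMP", Long.class, participantRoot.<Date>get("startTime"));
Expression<Long> sum = criteriaBuilder.sum(criteriaBuilder.diff(endSeconds, startSeconds));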

How do I check if a record already exists in a table in Spring Boot JPA?

I have a table with 4 fields. If I insert a record that already exists, i.e. all field values match a previous record in the table, how do I return that record only, without inserting it into the database?
My model looks like this:
@Entity
public class QuestionDetails {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private int id;
    private String department;
    private String year;
    private String academic_year;
    private String semester;
    private String type;
    private String subject;
    private int points;
    private int unit;
    // getter, setter
}
And the controller looks like this:
@Autowired
public QuestionDetailsRepository qdRepository;

@PostMapping("/api/questionDetails")
public QuestionDetails addQuestion(@Valid @RequestBody QuestionDetails qDetails) {
    // Here I want to first check whether qDetails is already present in the table.
    // If it is, I want to return that existing record instead of inserting it.
    QuestionDetails qd = qdRepository.save(qDetails); // save only new records
    return qd;
}
Using Postman I send data like this:
{
"department" : "IT",
"year" : "2020",
"academic_year" : "1st year",
"semester" : "first semester",
"type" : "objective",
"subject" : "JAVA",
"points" : 10,
"unit" : 5
}
Here, I am sending data that is already present in the table. So I want to check whether this record already exists: if it doesn't, insert it into the table; otherwise return the existing record.
How do I achieve that using Spring Boot JPA/Hibernate?
Implement a select method in QuestionDetailsRepository as below. Add all the criteria which make a record unique. I am using department and year but you can use all the parameters of the QuestionDetails entity.
#Query("select qd from QuestionDetails qd where qd.department = :#{#req. department} and qd.year = :#{#req.year}")
Optional<QuestionDetails> findQuestionDetails(#Param("req") QuestionDetails req);
Make sure to implement equals() and hashCode() in the QuestionDetails class according to the same unique criteria.
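For illustration, a sketch of what those methods might look like over the same department + year criteria, using java.util.Objects; extend them to all fields if the whole record must match:
@Override
public boolean equals(Object o) {
    if (this == o) return true;
    if (!(o instanceof QuestionDetails)) return false;
    QuestionDetails other = (QuestionDetails) o;
    // compare only the fields that define uniqueness
    return Objects.equals(department, other.department)
            && Objects.equals(year, other.year);
}

@Override
public int hashCode() {
    return Objects.hash(department, year);
}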
Your pseudo-code would look like this:
Optional<QuestionDetails> optRecord = qdRepository.findQuestionDetails(qDetails);
if (optRecord.isPresent()) {
    return optRecord.get();
} else {
    return qdRepository.save(qDetails);
}

Spring Data MongoDB: Accessing and updating sub documents

First experiments with Spring Data and MongoDB were great. Now I've got the following structure (simplified):
public class Letter {
#Id
private String id;
private List<Section> sections;
}
public class Section {
private String id;
private String content;
}
Loading and saving entire Letter objects/documents works like a charm. (I use ObjectId to generate unique IDs for the Section.id field.)
Letter letter1 = mongoTemplate.findById(id, Letter.class);
mongoTemplate.insert(letter2);
mongoTemplate.save(letter3);
As documents are big (200K) and sometimes only sub-parts are needed by the application: Is there a possibility to query for a sub-document (section), modify and save it?
I'd like to implement a method like
Section s = findLetterSection(letterId, sectionId);
s.setText("blubb");
replaceLetterSection(letterId, sectionId, s);
And of course methods like:
addLetterSection(letterId, s); // add after last section
insertLetterSection(letterId, sectionId, s); // insert before given section
deleteLetterSection(letterId, sectionId); // delete given section
I see that the last three methods are somewhat "strange", i.e. loading the entire document, modifying the collection and saving it again may be the better approach from an object-oriented point of view; but the first use case ("navigating" to a sub-document/sub-object and working in the scope of this object) seems natural.
I think MongoDB can update sub-documents, but can SpringData be used for object mapping? Thanks for any pointers.
I figured out the following approach for slicing and loading only one subobject. Does it seem ok? I am aware of problems with concurrent modifications.
Query query1 = Query.query(Criteria.where("_id").is(instance));
query1.fields().include("sections._id");
LetterInstance letter1 = mongoTemplate.findOne(query1, LetterInstance.class);
LetterSection emptySection = letter1.findSectionById(sectionId);
int index = letter1.getSections().indexOf(emptySection);
Query query2 = Query.query(Criteria.where("_id").is(instance));
query2.fields().include("sections").slice("sections", index, 1);
LetterInstance letter2 = mongoTemplate.findOne(query2, LetterInstance.class);
LetterSection section = letter2.getSections().get(0);
This is an alternative solution loading all sections, but omitting the other (large) fields.
Query query = Query.query(Criteria.where("_id").is(instance));
query.fields().include("sections");
LetterInstance letter = mongoTemplate.findOne(query, LetterInstance.class);
LetterSection section = letter.findSectionById(sectionId);
This is the code I use for storing only a single collection element:
MongoConverter converter = mongoTemplate.getConverter();
DBObject newSectionRec = (DBObject)converter.convertToMongoType(newSection);
Query query = Query.query(Criteria.where("_id").is(instance).and("sections._id").is(new ObjectId(newSection.getSectionId())));
Update update = new Update().set("sections.$", newSectionRec);
mongoTemplate.updateFirst(query, update, LetterInstance.class);
It is nice to see how Spring Data can be used with "partial results" from MongoDB.
Any comments highly appreciated!
I think Matthias Wuttke's answer is great; for anyone looking for a generic version of it, see the code below:
@Service
public class MongoUtils {

    @Autowired
    private MongoTemplate mongo;

    public <D, N extends Domain> N findNestedDocument(Class<D> docClass, String collectionName, UUID outerId, UUID innerId,
            Function<D, List<N>> collectionGetter) {
        // get index of subdocument in array
        Query query = new Query(Criteria.where("_id").is(outerId).and(collectionName + "._id").is(innerId));
        query.fields().include(collectionName + "._id");
        D obj = mongo.findOne(query, docClass);
        if (obj == null) {
            return null;
        }
        List<UUID> itemIds = collectionGetter.apply(obj).stream().map(N::getId).collect(Collectors.toList());
        int index = itemIds.indexOf(innerId);
        if (index == -1) {
            return null;
        }

        // retrieve subdocument at index using slice operator
        Query query2 = new Query(Criteria.where("_id").is(outerId).and(collectionName + "._id").is(innerId));
        query2.fields().include(collectionName).slice(collectionName, index, 1);
        D obj2 = mongo.findOne(query2, docClass);
        if (obj2 == null) {
            return null;
        }
        return collectionGetter.apply(obj2).get(0);
    }

    public void removeNestedDocument(UUID outerId, UUID innerId, String collectionName, Class<?> outerClass) {
        Update update = new Update();
        update.pull(collectionName, new Query(Criteria.where("_id").is(innerId)));
        mongo.updateFirst(new Query(Criteria.where("_id").is(outerId)), update, outerClass);
    }
}
This could for example be called using
mongoUtils.findNestedDocument(Shop.class, "items", shopId, itemId, Shop::getItems);
mongoUtils.removeNestedDocument(shopId, itemId, "items", Shop.class);
The Domain interface looks like this:
public interface Domain {
UUID getId();
}
Notice: If the nested document's constructor contains elements with primitive datatype, it is important for the nested document to have a default (empty) constructor, which may be protected, in order for the class to be instantiatable with null arguments.
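To illustrate that note, a hypothetical nested document with a primitive field and a protected no-arg constructor (the names are made up, not from the original answer):
// Illustrative only: a nested document whose constructor takes a primitive,
// plus the protected default constructor the note above calls for.
public class Item implements Domain {
    private UUID id;
    private int quantity;   // primitive field

    protected Item() { }    // needed so the mapping layer can instantiate the class

    public Item(UUID id, int quantity) {
        this.id = id;
        this.quantity = quantity;
    }

    @Override
    public UUID getId() {
        return id;
    }
}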
Solution
That's my solution to this problem.
The object to be updated:
@Getter
@Setter
@Document(collection = "projectchild")
public class ProjectChild {
    @Id
    private String _id;

    private String name;
    private String code;

    @Field("desc")
    private String description;

    private String startDate;
    private String endDate;

    @Field("cost")
    private long estimatedCost;

    private List<String> countryList;
    private List<Task> tasks;

    @Version
    private Long version;
}
Coding the Solution
public Mono<ProjectChild> UpdateCritTemplChild(
String id, String idch, String ownername) {
Query query = new Query();
query.addCriteria(Criteria.where("_id")
.is(id)); // find the parent
query.addCriteria(Criteria.where("tasks._id")
.is(idch)); // find the child which will be changed
Update update = new Update();
update.set("tasks.$.ownername", ownername); // change the field inside the child that must be updated
return template
// findAndModify:
// Find/modify/get the "new object" from a single operation.
.findAndModify(
query, update,
new FindAndModifyOptions().returnNew(true), ProjectChild.class
)
;
}

Hibernate Criteria API Equivalent to Oracle's Decode

What would the equivalent of Oracle's DECODE() function be in the Hibernate Criteria API?
An SQL example of what I need to do:
SELECT DECODE(FIRST_NAME, NULL, LAST_NAME, FIRST_NAME) as NAME ORDER BY NAME;
Which returns LAST_NAME as NAME when FIRST_NAME is NULL.
I would prefer to use the Criteria API but could use HQL if there's no other way.
Check out org.hibernate.criterion.Projections.sqlProjection(...).
Similar to this answer.
For the example you give, you could use COALESCE().
How to simulate NVL in HQL
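For illustration, a hedged sketch of the Projections.sqlProjection(...) + COALESCE suggestion; the Person entity and column names are assumptions, not from the question:
// Sketch only: project coalesce(first_name, last_name) as NAME via a SQL projection.
Criteria criteria = session.createCriteria(Person.class);
criteria.setProjection(Projections.sqlProjection(
        "coalesce({alias}.FIRST_NAME, {alias}.LAST_NAME) as NAME",
        new String[] { "NAME" },
        new Type[] { StandardBasicTypes.STRING }));
List<?> names = criteria.list();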
You can use sqlRestriction to call the native decode function.
session.createCriteria(Table.class)
    .add(Restrictions.sqlRestriction(
        "decode({alias}.firstName, null, {alias}.lastName, {alias}.firstName)"));
With HQL, the Oracle dialect already has coalesce and nvl functions, or if you really need decode, you could subclass the dialect and add it as a custom function. I don't know if Hibernate supports a variable length number of arguments like decode does, but worst-case, you could create decode1, decode2, etc to support different numbers of arguments.
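As a hedged sketch of that dialect-subclass idea (the class name and the exact dialect version are assumptions, and variable-arity decode is untested):
// Sketch only: register decode as a custom SQL function on a dialect subclass.
public class CustomOracleDialect extends Oracle10gDialect {
    public CustomOracleDialect() {
        super();
        // StandardSQLFunction renders its arguments unchanged, so decode(a, b, c, d) passes through as-is
        registerFunction("decode", new StandardSQLFunction("decode"));
    }
}
The subclass would then be set as hibernate.dialect so the function becomes callable from HQL.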
Or, if you aren't using the column in a where or group by, you could just bring both attributes back and do the check in Java.
Ended up adding a formula for it:
<property name="name" formula="coalesce(first_name, last_name)"/>
I'm concerned about cross-database problems and possibly efficiency problems with this approach so I'm willing to change the accepted answer.
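For anyone on annotation-based mappings, the equivalent of that XML property is presumably Hibernate's @Formula (shown here as a sketch):
// Annotation counterpart of the XML formula mapping above.
@Formula("coalesce(first_name, last_name)")
private String name;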
You can use Hibernate's @Type attribute. Based on your requirement you can customize the annotation and apply it on top of the field, like this:
public class PhoneNumberType implements UserType {
    @Override
    public int[] sqlTypes() {
        return new int[]{Types.INTEGER, Types.INTEGER, Types.INTEGER};
    }

    @Override
    public Class returnedClass() {
        return PhoneNumber.class;
    }

    // other methods
}
First, the nullSafeGet method:
@Override
public Object nullSafeGet(ResultSet rs, String[] names,
        SharedSessionContractImplementor session, Object owner)
        throws HibernateException, SQLException {
    int countryCode = rs.getInt(names[0]);
    if (rs.wasNull())
        return null;
    int cityCode = rs.getInt(names[1]);
    int number = rs.getInt(names[2]);
    PhoneNumber employeeNumber = new PhoneNumber(countryCode, cityCode, number);
    return employeeNumber;
}
Next, the nullSafeSet method:
@Override
public void nullSafeSet(PreparedStatement st, Object value,
        int index, SharedSessionContractImplementor session)
        throws HibernateException, SQLException {
    if (Objects.isNull(value)) {
        st.setNull(index, Types.INTEGER);
    } else {
        PhoneNumber employeeNumber = (PhoneNumber) value;
        st.setInt(index, employeeNumber.getCountryCode());
        st.setInt(index + 1, employeeNumber.getCityCode());
        st.setInt(index + 2, employeeNumber.getNumber());
    }
}
Finally, we can declare our custom PhoneNumberType in our OfficeEmployee entity class:
@Entity
@Table(name = "OfficeEmployee")
public class OfficeEmployee {
    @Columns(columns = { @Column(name = "country_code"),
            @Column(name = "city_code"), @Column(name = "number") })
    @Type(type = "com.baeldung.hibernate.customtypes.PhoneNumberType")
    private PhoneNumber employeeNumber;

    // other fields and methods
}
This might solve your problem, and it will work for any database. If you want more info, see https://www.baeldung.com/hibernate-custom-types
If you can use HQL, then you can replace DECODE with CASE.
You can update your query from,
SELECT DECODE(FIRST_NAME, NULL, LAST_NAME, FIRST_NAME) as NAME ORDER BY NAME;
to,
SELECT CASE WHEN FIRST_NAME IS NULL THEN LAST_NAME ELSE FIRST_NAME END as NAME ORDER BY NAME;
