How to use arrays with Spring Data JDBC

I am trying to use Spring Data JDBC for my PostgreSQL database. I defined the following beans:

@Data
class Report {
    @Id
    private Long id;
    private String name;
    private Set<Dimension> dimensions;
}

@Data
class Dimension {
    private String name;
    private Long[] filterIds;
}
and the corresponding DDL:

CREATE TABLE report (
    id   bigserial PRIMARY KEY,
    name text NOT NULL
);

CREATE TABLE dimension (
    id         bigserial PRIMARY KEY,
    report     bigint,
    name       text,
    filter_ids bigint[],
    FOREIGN KEY (report) REFERENCES report(id) ON DELETE CASCADE ON UPDATE CASCADE
);
Then I tried to insert a report:

final Dimension dimension = new Dimension();
dimension.setName("xyz");
dimension.setFilterIds(new Long[]{ 1L, 2L, 3L });

final Report report = new Report();
report.setName("xyz");
report.setDimensions(Collections.singleton(dimension));

repository.save(report);

where repository is simply a CrudRepository<Report, Long>.
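For illustration, such a repository declaration could look like this (the interface name is an assumption; it is not shown above):

interface ReportRepository extends CrudRepository<Report, Long> {
    // no custom methods needed; save() is inherited from CrudRepository
}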
This gave me the following error:

org.postgresql.util.PSQLException: ERROR: column "filter_ids" is of type bigint[] but expression is of type bigint
  Hint: You will need to rewrite or cast the expression.
  Position: 116
Can I somehow tell Spring Data JDBC how to map the array types?

With the release of Spring Data JDBC 1.1.0, this became possible. The documentation states:
The properties of the following types are currently supported:
All primitive types and their boxed types (int, float, Integer, Float, and so on)
Enums get mapped to their name.
String
java.util.Date, java.time.LocalDate, java.time.LocalDateTime, and java.time.LocalTime
Arrays and Collections of the types mentioned above can be mapped to columns of array type if your database supports that.
...
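For a quick sanity check that the array really ends up in the bigint[] column, here is a minimal read-back sketch using plain Spring JdbcTemplate (assuming one is wired against the same datasource; the PostgreSQL driver returns a Long[] from getArray() on a bigint[] column):

// read the filter_ids column back as a Long[]; names match the DDL in the question
Long[] filterIds = jdbcTemplate.queryForObject(
        "SELECT filter_ids FROM dimension WHERE name = ?",
        (rs, rowNum) -> (Long[]) rs.getArray("filter_ids").getArray(),
        "xyz");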

As P44T answered, this should work from version 1.1 of Spring Data JDBC onwards, just as you used it.
Original answer
It is currently not possible. There are open issues tracking this; a starting point is https://jira.spring.io/browse/DATAJDBC-259

Related

Is it possible to use Snowflake with Spring Boot / JPA / Hibernate

I am creating a service which writes directly to a Snowflake database.
I am having a lot of trouble trying to get Spring Data JPA to work effectively with Snowflake. My main issue is that I am unable to save an entity to the Snowflake DB through the JpaRepository save method. Because this application is being used to dump data into Snowflake, being able to leverage JPA would make life a lot easier.
I would prefer not to have to roll my own native queries, so my question is whether it's possible to leverage Hibernate when working with Snowflake.
The main thing I want to be able to do is persist entities using the built-in save method of the JPA repositories.
Below is my current configuration. Any ideas on what could be improved in the configuration to get this working would be appreciated, as would any opinion on whether it is possible at all.
spring:
  profiles:
    active: local
  application:
    name: Service
  datasource:
    driverClassName: net.snowflake.client.jdbc.SnowflakeDriver
    url: ${SPRING_DATASOURCE_URL}
    username: ${SPRING_DATASOURCE_USERNAME}
    password: ${SPRING_DATASOURCE_PASSWORD}
  flyway:
    locations: classpath:db/migration/common,classpath:db/migration/snowflake
  jpa:
    properties:
      hibernate:
        dialect: org.hibernate.dialect.SQLServerDialect
        order_inserts: true
create sequence award_event_id_seq;

create table award_event
(
    id                  INT NOT NULL DEFAULT award_event_id_seq.nextval PRIMARY KEY,
    event_source_system VARCHAR NOT NULL,
    event_trigger       VARCHAR NOT NULL,
    event_triggered_by  VARCHAR NOT NULL,
    event_timestamp     TIMESTAMP NOT NULL
)
@Entity(name = "award_event")
@SequenceGenerator(name = "award_event_id_seq", sequenceName = "award_event_id_seq", allocationSize = 1)
data class AwardEvent(
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    val id: Int = -1,
    val eventTrigger: String,
    val eventTriggeredBy: String,
    val eventTimestamp: LocalDateTime,
    val eventSourceSystem: String
)
override fun receiveMessage(message: String) {
    logger.info("Receiving award event: $message")
    val awardEvent: AwardEventMessage = message.toObject()
    // This save method does not work and throws the error specified below
    awardEventRepository.save(awardEvent.toAwardEvent())
}
2021-01-08 10:49:28.163 ERROR 3239 --- [nio-9106-exec-1] o.hibernate.id.enhanced.TableStructure : could not read a hi value
net.snowflake.client.jdbc.SnowflakeSQLException: SQL compilation error:
syntax error line 1 at position 50 unexpected 'with'.
syntax error line 1 at position 72 unexpected ')'.
at net.snowflake.client.jdbc.SnowflakeUtil.checkErrorAndThrowExceptionSub(SnowflakeUtil.java:124)
at net.snowflake.client.jdbc.SnowflakeUtil.checkErrorAndThrowException(SnowflakeUtil.java:64)
at net.snowflake.client.core.StmtUtil.pollForOutput(StmtUtil.java:434)
at net.snowflake.client.core.StmtUtil.execute(StmtUtil.java:338)
at net.snowflake.client.core.SFStatement.executeHelper(SFStatement.java:506)
at net.snowflake.client.core.SFStatement.executeQueryInternal(SFStatement.java:233)
at net.snowflake.client.core.SFStatement.executeQuery(SFStatement.java:171)
at net.snowflake.client.core.SFStatement.execute(SFStatement.java:754)
at net.snowflake.client.jdbc.SnowflakeStatementV1.executeQueryInternal(SnowflakeStatementV1.java:245)
at net.snowflake.client.jdbc.SnowflakePreparedStatementV1.executeQuery(SnowflakePreparedStatementV1.java:117)
Just as a follow-up, I was unable to get the application up and running using the approach I outlined above. I am still unsure why, but I think it may have been due to a lack of support for Snowflake sequences as the generation type for the primary key in Spring. (The "could not read a hi value" and "unexpected 'with'" messages suggest that, with the SQL Server dialect configured, Hibernate fell back to table-based id generation and emitted T-SQL style WITH locking hints that Snowflake's parser rejects.)
I changed the generation type to UUID and the application then started to work as expected. There were no requirements on the type of primary key, so this approach was satisfactory.
create sequence award_event_id_seq;

create table award_event
(
    id                  varchar not null constraint award_event_pkey primary key,
    event_source_system varchar not null,
    event_trigger       varchar not null,
    event_triggered_by  varchar not null,
    event_timestamp     timestamp not null
)
@Entity(name = "award_event")
data class AwardEvent(
    @Id
    @GeneratedValue
    @Type(type = "uuid-char")
    val id: UUID = UUID.randomUUID(),
    val eventTrigger: String,
    val eventTriggeredBy: String,
    val eventTimestamp: LocalDateTime,
    val eventSourceSystem: String
)

Map jOOQ record data to multiple POJOs

We have multiple tables like:
School one-to-many Teacher
Teacher one-to-many Subject
Teacher one-to-many Classes
The entities are as follows:
public class School {
    private String name;
    private long id;
    private List<Teachers> teachers;

    public School() {
    }
}

public class Teachers {
    private String name;
    private Long id;
    private List<Subject> subjects;
    private List<Classes> classes;
}

public class Subject {
    private String name;
    private long id;

    public Subject() {
    }
}

public class Classes {
    private String name;
    private long id;

    public Classes() {
    }
}
We have written the jOOQ query for the required fields. For a single school we were getting multiple rows instead of the expected one, and we were unable to map the data.
We tried:
ModelMapper (unable to find a way to convert multiple, essentially horizontal, table records into a vertical/nested structure)
intoGroups(), which worked only up to a single join (between two tables)
simpleflatmapper, which had the same issue
Is there any way we can achieve this? Are we missing something?
PS: In the response, we don't require all the columns (variables) from all the tables.
That's a tricky question for a school assignment, given that this has been, historically, one of jOOQ's most missing features :)
A jOOQ 3.15+ solution using MULTISET
In addition to the below SQL/XML or SQL/JSON based solution, jOOQ 3.15 now supports the standard SQL MULTISET value constructor operator as well as a synthetic MULTISET_AGG aggregate function, which can be used like this:
List<School> schools =
ctx.select(
      SCHOOL.NAME,
      SCHOOL.ID,
      multisetAgg(
          TEACHER.NAME,
          TEACHER.ID,
          multiset(
              select(SUBJECT.NAME, SUBJECT.ID)
              .from(SUBJECT)
              .where(SUBJECT.TEACHER_ID.eq(TEACHER.ID))
          ).as("subjects").convertFrom(r -> r.map(Records.mapping(Subject::new))),
          multiset(
              select(CLASS.NAME, CLASS.ID)
              .from(CLASS)
              .where(CLASS.TEACHER_ID.eq(TEACHER.ID))
          ).as("classes").convertFrom(r -> r.map(Records.mapping(Classes::new)))
      ).as("teachers").convertFrom(r -> r.map(Records.mapping(Teachers::new)))
   )
   .from(SCHOOL)
   .join(TEACHER).on(TEACHER.SCHOOL_ID.eq(SCHOOL.ID))
   .groupBy(SCHOOL.NAME, SCHOOL.ID)
   .fetch(Records.mapping(School::new));
The above approach using the various Records.mapping() overloads along with ad-hoc data type conversion assumes the presence of an immutable constructor, such as you'd get if your classes were Java 16 records:
record Subject (String name, long id) {}
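For the query above to compile, the remaining classes would need matching canonical constructors as well, e.g. (a sketch; the constructor parameter order is assumed from the select list above, and java.util.List is imported):

record Classes (String name, long id) {}
record Teachers (String name, long id, List<Subject> subjects, List<Classes> classes) {}
record School (String name, long id, List<Teachers> teachers) {}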
A jOOQ 3.14+ solution using SQL/XML or SQL/JSON
Starting from jOOQ 3.14 and the new SQL/XML and SQL/JSON support, this will be possible relatively easily. In essence, you will be using your RDBMS's native XML or JSON support to nest collections directly in SQL. (All other approaches using joins and trying to deduplicate and shoe-horn flat result sets into nested data structures will not work well enough, as you've noticed)
You can write a query like this (assuming you use the code generator, and assuming you're interested in a tree structure with the School at the top):
List<School> schools =
ctx.select(jsonObject(
      jsonEntry("name", SCHOOL.NAME),
      jsonEntry("id", SCHOOL.ID),
      jsonEntry("teachers", jsonArrayAgg(jsonObject(
          jsonEntry("name", TEACHER.NAME),
          jsonEntry("id", TEACHER.ID),
          jsonEntry("subjects", field(
              select(jsonArrayAgg(jsonObject(SUBJECT.NAME, SUBJECT.ID)))
              .from(SUBJECT)
              .where(SUBJECT.TEACHER_ID.eq(TEACHER.ID))
          )),
          jsonEntry("classes", field(
              select(jsonArrayAgg(jsonObject(CLASS.NAME, CLASS.ID)))
              .from(CLASS)
              .where(CLASS.TEACHER_ID.eq(TEACHER.ID))
          ))
      )))
   ))
   .from(SCHOOL)
   .join(TEACHER).on(TEACHER.SCHOOL_ID.eq(SCHOOL.ID))
   .groupBy(SCHOOL.NAME, SCHOOL.ID)
   .fetchInto(School.class);
This solution is based on assumptions of your schema, namely that there is a to-one relationship between both SUBJECT -> TEACHER and CLASS -> TEACHER.
Also, you can see I've still used a join to group TEACHER per SCHOOL, aggregating the teachers using JSON_ARRAYAGG(). That's one option; another correlated subquery, like the ones for SUBJECT and CLASS, would have been possible as well.
A simpler solution might be possible using SQL Server's FOR JSON clause, which can be emulated in other dialects.
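For orientation, the query above produces one JSON document per school, roughly of this shape (illustrative values only):

{
  "name": "School 1",
  "id": 1,
  "teachers": [
    {
      "name": "Teacher 1",
      "id": 1,
      "subjects": [ { "name": "Subject 1", "id": 1 } ],
      "classes":  [ { "name": "Class 1", "id": 1 } ]
    }
  ]
}

fetchInto(School.class) then deserializes this into the POJOs; jOOQ delegates the JSON-to-POJO mapping to Jackson or Gson if one of them is on the classpath.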

Spring Boot 2 with Hibernate Search, indexes are not created on save

I have an entity defined like below. If I use save(), Hibernate does not create a new index entry for the newly created entity. Updating/modifying an existing entity works as expected.
I'm using Kotlin with Spring Boot 2.
@Entity(name = "shipment")
@Indexed
data class Shipment(
    @Id @GeneratedValue(strategy = GenerationType.IDENTITY) val id: Long = -1,
    @JoinColumn(name = "user") @ManyToOne() var user: User?,
    @IndexedEmbedded
    @JoinColumn(name = "sender") @ManyToOne(cascade = [CascadeType.ALL]) val sender: Contact,
    @IndexedEmbedded
    @JoinColumn(name = "sender_information") @ManyToOne(cascade = [CascadeType.ALL]) val senderInformation: ShipmentInformation,
) {}
The save function; I use this same function to update my entity, and the index is updated if an index entry already exists.
@Transactional
fun save(user: User, shipment: Shipment): Shipment {
    shipment.user = user
    return this.shipmentRepository.save(shipment)
}
application.properties
spring.jpa.properties.hibernate.search.default.directory_provider=filesystem
spring.jpa.properties.hibernate.search.default.indexBase=./lucene/
spring.jpa.open-in-view=false
If I restart the server, manual indexing works too.
@Transactional
override fun onApplicationEvent(event: ApplicationReadyEvent) {
    val fullTextEntityManager: FullTextEntityManager = Search.getFullTextEntityManager(entityManager)
    // configure and run a single MassIndexer; each createIndexer() call creates
    // a new instance, so the configuration has to be chained on one indexer
    fullTextEntityManager.createIndexer()
        .purgeAllOnStart(true)
        .optimizeAfterPurge(true)
        .batchSizeToLoadObjects(15)
        .cacheMode(CacheMode.IGNORE)
        .threadsToLoadObjects(2)
        .typesToIndexInParallel(2)
        .startAndWait()
    return
}
I tried to force the use of the JPA transaction manager, but it did not help.
@Bean(name = arrayOf("transactionManager"))
@Primary
fun transactionManager(@Autowired entityManagerFactory: EntityManagerFactory): org.springframework.orm.jpa.JpaTransactionManager {
    return JpaTransactionManager(entityManagerFactory)
}
Update
I think I found why I don't get the results of newly inserted entities.
My search query has a condition on the "pid" field, which is declared:

@Field(index = Index.YES, analyze = Analyze.NO, store = Store.NO)
@SortableField
@Column(name = "id", updatable = false, insertable = false)
@JsonIgnore
@NumericField val pid: Long,
and query:
query.must(queryBuilder.keyword().onField("customer.pid").matching(user.customer.id.toString()).createQuery())
pid is not stored and so newly inserted values are not visible. Can this be the cause?
BTW: How can I query/search by a nested indexed document id? In my case it is customer.id, which is the DocumentId. I've tried to change the query as below but don't get any result; should I create a new field to query?
query.must(queryBuilder.keyword().onField("customer.id").matching(user.customer.id.toString()).createQuery())
Update 2
I found a solution and am now getting the newly inserted data too. There was an error in the definition of the "pid" field; I've defined my fields as below and it works as expected.

@Fields(
    Field(name = "pid", index = Index.YES, analyze = Analyze.YES, store = Store.NO)
)
@SortableField(forField = "pid")
@Id @GeneratedValue(strategy = GenerationType.IDENTITY) val id: Long?,

Can we search and sort by id in an easy way, or is this the best practice? I know that we should use native JPA methods to get results by id, but in my case I need to search by an embedded id to restrict search results (depending on the role of the user), so that is not an option for me.
And I don't understand why manual indexing works...
BTW: How can I query/search by a nested indexed document id? In my case it is customer.id, which is the DocumentId. I've tried to change the query as below but don't get any result; should I create a new field to query?
Normally you don't need to create a separate field if all you want is to perform an exact match.
Can we search and sort by id in an easy way
Searching, yes, at least in Hibernate Search 5.
Sorting, no: you need a dedicated field.
or is it the best practice?
The best practice is to declare a field alongside your #DocumentId if you need anything more complex than an exact match on the ID.
I know that we should use native JPA functions to get results by id
I'm not sure I understand what you mean by "native JPA functions".
but in my case I need to search by an embedded id to restrict search results. (depends on role of user)
Yes, this should work. That is, it should work if the id is properly populated.
And I don't understand why manual indexing works...
Neither do I, but I suppose the explanation lies in the "error in the definition of "pid" field". Maybe the ID wasn't populated properly in some cases, leading to the entity being considered as deleted by Hibernate Search?
If you need me to give you a definitive answer, the best way to get it would be to create a reproducer. You can use this as a template: https://github.com/hibernate/hibernate-test-case-templates/tree/master/search
This looks odd:
@Id @GeneratedValue(strategy = GenerationType.IDENTITY) val id: Long = -1,
I'd expect a nullable Long, initialized to null (in Kotlin, val id: Long? = null).
I'm not sure this is the problem, but I imagine it could be, as a non-null ID is generally only expected from an already persisted entity.
Other than that, I think you're on the right track: if mass indexing works but not automatic indexing, it may have something to do with your changes not being executed in database transactions.
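To illustrate that last point, here is a minimal sketch (in Java, with hypothetical names) of the pattern Hibernate Search 5's automatic indexing relies on: the entity change happens inside a database transaction, and the index update is applied when that transaction commits.

@Service
public class ShipmentService {

    private final ShipmentRepository shipmentRepository; // hypothetical Spring Data repository

    public ShipmentService(ShipmentRepository shipmentRepository) {
        this.shipmentRepository = shipmentRepository;
    }

    @Transactional
    public Shipment save(Shipment shipment) {
        // Hibernate Search registers the change through its event listener
        // and writes it to the Lucene index when this transaction commits
        return shipmentRepository.save(shipment);
    }
}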

Spring Data Solr: how to force a numeric-looking string field to be the Solr string type

I'm trying to use spring-data-solr:3.0.6 to index data from different sources. There is one field, casenumber, which has different formats. When casenumber has ONLY digits, say 123, spring-data-solr will index the field as plong. That does not cause a problem until, later on, a record arrives with casenumber "CASE456". The Solr engine then throws an error, because of course casenumber must now be long.
Can I let Spring Data know that "123" is a string, rather than guessing it is a number, without touching the schema? I like the schemaless mode. I have tried the following code, but spring-data-solr still indexes "123" as 123. There is little documentation about @Indexed's type attribute. Thanks
@SolrDocument(collection =..)
public class CaseDocument
{
    @Indexed(type = "string")
    private String caseNumber;
    // OR
    @Indexed(type = "lowercase")
    private String caseNumber;
    ....

JPA/Hibernate generating wrong SQL in Spring Roo finder method

I'm developing a Spring web application whose persistence layer consists of Spring Roo-generated JPA entities, with Hibernate as the persistence provider and MySQL as the underlying DB.
Among my entities I have a class Detection with a tstamp java.util.Date field generated in Roo as follows:
entity jpa --class ~.data.Detection
...
field date --fieldName tstamp --type java.util.Date
...
finder add findDetectionsByTstampBetween
(the finder method was of course chosen after executing finder list)
In my controller code, at one point I invoke:
List<Detection> detections = Detection.findDetectionsByTstampBetween(from, to).getResultList();
where from and to are two valid java.util.Date(s). When testing with sample data, though (after ensuring that for a given choice of from and to the returned list shouldn't be empty), I got an empty list and investigated the reason.
I found in tomcat logs that Hibernate was generating the following SQL:
Hibernate: select detection0_.id as id1_3_, ...etc..., detection0_.tstamp as tstamp4_3_ from detection detection0_ where detection0_.tstamp>=?
I would expect the WHERE clause to contain a trailing "AND detection0_.tstamp<=?" checking the other end of the date range. I took a look at the generated Detection.findDetectionsByTstampBetween(Date minTstamp, Date maxTstamp) method in Detection_Roo_Finder.aj, and the "AND" is indeed present in the call to createQuery.
public static TypedQuery<Detection> Detection.findDetectionsByTstampBetween(Date minTstamp, Date maxTstamp) {
    if (minTstamp == null) throw new IllegalArgumentException("The minTstamp argument is required");
    if (maxTstamp == null) throw new IllegalArgumentException("The maxTstamp argument is required");
    EntityManager em = Detection.entityManager();
    TypedQuery<Detection> q = em.createQuery("SELECT o FROM Detection AS o WHERE o.tstamp BETWEEN :minTstamp AND :maxTstamp", Detection.class);
    q.setParameter("minTstamp", minTstamp);
    q.setParameter("maxTstamp", maxTstamp);
    return q;
}
Any idea what could cause the problem?
I've finally found the solution to the riddle and, as it turned out, the issue had nothing to do with JPA.
The problem was that the call to the persistence layer was made inside a REST controller with the following mapping:
@ResponseBody
@RequestMapping(value="/detections", method=RequestMethod.GET, params="from, to")
public Object getDetectionsInRange(
        @RequestParam(required=true) @DateTimeFormat(pattern="yyyy-MM-dd HH:mm") final Date from,
        @RequestParam(required=true) @DateTimeFormat(pattern="yyyy-MM-dd HH:mm") final Date to)
{
    ...
    List<Detection> detections = Detection.findDetectionsByTstampBetween(from, to).getResultList();
    ...
}
The error was in the definition of the params argument of @RequestMapping, the correct form being as follows:
@RequestMapping(value="/detections", method=RequestMethod.GET, params={"from", "to"})
Because of the malformed params value, requests were actually routed to another version of the controller method mapped to /detections. This second version called a different finder method, findDetectionsByTstampGreaterThanEquals, which is exactly what generated the observed SQL with only the >= condition.
@ResponseBody
@RequestMapping(value="/detections", method=RequestMethod.GET)
public Object getDetections(
        @RequestParam(required=false, defaultValue="0") int days,
        @RequestParam(required=false, defaultValue="0") int hours,
        @RequestParam(required=false, defaultValue="0") int minutes)
{
    ...
    List<Detection> detections = Detection.findDetectionsByTstampGreaterThanEquals( ... ).getResultList();
    ...
}
