I am creating a service which writes directly to a Snowflake database.
I am having a lot of trouble getting Spring Data JPA to work with Snowflake. My main issue is that I am unable to save an entity to the Snowflake DB through the JpaRepository interface's save method. Because this application is used to dump data into Snowflake, being able to leverage JPA would make life a lot easier.
I would prefer not to have to roll my own native queries, so my question is whether it's possible to leverage Hibernate when working with Snowflake.
The main thing I want to be able to do is persist entities using the JpaRepository's built-in save method.
Below is my current configuration. Any ideas on what could be improved in the configuration to get this working would be appreciated, as would any opinion on whether it is possible at all.
spring:
  profiles:
    active: local
  application:
    name: Service
  datasource:
    driverClassName: net.snowflake.client.jdbc.SnowflakeDriver
    url: ${SPRING_DATASOURCE_URL}
    username: ${SPRING_DATASOURCE_USERNAME}
    password: ${SPRING_DATASOURCE_PASSWORD}
  flyway:
    locations: classpath:db/migration/common,classpath:db/migration/snowflake
  jpa:
    properties:
      hibernate:
        dialect: org.hibernate.dialect.SQLServerDialect
        order_inserts: true
create sequence award_event_id_seq;

create table award_event
(
    id                  INT NOT NULL DEFAULT award_event_id_seq.nextval PRIMARY KEY,
    event_source_system VARCHAR NOT NULL,
    event_trigger       VARCHAR NOT NULL,
    event_triggered_by  VARCHAR NOT NULL,
    event_timestamp     TIMESTAMP NOT NULL
)
@Entity(name = "award_event")
@SequenceGenerator(name = "award_event_id_seq", sequenceName = "award_event_id_seq", allocationSize = 1)
data class AwardEvent(
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    val id: Int = -1,
    val eventTrigger: String,
    val eventTriggeredBy: String,
    val eventTimestamp: LocalDateTime,
    val eventSourceSystem: String
)
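For reference, the repository and message mapping used in the listener below are not shown in the post. A minimal sketch of what they are assumed to look like (the shapes of AwardEventRepository, AwardEventMessage and toAwardEvent here are assumptions, and the message.toObject() deserialization helper is omitted):

import java.time.LocalDateTime
import org.springframework.data.jpa.repository.JpaRepository

// Hypothetical reconstruction: a plain Spring Data JPA repository keyed by the Int id.
interface AwardEventRepository : JpaRepository<AwardEvent, Int>

// Hypothetical incoming message shape, mapped onto the entity before persisting.
data class AwardEventMessage(
    val eventTrigger: String,
    val eventTriggeredBy: String,
    val eventTimestamp: LocalDateTime,
    val eventSourceSystem: String
) {
    fun toAwardEvent() = AwardEvent(
        eventTrigger = eventTrigger,
        eventTriggeredBy = eventTriggeredBy,
        eventTimestamp = eventTimestamp,
        eventSourceSystem = eventSourceSystem
    )
}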
override fun receiveMessage(message: String) {
    logger.info("Receiving award event: $message")
    val awardEvent: AwardEventMessage = message.toObject()
    // This save call does not work and throws the error shown below
    awardEventRepository.save(awardEvent.toAwardEvent())
}
2021-01-08 10:49:28.163 ERROR 3239 --- [nio-9106-exec-1] o.hibernate.id.enhanced.TableStructure : could not read a hi value
net.snowflake.client.jdbc.SnowflakeSQLException: SQL compilation error:
syntax error line 1 at position 50 unexpected 'with'.
syntax error line 1 at position 72 unexpected ')'.
at net.snowflake.client.jdbc.SnowflakeUtil.checkErrorAndThrowExceptionSub(SnowflakeUtil.java:124)
at net.snowflake.client.jdbc.SnowflakeUtil.checkErrorAndThrowException(SnowflakeUtil.java:64)
at net.snowflake.client.core.StmtUtil.pollForOutput(StmtUtil.java:434)
at net.snowflake.client.core.StmtUtil.execute(StmtUtil.java:338)
at net.snowflake.client.core.SFStatement.executeHelper(SFStatement.java:506)
at net.snowflake.client.core.SFStatement.executeQueryInternal(SFStatement.java:233)
at net.snowflake.client.core.SFStatement.executeQuery(SFStatement.java:171)
at net.snowflake.client.core.SFStatement.execute(SFStatement.java:754)
at net.snowflake.client.jdbc.SnowflakeStatementV1.executeQueryInternal(SnowflakeStatementV1.java:245)
at net.snowflake.client.jdbc.SnowflakePreparedStatementV1.executeQuery(SnowflakePreparedStatementV1.java:117)
Just as a follow-up, I was unable to get the application up and running using the approach I outlined above. I am still unsure why, but I think it may have been due to a lack of support for Snowflake sequences as the generation type for the primary key in Spring.
I changed the generation type to UUID and the application then started to work as expected. There was no requirement for what type of primary key was needed, so this approach was satisfactory.
create sequence award_event_id_seq;

create table award_event
(
    id                  varchar not null constraint award_event_pkey primary key,
    event_source_system varchar not null,
    event_trigger       varchar not null,
    event_triggered_by  varchar not null,
    event_timestamp     timestamp not null
)
@Entity(name = "award_event")
data class AwardEvent(
    @Id
    @GeneratedValue
    @Type(type = "uuid-char")
    val id: UUID = UUID.randomUUID(),
    val eventTrigger: String,
    val eventTriggeredBy: String,
    val eventTimestamp: LocalDateTime,
    val eventSourceSystem: String
)
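One detail that follows from this change: the repository's ID type parameter has to match the new key type. A minimal sketch, assuming a plain Spring Data JPA repository (the interface name is an assumption):

import java.util.UUID
import org.springframework.data.jpa.repository.JpaRepository

// The second type parameter is now UUID to match the entity's new primary key type.
interface AwardEventRepository : JpaRepository<AwardEvent, UUID>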
Related
I'm trying to use Hibernate Envers.
I have tables annotated with @Audited, but a problem occurs with hibernate_sequence:
CREATE SEQUENCE IF NOT EXISTS custom_schema.hibernate_sequence START 1 INCREMENT 1;
So I'm using custom_schema, and the problem, obviously, is:
ERROR: relation "hibernate_sequence" does not exist
So I'm asking whether it is possible to somehow tell Hibernate which schema this sequence is in.
OK, solved by creating my own revision entity:
@RevisionEntity
@Entity
@Table(name = "revinfo", schema = "custom_schema")
class RevInfo(
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @RevisionNumber
    var id: Long? = null,

    @RevisionTimestamp
    var timestamp: Long = 0
)
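If you would rather keep sequence-based revision numbers instead of switching to IDENTITY, it should also be possible to point the generator at the sequence in the custom schema explicitly, since javax.persistence.SequenceGenerator has a schema attribute. A rough, untested sketch (the generator name revinfo_seq is an assumption):

import javax.persistence.Entity
import javax.persistence.GeneratedValue
import javax.persistence.GenerationType
import javax.persistence.Id
import javax.persistence.SequenceGenerator
import javax.persistence.Table
import org.hibernate.envers.RevisionEntity
import org.hibernate.envers.RevisionNumber
import org.hibernate.envers.RevisionTimestamp

@RevisionEntity
@Entity
@Table(name = "revinfo", schema = "custom_schema")
class RevInfo(
    @Id
    @RevisionNumber
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "revinfo_seq")
    // Qualify the sequence with its schema so Hibernate stops looking for it on the default search path.
    @SequenceGenerator(name = "revinfo_seq", sequenceName = "hibernate_sequence", schema = "custom_schema", allocationSize = 1)
    var id: Long? = null,

    @RevisionTimestamp
    var timestamp: Long = 0
)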
Given the following entity/table defined in a Spring/Kotlin Coroutines project:
#Table("workers")
data class Worker(
#Id
val id: UUID? = null,
#Column(value = "photo")
var photo: String? = null,
// see: https://github.com/spring-projects/spring-data-r2dbc/issues/449
#Transient
#Value("#{root.photo!=null}")
val hasPhoto: Boolean = false
)
The hasPhoto field does not map to a table column. I followed the R2DBC official reference documentation and used a Spring EL expression to evaluate the value of this field.
But when I test hasPhoto, it always returns false, even if I set photo to a non-null string.
I got an answer from the Spring team: @Value here can only access the projection.
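Based on that answer, the SpEL expression would have to live on a projection rather than on the Worker entity itself. A rough, untested sketch of what an interface projection might look like (the projection name and repository method are assumptions; in projection SpEL the backing object is referred to as target):

import java.util.UUID
import org.springframework.beans.factory.annotation.Value
import org.springframework.data.repository.reactive.ReactiveCrudRepository
import reactor.core.publisher.Mono

// Interface-based projection over Worker; hasPhoto is derived from the backing entity via SpEL.
interface WorkerWithPhotoFlag {
    val id: UUID?
    val photo: String?

    @get:Value("#{target.photo != null}")
    val hasPhoto: Boolean
}

interface WorkerRepository : ReactiveCrudRepository<Worker, UUID> {
    // Derived query returning the projection instead of the full entity.
    fun findWorkerById(id: UUID): Mono<WorkerWithPhotoFlag>
}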
I have a small project for tinkering with Spring, where I have two entities with a one-to-many association: 1 Restaurant -> N Dishes.
I have the following PostgreSQL schema for that:
create table if not exists restaurants (
    restaurant_id uuid primary key,
    name varchar(512) not null,
    description varchar(1024) not null,
    address varchar(512) not null,
    photo_url varchar(1024)
);

create table if not exists dishes (
    dish_id uuid primary key,
    name varchar(512) not null,
    description varchar(1024),
    photo_url varchar(1024),
    restaurant_id uuid references restaurants(restaurant_id) not null,
    price int not null check (price > 0)
);
With the following JPA Entities:
@Entity
@Table(name = "restaurants")
class Restaurants(
    @Id
    var restaurantId: UUID,
    var name: String,
    var description: String,
    var photoUrl: String?,
) {
    @OneToMany(mappedBy = "restaurant")
    @JoinColumn(name = "restaurant_id", nullable = false)
    var dishes: MutableList<Dishes> = mutableListOf()
}
@Entity
@Table(name = "dishes")
class Dishes(
    @Id
    var dishId: UUID,
    var name: String,
    var description: String,
    var photoUrl: String?,
    var price: Int,

    @ManyToOne(optional = false)
    @JoinColumn(name = "restaurant_id", nullable = false)
    var restaurant: Restaurants
)
I have defined a RestaurantsRepository as follows:
interface RestaurantsRepository : R2dbcRepository<Restaurants, UUID> {
    fun findByRestaurantId(restaurantId: UUID): Mono<Restaurants>
}
The problem I'm having is that when I call findByRestaurantId, I get the following exception:
org.springframework.r2dbc.BadSqlGrammarException: executeMany; bad SQL grammar [SELECT restaurants.restaurant_id, restaurants.name, restaurants.description, restaurants.photo_url, restaurants.dishes FROM restaurants WHERE restaurants.restaurant_id = $1]; nested exception is io.r2dbc.postgresql.ExceptionFactory$PostgresqlBadGrammarException: [42703] column restaurants.dishes does not exist
at org.springframework.r2dbc.connection.ConnectionFactoryUtils.convertR2dbcException(ConnectionFactoryUtils.java:235) ~[spring-r2dbc-5.3.21.jar:5.3.21]
Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Why is the @OneToMany field included in the SQL query?
You are trying to use Spring Data R2DBC (R2dbcRepository) in conjunction with JPA annotations. It won't work: these are two different technologies. R2DBC does not support @ManyToOne nor @JoinColumn, so the annotations are simply ignored.
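To illustrate the difference, here is a rough sketch of how the association could be modelled idiomatically with Spring Data R2DBC instead: drop the JPA association annotations, keep the foreign key as a plain column, and load the dishes with a second query (the DishesRepository interface and its method are assumptions, not part of the original post):

import java.util.UUID
import org.springframework.data.annotation.Id
import org.springframework.data.r2dbc.repository.R2dbcRepository
import org.springframework.data.relational.core.mapping.Table
import reactor.core.publisher.Flux

@Table("dishes")
data class Dishes(
    @Id
    val dishId: UUID? = null,
    val name: String,
    val description: String?,
    val photoUrl: String?,
    val price: Int,
    // Plain foreign-key column; R2DBC has no @ManyToOne, so the reference stays a simple value.
    val restaurantId: UUID
)

interface DishesRepository : R2dbcRepository<Dishes, UUID> {
    // Fetch the association explicitly instead of relying on a mapped relation.
    fun findByRestaurantId(restaurantId: UUID): Flux<Dishes>
}

The Restaurants entity would drop the dishes field in the same way, and the two results can then be combined in the service layer, for example by zipping restaurantsRepository.findByRestaurantId(id) with dishesRepository.findByRestaurantId(id).collectList().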
I am trying to use Room in Android Studio, using Kotlin, with a pre-packaged database. The database does not set NOT NULL on one column. Using DB SQL Browser, it shows that the column has these properties:
"Reference" TEXT
There is no NOT NULL. All the other columns in the table do have NOT NULL set.
In the Entity that maps that table I have:
@Entity
data class Meaning(
    @PrimaryKey(autoGenerate = true) @ColumnInfo(name = "Id") val id: Int,
    @NonNull @ColumnInfo(name = "Contents") val contents: String,
    /*
     * It's OK for Reference to be Null
     */
    @ColumnInfo(name = "Reference") val reference: String,
    @NonNull @ColumnInfo(name = "SymbolId") val symbolId: Int,
    @NonNull @ColumnInfo(name = "Local") val local: Int
)
It builds and installs, but fails when running with this error:
java.lang.IllegalStateException: Pre-packaged database has an invalid schema: Meaning(<stuff>.Meaning).
Expected:
TableInfo{name='Meaning', <unimportant columns>, Reference=Column{name='Reference', type='TEXT', affinity='2', notNull=true, primaryKeyPosition=0, defaultValue='null'},<more unimportant columns>}
Found:
TableInfo{name='Meaning', <unimportant columns>, Reference=Column{name='Reference', type='TEXT', affinity='2', notNull=false, primaryKeyPosition=0, defaultValue='null'}, <more unimportant columns}
Note that the Found has notNull=false, which seems correct because NOT NULL is not specified in the database.
The Expected has notNull=true even though, in the Entity, @NonNull was not specified for the Reference column.
So, I am confused why the Entity is expecting Reference column to be notNull=true.
Any pointers are welcome.
@ColumnInfo(name = "Reference") val reference: String?,
would equate to notNull = false, i.e. the ? indicates that null is allowed.

So, I am confused why the Entity is expecting Reference column to be notNull=true.

Because the annotation processing sees String, which cannot be null. Only if it sees String? can the value be nullable.
That is, if you use:
@Entity
data class Meaning(
    @PrimaryKey(autoGenerate = true) @ColumnInfo(name = "Id") val id: Int,
    @NonNull @ColumnInfo(name = "Contents") val contents: String,
    /*
     * It's OK for Reference to be Null
     */
    @ColumnInfo(name = "Reference") val reference: String?,
    @NonNull @ColumnInfo(name = "SymbolId") val symbolId: Int,
    @NonNull @ColumnInfo(name = "Local") val local: Int
)
and then compile, the generated Java for the @Database-annotated class (suffixed with _Impl), which defines the expected schema, includes:
_db.execSQL("CREATE TABLE IF NOT EXISTS `Meaning` (`Id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `Contents` TEXT NOT NULL, `Reference` TEXT, `SymbolId` INTEGER NOT NULL, `Local` INTEGER NOT NULL)");
i.e.
, `Reference` TEXT,
However, without the ?, it is:
_db.execSQL("CREATE TABLE IF NOT EXISTS `Meaning` (`Id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `Contents` TEXT NOT NULL, `Reference` TEXT NOT NULL, `SymbolId` INTEGER NOT NULL, `Local` INTEGER NOT NULL)");
i.e.
`Reference` TEXT NOT NULL,
I am trying to use Spring Data JDBC for my PostgreSQL database. I defined the following beans
@Data
class Report {
    @Id
    private Long id;
    private String name;
    private Set<Dimension> dimensions;
}

@Data
class Dimension {
    private String name;
    private Long[] filterIds;
}
and the corresponding DDL
CREATE TABLE report (
    id bigserial PRIMARY KEY,
    name text NOT NULL
);

CREATE TABLE dimension (
    id bigserial PRIMARY KEY,
    report bigint,
    name text,
    filter_ids bigint[],
    FOREIGN KEY (report) REFERENCES report(id) ON DELETE CASCADE ON UPDATE CASCADE
);
Then I tried to insert a report
final Dimension dimension = new Dimension();
dimension.setName("xyz");
dimension.setFilterIds(new Long[]{ 1L, 2L, 3L });
final Report report = new Report();
report.setName("xyz");
report.setDimensions(Collections.singleton(dimension));
repository.save(report);
where repository is simply a CrudRepository<Report, Long>.
This gave me the following error
org.postgresql.util.PSQLException: ERROR: column "filter_ids" is of type bigint[] but expression is of type bigint
Hint: You will need to rewrite or cast the expression.
Position: 116
Can I somehow tell Spring Data JDBC how to map the array types?
With the release of Spring Data JDBC 1.1.0, this became possible. See the documentation here:
The properties of the following types are currently supported:
All primitive types and their boxed types (int, float, Integer, Float, and so on)
Enums get mapped to their name.
String
java.util.Date, java.time.LocalDate, java.time.LocalDateTime, and java.time.LocalTime
Arrays and Collections of the types mentioned above can be mapped to columns of array type if your database supports that.
...
As P44T answered, this should work from version 1.1 of Spring Data JDBC onwards, just as you used it.
Original answer
It is currently not possible. There are issues for this. A starting point is this one: https://jira.spring.io/browse/DATAJDBC-259