Hibernate diffChangeLog producing change sets that exist already in the table - spring

I am using Spring Boot with Hibernate 5.3.8. After running my first schema I generate a diffChangeLog, and it's dropping and recreating the same index. When I look at the collection_days table, the common_collection_search index already exists.
How can I mark it as completed so it no longer appears in the diffChangeLog? I only want to generate a changelog with the newest changes so I can run it and update the database.
collection_days table
index_name|index_algorithm|is_unique|column_name|condition
common_collection_search|BTREE|f|waste_generator_id,refuse_type|NULL
This is from my master schema that I ran on a fresh database
- changeSet:
    id: 1572649828026-46
    author: comp (generated)
    changes:
      - createIndex:
          columns:
            - column:
                name: wg_id
            - column:
                name: type
          indexName: common_collection_search
          tableName: collection_days
diffChangeLog results
- changeSet:
    id: 1615578168770-16
    author: comp (generated)
    changes:
      - dropIndex:
          indexName: common_collection_search
          tableName: collection_days
- changeSet:
    id: 1615578168770-17
    author: comp (generated)
    changes:
      - createIndex:
          columns:
            - column:
                name: wg_id
            - column:
                name: type
          indexName: common_collection_search
          tableName: collection_days
CollectionDay entity
@Data
@Entity
@Table(name = "collection_days", uniqueConstraints = {
        @UniqueConstraint(columnNames = { "wg_id", "type", "day_of_week" }) }, indexes = {
        @Index(name = "common_collection_search", columnList = "wg_id,type") })
public class CollectionDay implements Serializable {

    @Column(name = "wg_id", updatable = false, insertable = false)
    private Long wgId;

    @Column(name = "type")
    @Enumerated(EnumType.STRING)
    private ESEnums.type type;
build.gradle
compile "org.hibernate:hibernate-core"
compile "org.hibernate:hibernate-entitymanager"
compile "org.hibernate:hibernate-envers"
compile "org.springframework.boot:spring-boot-starter-data-jpa"
compile "com.zaxxer:HikariCP"
compile "org.hibernate:hibernate-ehcache"
compile 'com.fasterxml.jackson.datatype:jackson-datatype-hibernate5'
compile "org.postgresql:postgresql"
compile "org.liquibase:liquibase-core"
compile "org.hibernate:hibernate-validator:6.1.6.Final"
liquibaseRuntime 'org.liquibase.ext:liquibase-hibernate5:3.8'
liquibaseRuntime sourceSets.main.output
liquibaseRuntime sourceSets.main.compileClasspath
liquibase {
    activities {
        main {
            driver 'org.postgresql.Driver'
            url 'jdbc:postgresql://localhost:54323/db2?stringtype=unspecified'
            username ''
            password ''
            changeLogFile "src/main/resources/liquibase/migrations/changelog.yaml"
            classpath 'src/main/java'
            referenceUrl 'hibernate:spring:com.project.domain.entities?dialect=org.hibernate.dialect.PostgreSQL82Dialect&hibernate.physical_naming_strategy=org.springframework.boot.orm.jpa.hibernate.SpringPhysicalNamingStrategy&hibernate.implicit_naming_strategy=org.springframework.boot.orm.jpa.hibernate.SpringImplicitNamingStrategy'
        }
    }
}

I think the problem is that PostgreSQL is smart enough to realize the index is unnecessary and doesn't create it the way you expect. Your unique constraint already covers the columns of the index, and since the constraint is itself backed by an index, PostgreSQL effectively won't create your index because it is already covered.
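You can see the effect described above on a scratch PostgreSQL database: a unique constraint is enforced by a backing unique btree index of its own (a sketch; the table and constraint names here are illustrative, not from the project):

```sql
-- A UNIQUE constraint is implemented by a unique btree index,
-- so a separate index on a leading subset of the same columns
-- adds little for lookups.
CREATE TABLE collection_days_demo (
    wg_id       bigint,
    type        text,
    day_of_week integer,
    CONSTRAINT uq_demo UNIQUE (wg_id, type, day_of_week)
);
-- In psql, \d collection_days_demo lists under "Indexes:":
--   "uq_demo" UNIQUE CONSTRAINT, btree (wg_id, type, day_of_week)
```

Separately, if the goal is just to mark existing change sets as executed so that a later update skips them, Liquibase's changelogSync command records them in DATABASECHANGELOG without running them.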


How to connect to remote oracle database using typeorm in nestjs?

I was wondering how to connect to a remote Oracle database from NestJS using TypeORM.
I installed the TypeORM and Oracle packages using the following commands:
npm i --save @nestjs/typeorm typeorm oracle
npm install oracledb --save
Then I tried configuring it in app.module.ts using TypeOrmModule.forRoot, but it was not successful.
Here are my configuration settings.
TypeOrmModule.forRoot({
  type: 'oracle',
  host: 'ip of hostname',
  port: port number,
  username: 'username',
  password: 'password',
  serviceName: 'servicename',
  synchronize: false,
  entities: []
})
Can anybody help me figure out what I am missing? I would also like to know how I can execute a query once this connection is successful. An example would be helpful.
Got it.
The one missing thing was the database name.
I added
database: 'databasename'
to the above configuration and it worked.
But my question remains: how do I use this connection in a service to fetch/push data from/to the Oracle database?
If you provide a name in your connection details you should be able to refer to the database connection using that. Otherwise, if no name is provided I believe it assigns it the name 'default'.
Basically these are the steps you should perform to use the database connection: (examples below each)
Create a model - this is how TypeORM knows to create a table.
export class Photo {
  id: number
  name: string
  description: string
  filename: string
  views: number
  isPublished: boolean
}
Create an Entity - this should match your model, with the appropriate decorators. At a minimum you should have the @Entity() decorator before your class definition and @Column() before each field.
import { Entity, Column } from "typeorm"

@Entity()
export class Photo {
  @Column()
  id: number

  @Column()
  name: string

  @Column()
  description: string

  @Column()
  filename: string

  @Column()
  views: number

  @Column()
  isPublished: boolean
}
Create your data source - looks like you have already done this. But I would give it a name field and you will need to pass your entities into the entity array you have.
const AppDataSource = new DataSource({
  type: "postgres",
  name: "photos",
  host: "localhost",
  port: 5432,
  username: "root",
  password: "admin",
  database: "test",
  entities: [Photo],
  synchronize: true,
  logging: false,
})
Then you can use repositories to manage data in the database:
const photo = new Photo()
photo.name = "Me and Bears"
photo.description = "I am near polar bears"
photo.filename = "photo-with-bears.jpg"
photo.views = 1
photo.isPublished = true
const photoRepository = AppDataSource.getRepository(Photo)
await photoRepository.save(photo)
console.log("Photo has been saved")
const savedPhotos = await photoRepository.find()
console.log("All photos from the db: ", savedPhotos)
For more details I would spend some time reading through the typeORM website, all the examples I pulled were from there:
https://typeorm.io/
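To answer the follow-up about using the connection inside a NestJS service: the usual pattern is to register the entity with TypeOrmModule.forFeature and inject its repository into the service. A minimal sketch, assuming the Photo entity above (the module, service, and file names here are illustrative):

```typescript
import { Injectable, Module } from '@nestjs/common';
import { InjectRepository, TypeOrmModule } from '@nestjs/typeorm';
import { Repository } from 'typeorm';
import { Photo } from './photo.entity'; // the entity defined above

@Injectable()
export class PhotoService {
  constructor(
    // Injects the repository bound to this module's connection
    @InjectRepository(Photo)
    private readonly photoRepository: Repository<Photo>,
  ) {}

  findAll(): Promise<Photo[]> {
    return this.photoRepository.find();
  }

  create(photo: Photo): Promise<Photo> {
    return this.photoRepository.save(photo);
  }
}

@Module({
  imports: [TypeOrmModule.forFeature([Photo])], // registers Photo's repository
  providers: [PhotoService],
})
export class PhotoModule {}
```

If the connection was given a name, pass it as the second argument to both calls, e.g. TypeOrmModule.forFeature([Photo], 'photos') and @InjectRepository(Photo, 'photos'); otherwise the 'default' connection is used.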

OpenApi - Is there a way to have a ComposedSchema with a discriminator part in a contract generated with springdoc-openapi-maven-plugin?

I have a sample SpringBoot API with the following features:
1 controller that exposes a single endpoint invokable with a GET request and that returns a custom class (ContainerClass in my example)
ContainerClass contains a property of type List&lt;ParentClass&gt;
ParentClass is an abstract class that has 2 sub-classes: ChildA and ChildB
I try to generate an OpenApi contract from this API with springdoc-openapi-maven-plugin.
In my pom.xml, I have the following elements:
SpringBoot version: 2.2.6
org.springdoc:springdoc-openapi-ui:1.4.1
org.springdoc:springdoc-openapi-maven-plugin:1.0
Here are the classes I generate the schema from.
import io.swagger.v3.oas.annotations.media.ArraySchema;
import io.swagger.v3.oas.annotations.media.Schema;

public class ContainerClass {

    @ArraySchema(
            arraySchema = @Schema(discriminatorProperty = "classType"),
            schema = @Schema(implementation = ParentClass.class)
    )
    public List<ParentClass> elements;
    // + Getter/Setter
}
@JsonTypeInfo(
        use = JsonTypeInfo.Id.NAME,
        include = JsonTypeInfo.As.EXISTING_PROPERTY,
        property = "classType",
        defaultImpl = ParentClass.class,
        visible = true)
@JsonSubTypes({
        @JsonSubTypes.Type(value = ChildA.class, name = "CHILD_A"),
        @JsonSubTypes.Type(value = ChildB.class, name = "CHILD_B")})
@Schema(
        description = "Parent description",
        discriminatorProperty = "classType",
        discriminatorMapping = {
                @DiscriminatorMapping(value = "CHILD_A", schema = ChildA.class),
                @DiscriminatorMapping(value = "CHILD_B", schema = ChildB.class)
        }
)
public abstract class ParentClass {
    public String classType;
    // + Getter/Setter
}

@io.swagger.v3.oas.annotations.media.Schema(description = " Child A", allOf = ParentClass.class)
public class ChildA extends ParentClass {
}

@io.swagger.v3.oas.annotations.media.Schema(description = " Child B", allOf = ParentClass.class)
public class ChildB extends ParentClass {
}
When I run springdoc-openapi-maven-plugin, I get the following contract file.
openapi: 3.0.1
info:
  title: OpenAPI definition
  version: v0
servers:
  - url: http://localhost:8080
    description: Generated server url
paths:
  /container:
    get:
      tags:
        - hello-controller
      operationId: listElements
      responses:
        "200":
          description: OK
          content:
            '*/*':
              schema:
                $ref: '#/components/schemas/ContainerClass'
components:
  schemas:
    ChildA:
      type: object
      description: ' Child A'
      allOf:
        - $ref: '#/components/schemas/ParentClass'
    ChildB:
      type: object
      description: ' Child B'
      allOf:
        - $ref: '#/components/schemas/ParentClass'
    ContainerClass:
      type: object
      properties:
        elements:
          type: array
          description: array schema description
          items:
            oneOf:
              - $ref: '#/components/schemas/ChildA'
              - $ref: '#/components/schemas/ChildB'
    ParentClass:
      type: object
      properties:
        classType:
          type: string
      description: Parent description
      discriminator:
        propertyName: classType
        mapping:
          CHILD_A: '#/components/schemas/ChildA'
          CHILD_B: '#/components/schemas/ChildB'
In my context, in order to avoid any breaking change for existing consumers, I need the items property in the ContainerClass schema to contain the discriminator part that is currently in the ParentClass schema, like this:
ContainerClass:
  type: object
  properties:
    elements:
      type: array
      description: array schema description
      items:
        discriminator:
          propertyName: classType
          mapping:
            CHILD_A: '#/components/schemas/ChildA'
            CHILD_B: '#/components/schemas/ChildB'
        oneOf:
          - $ref: '#/components/schemas/ChildA'
          - $ref: '#/components/schemas/ChildB'
I have not managed to do this by setting properties in the annotations, and when I debug the code of io.swagger.v3.core.jackson.ModelResolver I cannot find a way to do it either.
So far I have not found a code example that helps me.
Is there a way to make a ComposedSchema (the array contained in ContainerClass, in my case) include a discriminator part in the contract generated by the springdoc-openapi-maven-plugin execution?
This is the default generation structure, handled directly by swagger-api (and not by springdoc-openapi).
The generated OpenAPI description looks correct.
With springdoc-openapi, you can define an OpenApiCustomiser Bean, where you can change the elements of the components element defined on the OpenAPI level:
https://springdoc.org/faq.html#how-can-i-customise-the-openapi-object-
Here is my solution by defining an OpenApiCustomiser Bean:
@Bean
public OpenApiCustomiser myCustomiser() {
    Map<String, String> classTypeMapping = Map.ofEntries(
            new AbstractMap.SimpleEntry<String, String>("CHILD_A", "#/components/schemas/ChildA"),
            new AbstractMap.SimpleEntry<String, String>("CHILD_B", "#/components/schemas/ChildB")
    );
    Discriminator classTypeDiscriminator = new Discriminator().propertyName("classType")
            .mapping(classTypeMapping);
    return openApi -> openApi.getComponents().getSchemas().values()
            .stream()
            .filter(schema -> "ContainerClass".equals(schema.getName()))
            .map(schema -> schema.getProperties().get("elements"))
            .forEach(arraySchema -> ((ArraySchema) arraySchema).getItems().discriminator(classTypeDiscriminator));
}
I get the expected result in my contract file.

Liquibase Groovy: accessing databaseChangeLog properties

I have the following Liquibase script written in Groovy.
package data.db
databaseChangeLog {
    // H2
    property(name: "date", value: "DATETIME", dbms: "h2")
    property(name: "integer", value: "INTEGER", dbms: "h2")
    property(name: "bigint", value: "BIGINT", dbms: "h2")
    property(name: "current_date", value: "NOW()", dbms: "h2")
    property(name: "current_timestamp", value: "NOW()", dbms: "h2")

    // TABLES
    include(file: "tables/2017-06-22-001-user-account-tables.groovy", relativeToChangelogFile: true)
}
I am using Gradle for the build and I included compile "org.liquibase:liquibase-groovy-dsl:1.2.2", so the script itself works.
However, I don't know how I can access these databaseChangeLog properties inside the script. I couldn't find any documentation or examples on how to do it.
Using XML it is pretty straightforward; the documentation has an example.
How do I do this using Groovy?
Ok, I found a solution that works, but seems a bit unwieldy. Please recommend something better if there is such a thing:
final DatabaseChangeLog dcl = (DatabaseChangeLog) properties['databaseChangeLog'];
final String bigintType = dcl.changeLogParameters.getValue("bigint", dcl)
Here is a bit of context:
package data.db.tables

import liquibase.changelog.DatabaseChangeLog

databaseChangeLog {
    final DatabaseChangeLog dcl = (DatabaseChangeLog) properties['databaseChangeLog']
    final String bigintType = dcl.changeLogParameters.getValue("bigint", dcl)

    changeSet(id: "2017-06-22-001-user-account-tables", author: "goranmrzljak") {
        comment("User account tables")
        createTable(tableName: "user_account_permission") {
            column(name: "id", type: bigintType) {
                constraints(primaryKey: true, primaryKeyName: "user_account_permission_pk")
            }
            // ...
        }
        // ...
    }
}
This works whether the properties were defined in the same file/databaseChangeLog or in a different one.
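As an alternative to reaching into changeLogParameters directly, the XML-style placeholder syntax may also work in the Groovy DSL, since Liquibase expands ${...} tokens in change attribute values. A hedged sketch (I have not verified this against liquibase-groovy-dsl 1.2.2; the changeSet id and table name are illustrative):

```groovy
// Single quotes keep ${bigint} out of Groovy's own GString
// interpolation, so the literal token reaches Liquibase, which
// substitutes the changelog property as it does in XML changelogs.
changeSet(id: "2017-06-22-002-example", author: "goranmrzljak") {
    createTable(tableName: "example_table") {
        column(name: "id", type: '${bigint}') {
            constraints(primaryKey: true, primaryKeyName: "example_table_pk")
        }
    }
}
```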

Internationalization for user defined input

I'm trying to build localization for user-defined input. Example: a user could define categories like soccer, but in several languages.
The model entity could have a field something like this:
@ManyToMany
@MapKeyColumn(name = "locale", insertable = false, updatable = false)
public Map<String, L18n> titles;
I'd like to store the localized strings like this:
@Entity
public class L18n {

    @Id
    @Constraints.Required
    @Formats.NonEmpty
    public Integer id;

    public String key;
    public String locale;

    @Column(columnDefinition = "TEXT")
    public String text;
}
We use YAML to store the test data:
category:
  - !!models.Category
    ...
    titles:
      - !!models.L18n
        key: soccer

l18n:
  - !!models.L18n
    key: soccer
    locale: de-CH
    text: fdfdfsee
  - !!models.L18n
    key: soccer
    locale: fr-CH
    text: dlfkjsdlfj
With this solution I'm getting this error:
[error] Caused by: org.yaml.snakeyaml.error.YAMLException: No suitable constructor with 1 arguments found for interface java.util.Map
[error] at org.yaml.snakeyaml.constructor.Constructor$ConstructSequence.construct(Constructor.java:574)
[error] at org.yaml.snakeyaml.constructor.BaseConstructor.constructObject(BaseConstructor.java:182)
[error] at org.yaml.snakeyaml.constructor.Constructor$ConstructMapping.constructJavaBean2ndStep(Constructor.java:296)
[error] ... 65 more
I think the problem is that you've defined titles as a map but don't use the YAML mapping syntax to define it (see YAML Collections in the spec).
I think the syntax for titles should look like the YAML Dictionary Example there:
titles:
  soccer:
    - !!models.L18n
      key: soccer
      locale: de-CH
      text: fdfdfsee
  football:
    - !!models.L18n
      ...
If you change the format of your .yml file so that the L18n instances are defined first and referenced afterwards (the YAML spec calls these Alias Nodes), you could make your titles look a lot more streamlined:
l18n:
  - &soccerDE !!models.L18n
    key: soccer
    locale: de-CH
    text: fdfdfsee
  - &soccerFR !!models.L18n
    key: soccer
    locale: fr-CH
    text: dlfkjsdlfj

category:
  - !!models.Category
    ...
    titles:
      soccerDE: *soccerDE
      soccerFR: *soccerFR

ValidationException on Update: Validation error whilst flushing entity on AbstractPersistenceEventListener

In my environment, I have grails.gorm.failOnError = true in Config.groovy.
package org.example

class Book {
    String title
    String author
    String email

    static constraints = {
        title nullable: false, blank: false
        email nullable: false, blank: false, unique: true // apparently this is the problem..
    }
}
And in the controller I have:
package org.example

class BookController {

    def update() {
        def bookInstance = Book.get(params.id)
        if (!bookInstance) {
            flash.message = message(code: 'default.not.found.message', args: [message(code: 'book.label', default: 'Book'), params.id])
            redirect(action: "list")
            return
        }

        if (params.version) {
            def version = params.version.toLong()
            if (bookInstance.version > version) {
                bookInstance.errors.rejectValue("version", "default.optimistic.locking.failure",
                        [message(code: 'book.label', default: 'Book')] as Object[],
                        "Another user has updated this Book while you were editing")
                render(view: "edit", model: [bookInstance: bookInstance])
                return
            }
        }

        bookInstance.properties = params
        bookInstance.validate()
        if (bookInstance.hasErrors()) {
            render(view: "edit", model: [bookInstance: bookInstance])
        } else {
            bookInstance.save(flush: true)
            flash.message = message(code: 'default.updated.message', args: [message(code: 'book.label', default: 'Book'), bookInstance.id])
            redirect(action: "show", id: bookInstance.id)
        }
    }
}
Saving works fine. But when updating without setting the title field, I get:
Message: Validation error whilst flushing entity [org.example.Book]:
- Field error in object 'org.example.Book' on field 'title': rejected value []; codes [org.example.Book.title.blank.error.org.example.Book.title,org.example.Book.title.blank.error.title,org.example.Book.title.blank.error.java.lang.String,org.example.Book.title.blank.error,book.title.blank.error.org.example.Book.title,book.title.blank.error.title,book.title.blank.error.java.lang.String,book.title.blank.error,org.example.Book.title.blank.org.example.Book.title,org.example.Book.title.blank.title,org.example.Book.title.blank.java.lang.String,org.example.Book.title.blank,book.title.blank.org.example.Book.title,book.title.blank.title,book.title.blank.java.lang.String,book.title.blank,blank.org.example.Book.title,blank.title,blank.java.lang.String,blank]; arguments [title,class org.example.Book]; default message [Property [{0}] of class [{1}] cannot be blank]
Line | Method
->> 46 | onApplicationEvent in org.grails.datastore.mapping.engine.event.AbstractPersistenceEventListener
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| 895 | runTask in java.util.concurrent.ThreadPoolExecutor$Worker
| 918 | run . . . . . . . in ''
^ 680 | run in java.lang.Thread
As I understand it, the problem occurs when Hibernate flushes the session: Hibernate tries to save the object again, and then the exception is thrown...
When the object is saved again, book.validate() is called again, which runs a new query against the database to ensure the uniqueness of the email field. At that point the ValidationException is thrown.
But when I removed the unique validation from the email property, the update was performed normally.
My question is: is this behavior correct? Does Hibernate call book.save automatically?
This is the sample project, and the steps to simulate the error are:
source: https://github.com/roalcantara/grails_app_validation_exception
grails run-app
navigate to http://localhost:8080/book/book/create
create a new instance, filling in all fields
then edit this instance at http://localhost:8080/book/book/edit/1
finally, clear the 'Title' field and click Update; the exception is thrown
In my environment, this behavior has occurred on Grails versions 2.0.3 and 2.2.1.
Thanks for any help! And sorry for my poor English.
You are essentially validating twice, first with:
bookInstance.validate()
and second with:
bookInstance.save(flush: true)
When you call bookInstance.save(flush: true), it returns the instance on success and null (a falsy value) on validation failure. Grails takes advantage of this by default when a controller is generated, but it appears you have changed the controller Grails generated for some reason.
Just replace this:
bookInstance.validate()
if (bookInstance.hasErrors()) {
    render(view: "edit", model: [bookInstance: bookInstance])
} else {
    bookInstance.save(flush: true)
    flash.message = message(code: 'default.updated.message', args: [message(code: 'book.label', default: 'Book'), bookInstance.id])
    redirect(action: "show", id: bookInstance.id)
}
With this:
if (!bookInstance.save(flush: true)) {
    render(view: "edit", model: [bookInstance: bookInstance])
    return
}
