Spring Boot Hibernate always drops and creates ALL the indexes on server startup
spring.jpa.hibernate.ddl-auto = update
Hibernate: alter table product_category_1 drop index UKkqfeccp86g07ipixmg25dnfia
Hibernate: alter table product_category_1 add constraint
UKkqfeccp86g07ipixmg25dnfia unique (org_id, pr_ty_id, name)
Hibernate: alter table product_category_2 drop index UKqa7n4ip0gfa4qpg034ba7bkob
Hibernate: alter table product_category_2 add constraint UKqa7n4ip0gfa4qpg034ba7bkob unique (org_id, pr_ca1_id, name)
If your column type is LONGTEXT, MySQL cannot build the unique index (a TEXT/LONGTEXT column needs a key length), so the index is never created and Hibernate tries to recreate it on every startup.
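For example, a minimal sketch (the entity class, field types and id mapping are assumed; only the column names org_id, pr_ty_id and name come from the log above) of a mapping that triggers this, and the usual fix of bounding the column so MySQL can index it:

import javax.persistence.*;

@Entity
@Table(name = "product_category_1",
    uniqueConstraints = @UniqueConstraint(columnNames = { "org_id", "pr_ty_id", "name" }))
public class ProductCategory1 {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Column(name = "org_id")
    private Long orgId;

    @Column(name = "pr_ty_id")
    private Long prTyId;

    // If this field were mapped as LONGTEXT (e.g. @Lob or columnDefinition = "longtext"),
    // MySQL could not create the unique index and Hibernate would retry the DDL on every
    // startup. A bounded VARCHAR lets the index be created once:
    @Column(name = "name", length = 255)
    private String name;
}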
I was experiencing the same thing, where starting my application resulted in my unique constraints being dropped and re-added:
Hibernate: alter table category drop constraint if exists UK_CATEGORY_PARENT_NAME
Hibernate: alter table category add constraint UK_CATEGORY_PARENT_NAME unique (parent_id, name)
After much internet digging and debugging, I found that simply adding the following to my application properties stopped the constraints from being dropped:
spring.jpa.properties.hibernate.schema_update.unique_constraint_strategy=RECREATE_QUIETLY
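For reference, the same Hibernate property can also be set programmatically instead of in application.properties; a small sketch, assuming Spring Boot 2.x where HibernatePropertiesCustomizer is available (the class and bean names are mine):

import java.util.Map;
import org.springframework.boot.autoconfigure.orm.jpa.HibernatePropertiesCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class HibernateConfig {

    // Equivalent to spring.jpa.properties.hibernate.schema_update.unique_constraint_strategy
    @Bean
    public HibernatePropertiesCustomizer uniqueConstraintStrategy() {
        return (Map<String, Object> props) ->
            props.put("hibernate.schema_update.unique_constraint_strategy", "RECREATE_QUIETLY");
    }
}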
I observed that a few of the unique keys are dropped and created again and again with the property
spring.jpa.hibernate.ddl-auto = update
You have to remove the uniqueConstraints declared inside the @Table annotation and put the uniqueness check at column level. The following declaration makes Hibernate drop and create the unique index again every time you restart the project:
@Table(name = "XXXX",
    uniqueConstraints = { @UniqueConstraint(columnNames = { "tempUserId" }) }
)
Resolve it by adding unique = true at the column level:
@Column(unique = true)
private Long tempUserId;
and delete uniqueConstraints from the @Table annotation. This will resolve the problem.
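Putting it together, the entity ends up looking roughly like this (a sketch; only the table name XXXX and the tempUserId field come from the snippets above, the rest is assumed):

import javax.persistence.*;

@Entity
@Table(name = "XXXX")   // no uniqueConstraints element any more
public class SomeEntity {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    // uniqueness is now declared at column level, so Hibernate's update
    // no longer drops and recreates the index on every restart
    @Column(unique = true)
    private Long tempUserId;
}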
I have a class like Clazz:
@Table(
    name = "tablename",
    uniqueConstraints =
        @UniqueConstraint(
            name = "uniqueColumn_deleted_uk",
            columnNames = { "myuniquecolumn", "deleted" }
        )
)
public class Clazz {
    @Column(name = "deleted")
    private LocalDateTime deleted;
}
deleted is nullable. PostgreSQL creates a unique index like
CREATE UNIQUE INDEX uniqueColumn_date_uk ON public.tablename (short_code_3, deleted);
and it allows inserting duplicate myuniquecolumn values when deleted is NULL.
How can I prevent this?
I want no duplicates when deleted is NULL.
You should create two partial unique indexes:
create unique index on public.tablename (short_code_3, deleted) where deleted is not null;
create unique index on public.tablename (short_code_3) where deleted is null;
(I don't know how to do it in your ORM).
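Regarding the ORM part: JPA annotations offer no portable way to declare a partial index, so one option is to run the DDL yourself at application startup. A sketch (class and index names are mine) using Spring's JdbcTemplate; a schema migration tool such as Flyway or Liquibase is an equally good place for these two statements:

import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;

@Component
public class PartialIndexInitializer implements ApplicationRunner {

    private final JdbcTemplate jdbcTemplate;

    public PartialIndexInitializer(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Override
    public void run(ApplicationArguments args) {
        // "if not exists" (PostgreSQL 9.5+) keeps the startup idempotent
        jdbcTemplate.execute("create unique index if not exists tablename_deleted_uk "
                + "on public.tablename (short_code_3, deleted) where deleted is not null");
        jdbcTemplate.execute("create unique index if not exists tablename_live_uk "
                + "on public.tablename (short_code_3) where deleted is null");
    }
}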
This is not possible with a plain unique constraint because NULL is never equal to NULL.
Read more about null values in SQL: https://en.wikipedia.org/wiki/Null_(SQL)
If you want to have the deleted column in the unique index you must provide a default value for that column.
Two partial indexes like klin provided are best practice up to Postgres 14.
Postgres 15 adds NULLS NOT DISTINCT for this purpose:
CREATE UNIQUE INDEX foo_idx ON public.tbl (short_code_3, deleted) NULLS NOT DISTINCT;
See:
Create unique constraint with null columns
I am trying to use optimistic locking.
I am adding the version column to my table. How do I set the default value of the version column for existing data, or is this on the entity sufficient?
@Version
@Column(name = "VERSION")
private Long version = 0L;
The easiest way is to do this in the database.
Of course you need to add the version column anyway, something like:
alter table MyEntity add column version INT(11); -- no NOT NULL constraint here!
and then just add the first value to all entities:
update MyEntity set version = 1;
Now you can also add the NOT NULL constraint:
alter table MyEntity modify version INT(11) NOT NULL;
(I assume that you stop the application while you add the version column.)
In case of Oracle as the database, use the with values option for nullable columns:
alter table MyEntity add column version INT(11) default 0 with values
For not-null columns, the DB applies the default value to existing rows:
alter table MyEntity add column version INT(11) not null default 0
From Oracle 11g onwards, default values are retrieved from the metadata, so Oracle does not perform an update on each row to fill in the default value.
see - https://chandlerdba.com/2014/10/30/adding-not-null-columns-with-default-values/
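Once the column exists and every row has a value, optimistic locking behaves roughly as in this sketch (entityManagerFactory, MyEntity's setName property and the id value 1L are assumed, not from the question):

EntityManager em1 = entityManagerFactory.createEntityManager();
EntityManager em2 = entityManagerFactory.createEntityManager();

// both transactions load the row while it still has version = 1
em1.getTransaction().begin();
em2.getTransaction().begin();
MyEntity first = em1.find(MyEntity.class, 1L);
MyEntity second = em2.find(MyEntity.class, 1L);

first.setName("changed by tx1");
em1.getTransaction().commit();   // UPDATE ... SET version = 2 WHERE id = 1 AND version = 1

second.setName("changed by tx2");
em2.getTransaction().commit();   // WHERE ... version = 1 matches no row any more,
                                 // so this commit fails with an optimistic locking exception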
I have a Spring server which uses JPA and H2 to store data.
A database table is created using this class:
@Entity
public class parametros {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private long id;

    long idproject;

    @ElementCollection(fetch = FetchType.EAGER)
    List<String> answers = new ArrayList<String>();

    public parametros() {
    }

    ... /* gets and sets */
}
So, JPA and H2 automatically create a database with a Table="parametros" and another Table="parametros_answers" like this:
Table = "parametros"
ID | IDPROJECT
1 | 3
2 | 6
Table="parametros_answers"
PARAMETROS_ID | ANSWERS
1 | Masculine
1 | Female
1 | Other
2 | Cocacola
2 | Pepsi
So, the system is creating a foreign key in the Table="parametros_answers".
Up to this point all is OK. The problem comes when I try to EDIT a column value in the Table="parametros". The only way I know to change a column value is using an UPDATE ... SET statement. So when I try to do
UPDATE PARAMETROS SET IDPROJECT=10 WHERE IDPROJECT=3
this error appears:
Error "DELETE FROM PUBLIC.PARAMETROS WHERE ID=? AND IDPROJECT=? , cause: "org.h2.jdbc.JdbcSQLException: Referential integrity constraint violation: ""FK_NFSJ58KBBAYJ84HFJLOCV9JEQ: PUBLIC.PARAMETROS_ANSWERS FOREIGN KEY(PARAMETROS_ID) REFERENCES PUBLIC.PARAMETROS(ID) (1)""; SQL statement:
DELETE FROM PUBLIC.PARAMETROS WHERE ID=? AND EXTRA_CONF=? AND IDPROJECT=? AND MULTI_MAXIMUM=? AND MULTI_MINIMUM=? AND NOTE=? AND PERGUNTA=? AND TITLE=? AND TYPE=? [23503-187]"; SQL statement:
The exception says what the problem is: you are violating referential integrity.
All 'parametros_answers' rows have to point to a 'parametros'. You cannot delete or update 'parametros' without first deleting or updating the referencing rows in 'parametros_answers' in the same transaction.
Or remove the foreign key constraint.
Or use the EntityManager, which automatically deletes/updates these rows if you mark the collection accordingly, e.g. @Cascade(value = {CascadeType.ALL}).
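For the concrete UPDATE from the question, the last option looks roughly like this sketch (the setter name setIdproject is assumed from the /* gets and sets */ part of the entity, and entityManagerFactory is assumed to be available): load the parametros entity through JPA and let Hibernate keep the parametros_answers rows consistent.

EntityManager em = entityManagerFactory.createEntityManager();
em.getTransaction().begin();

parametros p = em
        .createQuery("select p from parametros p where p.idproject = :old", parametros.class)
        .setParameter("old", 3L)
        .getSingleResult();          // use getResultList() if several rows can match

p.setIdproject(10);                  // dirty checking issues the UPDATE on commit

em.getTransaction().commit();
em.close();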
So, I don't know if this solution is the best, but it is the only one that works for me. Maybe @highstakes' solution will work when the database is empty, but in my case it is not.
Steps I followed:
1 - Copy the parametros_answers table (the one which has the foreign key) like this:
CREATE TABLE PARAMETROS_ANSWERS_AUX AS SELECT * FROM PARAMETROS_ANSWERS
2 - Delete the parametros_answers table:
DROP TABLE PARAMETROS_ANSWERS
3 - After that, I can make the changes I need on the parametros table:
UPDATE PARAMETROS SET IDPROJECT=10 WHERE IDPROJECT=3
4 - I create parametros_answers again by copying it from the parametros_answers_aux table:
CREATE TABLE PARAMETROS_ANSWERS AS SELECT * FROM PARAMETROS_ANSWERS_AUX
5 - Finally I can delete the auxiliary table parametros_answers_aux:
DROP TABLE PARAMETROS_ANSWERS_AUX
I am working with Spring 3 and MyBatis 3.
Everything is working OK except when I want to do a cascade delete.
I've got 2 tables with a middle M-M relationship table. Something like Table1 ---> MiddleTable ---> Table2
I want to make a deletion from the middle table and after that delete the related data in Table2.
I'm using a transactional method:
@Transactional
public void relacionaReservaLibreBonoLibre(ParametrosRelacionReservaBono params) throws Exception{
ReservaBean r=rm.buscarReservaPorPK(params.getReserva());
for(BonoJson b:params.getListaBonosAdd()){
HotelBean h=hm.buscaHotelPorCodHotel(b.getHotel());
EstacionBean e=em.buscaEstacionPorEstacionYHotel(b.getEstacion(),h.getCnHotel());
DocumentoBean db=new DocumentoBean();
db.setCnEstacion(e.getCnEstacion());
db.setCnHotel(h.getCnHotel());
db.setCnTipDoc(r.getCnTipoDoc());
db.setFlLibre(true);
db.setTeDoc(b.getCodBono());
Integer docId=dm.insertaDocumento(db);
DocumentoReservaBean drb=new DocumentoReservaBean();
drb.setCnDoc(docId);
drb.setCnReserva(r.getCnReserva());
drm.insertaDocumentoReserva(drb);
}
for(BonoJson b:params.getListaBonosQuit()){
HotelBean h=hm.buscaHotelPorCodHotel(b.getHotel());
EstacionBean e=em.buscaEstacionPorEstacionYHotel(b.getEstacion(),h.getCnHotel());
ReservaDocumentoReservaBean filtro=new ReservaDocumentoReservaBean();
filtro.setTeDoc(b.getCodBono());
filtro.setCnReserva(r.getCnReserva());
filtro.setFlLibre(true);
List<ReservaDocumentoReservaBean> resPrev=rdm.getReservaDocumentos(filtro);
for(ReservaDocumentoReservaBean resPart:resPrev){
DocumentoReservaBean drb=new DocumentoReservaBean();
drb.setCnDocReserva(resPart.getCnDocReserva());
drm.eliminaDocumentoReservaPorPK(drb);
DocumentoBean db=new DocumentoBean();
db.setCnDoc(resPart.getCnDoc());
dm.eliminaDocumentoPorPK(db);
}
}
}
It works great except when it executes
dm.eliminaDocumentoPorPK(db);
It throws the constraint violation from Table2 to the middle table, which is supposed to have been deleted in
drm.eliminaDocumentoReservaPorPK(drb);
Any hint?
Thanks in advance.
There are several options:
Delete from Table2 and then delete from MiddleTable
If this is acceptable (that is, the MiddleTable entity owns the Table2 entity), then change the foreign key in the database so that rows in Table2 are deleted by cascade when a row in MiddleTable is deleted. Just add ON DELETE CASCADE to the definition of the foreign key from Table2 to MiddleTable.
Make the foreign key constraint deferred if your database supports this.
I'm using Hibernate Tools 3.2.1.GA with Spring 3.0.2. I'm trying to retrieve the id of the last row inserted into the Oracle (10g) database as follows.
Session session=NewHibernateUtil.getSessionFactory().getCurrentSession();
session.beginTransaction();
Country c=new Country();
c.setCountryId(new BigDecimal(0));
c.setCountryName(request.getParameter("txtCountryName"));
c.setCountryCode(request.getParameter("txtCountryCode"));
Zone z=(Zone) session.get(Zone.class, new BigDecimal(request.getParameter("zoneId")));
c.setZone(z);
session.save(c);
session.flush();
System.out.println(c.getCountryId());
session.getTransaction().commit();
The statement System.out.println(c.getCountryId()); is expected to print the newly inserted id after the data is flushed to the database and before the transaction is committed, but it doesn't, presumably because of the following line in the preceding code snippet.
c.setCountryId(new BigDecimal(0));
I'm not sure why this statement is required in my case (while inserting); I have seen it nowhere else. Omitting this line causes the following exception to be thrown:
org.hibernate.id.IdentifierGenerationException: ids for this class
must be manually assigned before calling save(): model.Country
Is the statement c.setCountryId(new BigDecimal(0)); really required during insertion? The primary key is generated by a sequence in the Oracle database, and because of that line, System.out.println(c.getCountryId()); always prints 0, while it is expected to print the id just inserted in the current session.
So, how can I get the last generated id in this case? Am I going about it the wrong way; is there a different way?
EDIT:
CREATE TABLE "COUNTRY"
(
"COUNTRY_ID" NUMBER(35,0) NOT NULL ENABLE,
"COUNTRY_CODE" VARCHAR2(10),
"COUNTRY_NAME" VARCHAR2(50),
"ZONE_ID" NUMBER(35,0),
CONSTRAINT "COUNTRY_PK" PRIMARY KEY ("COUNTRY_ID") ENABLE,
CONSTRAINT "COUNTRY_FK" FOREIGN KEY ("ZONE_ID")
REFERENCES "ZONE" ("ZONE_ID") ON DELETE CASCADE ENABLE
)
/
CREATE OR REPLACE TRIGGER "BI_COUNTRY"
before insert on "COUNTRY"
for each row
begin
select "COUNTRY_SEQ".nextval into :NEW.COUNTRY_ID from dual;
end;
/
ALTER TRIGGER "BI_COUNTRY" ENABLE
/
The exception 'ids for this class must be manually assigned before calling save()' means that you are using the identifier generation strategy of 'Assigned'.
assigned
lets the application assign an identifier to the object before save() is called. This is the default strategy if no <generator> element is specified.
If you do not define any strategy, Hibernate defaults to 'assigned'. The 'assigned' strategy implies that Hibernate expects the application to supply its own ids.
If you want to use a sequence id generator in Oracle, you can do so with the following configuration -
If you are using xml -
<id name="countryId" type="java.lang.Integer">
<column name="Country_Id" />
<generator class="sequence">
<param name="sequence">Country_Id_Seq</param>
</generator>
</id>
If you are using annotations -
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "Country_Id_Seq")
@SequenceGenerator(name = "Country_Id_Seq", sequenceName = "Country_Id_Seq")
private Integer countryId;
And your code should look like so -
Country c=new Country();
c.setCountryName(request.getParameter("txtCountryName"));
c.setCountryCode(request.getParameter("txtCountryCode"));
Zone z=(Zone) session.get(Zone.class, new BigDecimal(request.getParameter("zoneId")));
c.setZone(z);
session.save(c);
session.flush();
System.out.println(c.getCountryId());
When 'session.save(c)' executes, hibernate makes the following sql call to Oracle, retrieves the id and sets it in Country object.
select Country_Id_Seq.nextVal from dual;
Problem with trigger
Since you are using a trigger to set the id when a row is inserted, this causes a problem with the Hibernate sequence: Hibernate uses the sequence to generate an id and the database then uses the trigger to increment it again, so the id is incremented twice.
You have three options to resolve this:
Delete the trigger because it's not necessary.
If you still need the trigger because the table could be updated outside the application, you could update the trigger such that the id is generated only if the id is not set in the insert statement
Hibernate issue with Oracle Trigger for generating id from a sequence
Create a custom id generator that uses the trigger to set the id in the data before it is saved to the db. Check out the following link - https://forum.hibernate.org/viewtopic.php?t=973262
If the values in an ID column are generated by a sequence, then you should associate that sequence with your ID column in the entity definition, so that the attribute is filled in with the ID value by Hibernate during insertion.
Using annotations:
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "CountryIdSequence")
@SequenceGenerator(name = "CountryIdSequence", sequenceName = "COUNTRY_ID_SEQUENCE")
@Column(name = "COUNTRY_ID")
private BigDecimal countryId;
Using hbm:
<id name="countryId" type="big_decimal">
<column name="COUNTRY_ID" />
<generator class="sequence">
<param name="sequence">COUNTRY_ID_SEQUENCE</param>
</generator>
</id>
Then, it will be available after the save.
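For example, reusing the code from the question but without the manual setCountryId call (a sketch; the sequence behaviour is the one described in the previous answer):

Country c = new Country();
c.setCountryName(request.getParameter("txtCountryName"));
c.setCountryCode(request.getParameter("txtCountryCode"));
c.setZone((Zone) session.get(Zone.class, new BigDecimal(request.getParameter("zoneId"))));

session.save(c);                        // Hibernate selects COUNTRY_ID_SEQUENCE.nextval here
System.out.println(c.getCountryId());   // already populated, even before flush/commit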
Any changes made to the entity at the database layer are not reflected in the hibernate entity layer until you refresh the object.
session.save(c);
session.flush();
// Refresh the object for columns modified in the DB by IDENTITY / SEQUENCE / Triggers.
session.refresh(c);
System.out.println(c.getCountryId());