org.hibernate.exception.ConstraintViolationException only during unit test, Cascade Delete not working - spring

I have a Spring Kotlin project where I use JPA (Hibernate?) entities to map my database tables.
A Project is linked to multiple ProjectMaps. When a Project is deleted, I want all the ProjectMaps to be deleted also.
The DDL for my local H2 db is as follows.
CREATE TABLE IF NOT EXISTS "project" (
    id INT AUTO_INCREMENT PRIMARY KEY,
    -- other fields...
);

CREATE TABLE IF NOT EXISTS "project_map" (
    project_id INT NOT NULL,
    tag_id INT NOT NULL,
    CONSTRAINT pk_project_id_tag_id PRIMARY KEY (project_id, tag_id),
    CONSTRAINT fk_project_map_project_id
        FOREIGN KEY (project_id) REFERENCES "project" (id)
        ON DELETE CASCADE
        ON UPDATE CASCADE,
    CONSTRAINT fk_project_map_tag_id
        FOREIGN KEY (tag_id) REFERENCES "tag" (id)
        ON DELETE CASCADE
        ON UPDATE CASCADE
);
The entities are as follows.
Project
@Entity
@Table(name = "project")
class Project(
    // other fields...,
    @JsonIgnore
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    var id: Long = 0,
) {
    @JsonIgnore
    @OneToMany(mappedBy = "projectMapId.project", cascade = [CascadeType.ALL], orphanRemoval = true)
    val projectMaps = mutableSetOf<ProjectMap>()
}
ProjectMap
@Entity
@Table(name = "project_map")
class ProjectMap(
    @EmbeddedId
    var projectMapId: ProjectMapId,
) {
    override fun equals(other: Any?): Boolean {
        if (this === other) return true
        if (other !is ProjectMap) return false
        if (projectMapId != other.projectMapId) return false
        return true
    }

    override fun hashCode(): Int {
        return projectMapId.hashCode()
    }
}
@Embeddable
class ProjectMapId(
    @ManyToOne
    @JoinColumn(name = "project_id", nullable = false, referencedColumnName = "id")
    var project: Project,

    @ManyToOne
    @JoinColumn(name = "tag_id", nullable = false, referencedColumnName = "id")
    var tag: Tag,
) : Serializable {
    override fun equals(other: Any?): Boolean {
        if (this === other) return true
        if (other !is ProjectMapId) return false
        if (project != other.project) return false
        if (tag != other.tag) return false
        return true
    }

    override fun hashCode(): Int {
        var result = project.hashCode()
        result = 31 * result + tag.hashCode()
        return result
    }
}
I have a ProjectRepository interface that extends JpaRepository. In my code somewhere I call projectRepo.deleteAllInBatch(projectsToBeDeleted), and when I run it I see that the ProjectMaps are indeed deleted together with the Projects. But when I write a unit test for that method, I get this error.
[main] ERROR org.hibernate.engine.jdbc.spi.SqlExceptionHelper.logExceptions - Referential integrity constraint violation: "FKM2O7A1Y4HFNXYHQEYBQI9AABY: PUBLIC.PROJECT_MAP FOREIGN KEY(PROJECT_ID) REFERENCES PUBLIC.PROJECT(ID) (CAST(1 AS BIGINT))"; SQL statement:
delete from project where id=? [23503-210]
[main] ERROR com.dbs.localisemanagement.exception.RestExceptionHandler.handleAllUncaughtException - [L9y5j][500][1-Unknown] could not execute statement; SQL [n/a]; constraint ["FKM2O7A1Y4HFNXYHQEYBQI9AABY: PUBLIC.PROJECT_MAP FOREIGN KEY(PROJECT_ID) REFERENCES PUBLIC.PROJECT(ID) (CAST(1 AS BIGINT))"; SQL statement:
delete from project where id=? [23503-210]]; nested exception is org.hibernate.exception.ConstraintViolationException: could not execute statement
org.springframework.dao.DataIntegrityViolationException: could not execute statement; SQL [n/a]; constraint ["FKM2O7A1Y4HFNXYHQEYBQI9AABY: PUBLIC.PROJECT_MAP FOREIGN KEY(PROJECT_ID) REFERENCES PUBLIC.PROJECT(ID) (CAST(1 AS BIGINT))"; SQL statement:
delete from project where id=? [23503-210]]; nested exception is org.hibernate.exception.ConstraintViolationException: could not execute statement
The error occurs at the line projectRepo.deleteAllInBatch(projectsToBeDeleted), so I can see that the Projects cannot be deleted because ProjectMaps still reference them.
I reckon the cascade delete is not working during the unit test, but I don't know why. Does anyone have any idea?
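For reference, a minimal sketch of the two delete variants involved, assuming ProjectRepository is the plain JpaRepository described above: deleteAllInBatch() issues a bulk JPQL delete that bypasses entity-level cascade and orphan removal, so it relies on the database's own ON DELETE CASCADE, while deleteAll() goes through the persistence context.
import org.springframework.data.jpa.repository.JpaRepository

// Sketch only: a repository matching the one described in the question.
interface ProjectRepository : JpaRepository<Project, Long>

fun deleteProjects(projectRepo: ProjectRepository, projectsToBeDeleted: List<Project>) {
    // deleteAllInBatch() translates into a single bulk JPQL delete on the project
    // table. Bulk deletes bypass the persistence context, so CascadeType.ALL and
    // orphanRemoval on Project.projectMaps are not applied; the project_map rows
    // only disappear if the database FK itself declares ON DELETE CASCADE.
    projectRepo.deleteAllInBatch(projectsToBeDeleted)

    // deleteAll() removes the entities one by one through the persistence context,
    // so Hibernate cascades the remove to the ProjectMap children first.
    // projectRepo.deleteAll(projectsToBeDeleted)
}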

Related

Get list of database errors on saveAll()

I have a CountryObject:
@Entity
@Table(name = "country")
class CountryObject(
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "id", nullable = false)
    val id: Long? = null,

    @Column(name = "name")
    val name: String,

    @Column(name = "ISO2", length = 2)
    val ISO2: String,

    @Column(name = "ISO3", length = 3)
    val ISO3: String
) : Serializable
with the corresponding PostgreSQL create script that has some unique constraints on ISO2 and ISO3:
CREATE TABLE country
(
    id   BIGINT NOT NULL,
    name VARCHAR(255) NOT NULL,
    iso2 VARCHAR(2) NOT NULL,
    iso3 VARCHAR(3) NOT NULL,
    CONSTRAINT pk_country PRIMARY KEY (id)
);

ALTER TABLE country
    ADD CONSTRAINT UNIQUE_CONSTRAINT_ISO2 UNIQUE (iso2);

ALTER TABLE country
    ADD CONSTRAINT UNIQUE_CONSTRAINT_ISO3 UNIQUE (iso3);
Along with this I have a corresponding repository:
@Repository
interface CountryRepository : JpaRepository<CountryObject, Long>
The out-of-the-box implementation of CountryRepository provides a saveAll() method that tries to insert a list of objects into the country table. So far, everything behaves as expected.
Once I try to insert a list that has multiple objects violating the unique constraints, an exception is thrown on the first violation and processing stops. What I want is a list of errors that I can use as a report, indicating which entries failed to save. Of course nothing should be saved in that case and the transaction should be rolled back.
Any hints on how something like this can be achieved? One option would of course be to save each object separately and then collect the errors, but that might not be very performant.
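One possible direction, sketched below under assumptions: pre-validate the batch against the unique ISO columns (and against itself) before calling saveAll(), so the report can be built up front and nothing is written when any entry conflicts. CountryImportService and ImportReport are made-up names; a race with concurrent inserts is still possible, so the unique constraints remain the final safety net.
import org.springframework.stereotype.Service
import org.springframework.transaction.annotation.Transactional

@Service
class CountryImportService(private val countryRepository: CountryRepository) {

    data class ImportReport(val savedCount: Int, val errors: List<String>)

    @Transactional
    fun importCountries(countries: List<CountryObject>): ImportReport {
        // The country table is small, so loading the existing codes up front is cheap.
        val existing = countryRepository.findAll()
        val seenIso2 = existing.map { it.ISO2 }.toMutableSet()
        val seenIso3 = existing.map { it.ISO3 }.toMutableSet()

        // add() returns false for a duplicate, which also catches duplicates
        // inside the incoming batch itself.
        val errors = mutableListOf<String>()
        for (country in countries) {
            if (!seenIso2.add(country.ISO2) || !seenIso3.add(country.ISO3)) {
                errors += "${country.name}: ISO2/ISO3 already in use"
            }
        }

        // Reject the whole batch if anything conflicts; otherwise save it in one go.
        if (errors.isNotEmpty()) return ImportReport(savedCount = 0, errors = errors)
        countryRepository.saveAll(countries)
        return ImportReport(savedCount = countries.size, errors = errors)
    }
}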

Strange validation conflict in Spring JPA TableGenerator

I have a legacy database with a composite primary key in the table project. (BaseEntity contains common properties for lastModifiedDate and lastModifiedBy.)
@Entity
@IdClass(ProjectPk.class)
public class Project extends BaseEntity {

    @Id
    @GeneratedValue(strategy = GenerationType.TABLE, generator = "nextProjectId")
    @TableGenerator(
        name = "nextProjectId",
        table = "projectId",
        pkColumnName = "proj_Id",
        pkColumnValue = "proj_id"
    )
    private Long projId;

    @Id
    private int version;

    // other properties, getters and setters omitted for clarity
}
PK class
public class ProjectPk implements java.io.Serializable {

    private int projId;
    private int version;

    // both constructors, equals, hashCode, getters and setters omitted for clarity
}
I have Flyway migration files to simulate the production database.
drop table if exists project;

CREATE TABLE project
(
    proj_id bigint,
    version int,
    -- other columns omitted for clarity
    PRIMARY KEY (`proj_id`, `version`)
) ENGINE=InnoDB;

drop table if exists project_id;

CREATE TABLE project_id
(
    proj_id bigint
) ENGINE=InnoDB;
Flyway creates the tables in the order given in the migration file:
Table: project_id
Columns:
proj_id bigint
...
Table: project
Columns:
proj_id bigint PK
version int PK
...
During the Maven build I'm getting a validation error:
Schema-validation: wrong column type encountered in column [proj_id] in table [project_id]; found [bigint (Types#BIGINT)], but expecting [varchar(255) (Types#VARCHAR)]
What did I do wrong to make Hibernate expect [varchar(255) (Types#VARCHAR)]?
This is a Spring Boot 2.6.6 project with a MySQL database.
I see the following problems with your code:
Type mismatch between Project.projId (Long type) and ProjectPk.projId (int type).
You use the wrong table structure for the project_id table.
You can see a working example below.
Assuming that you have the following tables:
CREATE TABLE test_project
(
    proj_id bigint,
    version int,
    title VARCHAR(50),
    PRIMARY KEY (proj_id, version)
);

create table table_identifier (
    table_name varchar(255) not null,
    product_id bigint,
    primary key (table_name)
);

insert into table_identifier values ('test_project', 20);
and the following mapping:
@Entity
@Table(name = "test_project")
@IdClass(ProjectPk.class)
public class Project {

    @Id
    @GeneratedValue(strategy = GenerationType.TABLE, generator = "nextProjectId")
    @TableGenerator(
        name = "nextProjectId",
        table = "table_identifier",
        pkColumnName = "table_name",
        valueColumnName = "product_id",
        allocationSize = 5
    )
    @Column(name = "proj_id")
    private Long projId;

    @Id
    private int version;

    // other fields, getters, setters ...
}
you will be able to persist the entity like below:
Project project = new Project();
project.setVersion(1);
// ...
entityManager.persist(project);

Spring Boot - relationship deleted on save() method

My problem is this: there is a many-to-many relationship between two tables, Project and Employee. There is an option to update a given employee, but there is a little problem. After updating the employee, Hibernate automatically deletes the employee's record from the connecting project_employee table.
Hibernate: update employee set email=?, first_name=?, last_name=? where employee_id=?
And this happens right after that
Hibernate: delete from project_employee where employee_id=?
I'm following a course and I've just noticed this error. The lecturer's source code is here:
https://github.com/imtiazahmad007/spring-framework-course
I've checked your github page:
@ManyToMany(cascade = {CascadeType.DETACH, CascadeType.MERGE, CascadeType.REFRESH, CascadeType.PERSIST},
            fetch = FetchType.LAZY)
@JoinTable(name = "project_employee",
           joinColumns = @JoinColumn(name = "employee_id"),
           inverseJoinColumns = @JoinColumn(name = "project_id")
)
@JsonIgnore
private List<Project> projects;
CascadeType.MERGE + CascadeType.PERSIST mean that when an Employee entity is saved, the referenced Project entities must be saved as well.
In many-to-many cases this means:
DELETE by foreign key
Bulk insert
If there is no bulk insert, there is an issue with the persistence context (you are saving an entity with an empty collection of projects).
Possible solutions:
Remove CascadeType.MERGE + CascadeType.PERSIST if you do not want to change projects every time you save an Employee. You can still save the collection via the repository.
Make sure the collection is attached on the save action. That will still cause a delete + insert, but the result will be correct.
Change Many-To-Many to One-To-Many with EmbeddedId
Please refer to the documentation:
When an entity is removed from the #ManyToMany collection, Hibernate simply deletes the joining record in the link table. Unfortunately, this operation requires removing all entries associated with a given parent and recreating the ones that are listed in the current running persistent context.
https://docs.jboss.org/hibernate/orm/5.6/userguide/html_single/Hibernate_User_Guide.html#associations-many-to-many
*** Update from the dialog below to make the cascade behaviour clear.
Say you have two entities A and B (getters and setters omitted), plus their repositories.
@Entity
@Table(name = "a")
public class A {

    @Id
    private Integer id;

    private String name;

    @ManyToMany(cascade = {CascadeType.ALL})
    @JoinTable(name = "a_b",
               joinColumns = @JoinColumn(name = "a_id"),
               inverseJoinColumns = @JoinColumn(name = "b_id")
    )
    private List<B> bs;
}

@Entity
@Table(name = "b")
public class B {

    @Id
    private Integer id;

    private String name;
}
Your sample test looks like this:
@Test
public void testSave() {
    B b = new B();
    b.setId(1);
    b.setName("b");
    b = bRepository.save(b);

    A a = new A();
    a.setId(1);
    a.setName("a");
    a.setBs(Collections.singletonList(b));
    aRepository.save(a);

    a.setName("new");
    service.save(a); // see service implementations below
}
Version1:
@Transactional
public void save(A a) {
    aRepository.save(a);
}
Hibernate logs are the following:
Hibernate:
update
a
set
name=?
where
id=?
Hibernate:
delete
from
a_b
where
a_id=?
Hibernate:
insert
into
a_b
(a_id, b_id)
values
(?, ?)
Delete + bulk insert are present (despite the fact that the B-s were not actually changed).
Version2:
@Transactional
public void save(A a) {
    Optional<A> existing = aRepository.findById(a.getId());
    if (existing.isPresent()) {
        a.setBs(existing.get().getBs());
    }
    aRepository.save(a);
}
Logs:
update
a
set
name=?
where
id=?
Here the B-collection was forcibly re-attached, so Hibernate understands that it does not need to be cascaded.

Hibernate performs update and delete on custom JPQL

I am trying to update the fields of an entity that has a ManyToMany relationship; I only want to update the table fields and ignore the ManyToMany relationship. The relationship is between the Company and UserSystem entities, with company_user_system defined as the join table between them. The problem is that when I execute my update on Company, Hibernate always first issues an update on company and a delete on the join table, which erases the relationship between Company and UserSystem, and I don't understand why these two queries occur when I don't execute them.
These are the queries; the first and second are not executed by my code:
Hibernate: update company set active=?, email=?, identification_code=?, trading_name=?, update_on=? where id=?
Hibernate: delete from company_user_system where company_id=?
Hibernate: update company set email=?, phone=?, corporate_name=?, trading_name=?, identification_code=?, email=?, phone2=? where id=?
Hibernate: select company0_.id as id1_0_, company0_.active as active2_0_, company0_.corporate_name as corporat3_0_, company0_.created_on as created_4_0_, company0_.email as email5_0_, company0_.email2 as email6_0_, company0_.identification_code as identifi7_0_, company0_.phone as phone8_0_, company0_.phone2 as phone9_0_, company0_.trading_name as trading10_0_, company0_.update_on as update_11_0_ from company company0_ where company0_.id=?
Following is the update implementation code:
public class CompanyRepositoryImpl implements CompanyRepositoryCustom {

    @PersistenceContext
    private EntityManager entityManager;

    public Company updateCompanyFields(Company company) {
        // ... construction of fieldsSql and getMethods omitted
        String sql = "UPDATE Company SET " + fieldsSql + " WHERE id = :id ";
        Query query = entityManager.createQuery(sql);

        // set the values for the fields
        for (Method method : getMethods) {
            query.setParameter(lowercaseFirstCharInString(cutGetInMethods(method.getName())), method.invoke(company));
        }
        // set id
        query.setParameter("id", company.getId());

        // execute the update and query the database to return the updated object
        if (query.executeUpdate() == 1) {
            query = entityManager.createQuery("SELECT c FROM Company c WHERE c.id = :id");
            query.setParameter("id", company.getId());
            Company getCompany = (Company) query.getResultList().get(0);
            return getCompany;
        }
        return null;
    }

    // ... other methods omitted
}
Repository Code:
@Repository
public interface CompanyRepository extends JpaRepository<Company, Long>, JpaSpecificationExecutor<Company>, CompanyRepositoryCustom {

    @Modifying
    Company updateCompanyFields(Company company);
}
Company entity code; I only included the attributes that I think may be relevant to solving the problem:
@Entity
@DynamicUpdate
@Table(name = "company")
public class Company implements Serializable {

    @CreationTimestamp
    @Column(name = "created_on", nullable = false)
    private Instant createdOn;

    @UpdateTimestamp
    @Column(name = "update_on")
    private Instant updateOn;

    @ManyToMany
    @JoinTable(
        name = "company_user_system",
        joinColumns = @JoinColumn(
            name = "company_id", referencedColumnName = "id"
        ),
        inverseJoinColumns = @JoinColumn(
            name = "user_system_id", referencedColumnName = "id"
        )
    )
    private Set<UserSystem> userSystems = new HashSet<>();
}
The UserSystem class defines the relationship as follows:
@ManyToMany(mappedBy = "userSystems")
private Set<Company> companies = new HashSet<>();
What may be causing this update and delete before my update?
This happens because somewhere you changed the value(s) of your relationship. The EntityManager tracks such changes and marks the entity as dirty. When you execute a custom JPQL query, Hibernate first flushes all pending changes (submits any dirty entities).
You can prevent this by calling EntityManager.clear().
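A minimal Kotlin sketch of that suggestion, assuming an injected EntityManager and a Company.tradingName property (the surrounding names are illustrative, not taken from the question's full code): clearing the persistence context discards the pending dirty state, so the auto-flush before the bulk update has nothing to write.
import javax.persistence.EntityManager
import javax.persistence.PersistenceContext
import org.springframework.stereotype.Repository
import org.springframework.transaction.annotation.Transactional

@Repository
class CompanyFieldUpdater {

    @PersistenceContext
    private lateinit var entityManager: EntityManager

    @Transactional
    fun updateTradingName(companyId: Long, tradingName: String): Int {
        // Detach everything the persistence context is tracking; unflushed changes
        // (the dirty Company and its join-table state) are discarded instead of
        // being flushed right before the bulk update below.
        entityManager.clear()

        return entityManager
            .createQuery("UPDATE Company c SET c.tradingName = :name WHERE c.id = :id")
            .setParameter("name", tradingName)
            .setParameter("id", companyId)
            .executeUpdate()
    }
}
Bear in mind that clear() detaches every managed entity, so anything still needed afterwards has to be re-read or merged.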

JPA not returning the records which contain empty columns in the EmbeddedId

JpaRepository's findAll() method does not return the rows if any of the fields in the composite key is null.
This is the entity class with the EmbeddedId JobVaccinationPK
/**
 * ApplicationParam entity. @author MyEclipse Persistence Tools
 */
@Entity
@Table(name = "job_vaccination", schema = "cdcis")
@SuppressWarnings("serial")
public class JobVaccination implements java.io.Serializable {

    // Fields
    @Column(name = "default_yn", length = 1)
    private String defaultYn;

    @EmbeddedId
    private JobVaccinationPK jobVaccinationPK;

    public JobVaccination() {
    }

    // setters, getters
}
This is the embeddable class
@Embeddable
@SuppressWarnings("serial")
public class JobVaccinationPK implements Serializable {

    @ManyToOne
    @MapsId("job_category_id")
    @JoinColumn(name = "job_category_id", nullable = true)
    private JobCategoryTypeMast jobCategoryMast;

    @ManyToOne
    @MapsId("vaccination_id")
    @JoinColumn(name = "vaccination_id", nullable = true)
    private VaccinationMast vaccinationMast;

    @ManyToOne
    @MapsId("screening_type_id")
    @JoinColumn(name = "screening_type_id", nullable = true)
    private ScreeningTypeMast screeningTypeMast;

    // getters and setters
}
Service implementation class
@Override
public SearchResult<JobVaccinationDto> getJobVaccination(JobVaccinationDto dto)
        throws VaccinationException {
    List<JobVaccination> vaccDetails = jobVaccinationRepo.findAll();
    if (vaccDetails == null) return null;

    List<JobVaccinationDto> jobVaccinationDtos = new ArrayList<JobVaccinationDto>();
    jobVaccinationDtos = convertToDto(vaccDetails);
    return new SearchResult<>(jobVaccinationDtos.size(), jobVaccinationDtos);
}
Here I am able to insert a null value for either jobCategoryId or screeningTypeId, just like the row below. But when I try to fetch the rows which have empty values, they are not returned. I've tried to debug but was not able to find the cause.
This is the generated hibernate query:
Hibernate:
select
jobvaccina0_.job_category_id as job_cate4_13_,
jobvaccina0_.screening_type_id as screenin2_13_,
jobvaccina0_.vaccination_id as vaccinat3_13_,
jobvaccina0_.default_yn as default_1_13_
from
cdcis.job_vaccination jobvaccina0_
Hibernate:
select
jobcategor0_.job_category_id as job_cate1_11_0_,
jobcategor0_.job_category_name as job_cate2_11_0_,
jobcategor0_.job_category_name_ar as job_cate3_11_0_,
jobcategor0_.screening_type_id as screenin4_11_0_
from
cdcis.job_category_mast jobcategor0_
where
jobcategor0_.job_category_id=?
Hibernate:
select
screeningt0_.screening_type_id as screenin1_21_0_,
screeningt0_.active_yn as active_y2_21_0_,
screeningt0_.mmpid_required_yn as mmpid_re3_21_0_,
screeningt0_.screening_type as screenin4_21_0_
from
cdcis.screening_type_mast screeningt0_
where
screeningt0_.screening_type_id=?
Hibernate:
select
vaccinatio0_.vaccination_id as vaccinat1_27_0_,
vaccinatio0_.vaccination_name as vaccinat2_27_0_,
vaccinatio0_.vaccination_name_ar as vaccinat3_27_0_
from
cdcis.vaccination_mast vaccinatio0_
where
vaccinatio0_.vaccination_id=?
Going with @Adam Michalik's answer. As a work-around I've introduced a new primary key field in the table, as we can't handle a null in the composite key.
Composite IDs cannot contain null values in any of their fields. Since the SQL semantics of NULL are that NULL <> NULL, it cannot be determined that a primary key (1, 2, NULL) is equal to (1, 2, NULL).
NULL means "no value" in SQL and its interpretation is up to you on a case-by-case basis. That's why SQL and JPA do not want to make assumptions that NULL = NULL and that a primary key containing a NULL identifies a single entity only.
You may choose to use a synthetic, generated primary key instead of the composite business primary key to overcome that. Then, you'd always have a non-null, single-column PK and nullable foreign keys.
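A rough Kotlin sketch of that synthetic-key variant, assuming the table gains a new auto-generated id column as in the work-around mentioned above; the column and class names follow the question, everything else is illustrative.
import javax.persistence.*

@Entity
@Table(name = "job_vaccination", schema = "cdcis")
class JobVaccination(

    // Surrogate primary key replacing the composite @EmbeddedId.
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    var id: Long? = null,

    @Column(name = "default_yn", length = 1)
    var defaultYn: String? = null,

    // Nullable associations are fine now that they no longer form the primary key.
    @ManyToOne
    @JoinColumn(name = "job_category_id")
    var jobCategoryMast: JobCategoryTypeMast? = null,

    @ManyToOne
    @JoinColumn(name = "vaccination_id")
    var vaccinationMast: VaccinationMast? = null,

    @ManyToOne
    @JoinColumn(name = "screening_type_id")
    var screeningTypeMast: ScreeningTypeMast? = null,
)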