I have a Spring Data entity (using JPA with Hibernate and MySQL) defined as follows:
@Entity
@Table(name = "dataset")
public class Dataset {
@Id
@GenericGenerator(name = "generator", strategy = "increment")
@GeneratedValue(generator = "generator")
private Long id;
@Column(name = "name", nullable = false)
private String name;
@Column(name = "guid", nullable = false)
private String guid;
@Column(name = "size", nullable = false)
private Long size;
@Column(name = "create_time", nullable = false)
private Date createTime;
@OneToOne(optional = false)
@JoinColumn(name = "created_by")
private User createdBy;
@Column(name = "active", nullable = false)
private boolean active;
@Column(name = "orig_source", nullable = false)
private String origSource;
@Column(name = "orig_source_type", nullable = false)
private String origSourceType;
@Column(name = "orig_source_org", nullable = false)
private String origSourceOrg;
@Column(name = "uri", nullable = false)
private String uri;
@Column(name = "mimetype", nullable = false)
private String mimetype;
@Column(name = "registration_state", nullable = false)
private int registrationState;
@OneToMany(fetch = FetchType.EAGER, cascade = {CascadeType.ALL})
@JoinColumn(name = "dataset_id")
@JsonManagedReference
private List<DatasetFile> datasetFiles;
I have the following repository for this entity:
public interface DatasetRepo extends JpaRepository<Dataset, Long> {
@Query("SELECT CASE WHEN COUNT(p) > 0 THEN 'true' ELSE 'false' END FROM Dataset p WHERE p.uri = ?1 and p.registrationState>0")
public Boolean existsByURI(String location);
@Query("SELECT a FROM Dataset a LEFT JOIN FETCH a.datasetFiles c where a.registrationState>0")
public List<Dataset> getAll(Pageable pageable);
@Query("SELECT a FROM Dataset a LEFT JOIN FETCH a.datasetFiles c WHERE a.registrationState>0")
public List<Dataset> findAll();
@Query("SELECT a FROM Dataset a LEFT JOIN FETCH a.datasetFiles c where a.guid= ?1")
public Dataset findByGuid(String guid);
}
Now, in a controller, I fetch a dataset and update one of its attributes, expecting that change to be flushed to the DB, but it never is.
@RequestMapping(value = "/storeDataset", method = RequestMethod.GET)
public @ResponseBody
WebServiceReturn storeDataset(
@RequestParam(value = "dsGUID", required = true) String datasetGUID,
@RequestParam(value = "stType", required = true) String stType) {
WebServiceReturn wsr = null;
logger.info("stType: '" + stType + "'");
if (!stType.equals("MongoDB") && !stType.equals("Hive") && !stType.equals("HDFS")) {
wsr = getFatalWebServiceReturn("Invalid Storage type '" + stType + "'");
} else if (stType.equals("MongoDB")) {
/* Here is where I'm reading entity from Repository */
Dataset dataset = datasetRepo.findByGuid(datasetGUID);
if (dataset != null) {
MongoLoader mongoLoader = new MongoLoader();
boolean success = mongoLoader.loadMongoDB(dataset);
logger.info("Success: " + success);
if (success) {
/* Here is where I update entity attribute value, this is never flushed to DB */
dataset.setRegistrationState(1);
}
wsr = getWebServiceReturn(success ? 0 : -1, "Successfully loaded dataset files into " + stType + " storage", "Failed to load dataset files into " + stType + " storage");
}
}
return wsr;
}
Thank you
You need to annotate the request-mapping method with @Transactional.
Why? If you want to modify an object in memory and have the change transparently written to the database, you need to do it inside an active transaction.
Don't forget that you're using JPA (Spring Data uses JPA), and for your entity to stay in the managed state you need an active transaction.
See:
http://www.objectdb.com/java/jpa/persistence/update
Transparent Update: Once an entity object is retrieved from the database (no matter which way) it can simply be modified in memory from inside an active transaction:
Employee employee = em.find(Employee.class, 1);
em.getTransaction().begin();
employee.setNickname("Joe the Plumber");
em.getTransaction().commit();
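Applied to the controller from the question, a minimal sketch of the suggested fix could look like this (a condensed illustration, not a drop-in replacement: it omits the stType validation shown above and assumes org.springframework.transaction.annotation.Transactional on a Spring-managed controller bean). Because the whole method now runs in one transaction, the Dataset returned by the repository stays managed and Hibernate's dirty checking flushes the change on commit, so no explicit save() call is needed:
@Transactional
@RequestMapping(value = "/storeDataset", method = RequestMethod.GET)
public @ResponseBody
WebServiceReturn storeDataset(
        @RequestParam(value = "dsGUID", required = true) String datasetGUID,
        @RequestParam(value = "stType", required = true) String stType) {
    WebServiceReturn wsr = null;
    // stType validation omitted here for brevity (see the original method above)
    Dataset dataset = datasetRepo.findByGuid(datasetGUID);   // entity is managed inside this transaction
    if (dataset != null) {
        boolean success = new MongoLoader().loadMongoDB(dataset);
        if (success) {
            dataset.setRegistrationState(1); // dirty-checked and flushed when the transaction commits
        }
        wsr = getWebServiceReturn(success ? 0 : -1,
                "Successfully loaded dataset files into " + stType + " storage",
                "Failed to load dataset files into " + stType + " storage");
    }
    return wsr;
}
Alternatively, move the fetch-and-update logic into a @Transactional service method and call it from the controller; the effect on the persistence context is the same.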
I have a Spring Boot project with Apache Camel (using the Maven dependencies camel-spring-boot-starter, camel-jpa-starter, camel-endpointdsl).
There are the following 3 entities:
@Entity
@Table(name = RawDataDelivery.TABLE_NAME)
@BatchSize(size = 10)
public class RawDataDelivery extends PersistentObjectWithCreationDate {
protected static final String TABLE_NAME = "raw_data_delivery";
private static final String COLUMN_CONFIGURATION_ID = "configuration_id";
private static final String COLUMN_SCOPED_CALCULATED = "scopes_calculated";
@Column(nullable = false, name = COLUMN_SCOPED_CALCULATED)
private boolean scopesCalculated;
@OneToMany(mappedBy = "rawDataDelivery", fetch = FetchType.LAZY)
private Set<RawDataFile> files = new HashSet<>();
@CollectionTable(name = "processed_scopes_per_delivery")
@ElementCollection(targetClass = String.class)
private Set<String> processedScopes = new HashSet<>();
// Getter/Setter
}
@Entity
@Table(name = RawDataFile.TABLE_NAME)
@BatchSize(size = 100)
public class RawDataFile extends PersistentObjectWithCreationDate {
protected static final String TABLE_NAME = "raw_data_files";
private static final String COLUMN_CONFIGURATION_ID = "configuration_id";
private static final String COLUMN_RAW_DATA_DELIVERY_ID = "raw_data_delivery_id";
private static final String COLUMN_PARENT_ID = "parent_file_id";
private static final String COLUMN_IDENTIFIER = "identifier";
private static final String COLUMN_CONTENT = "content";
private static final String COLUMN_FILE_SIZE_IN_BYTES = "file_size_in_bytes";
@ManyToOne(optional = true, fetch = FetchType.LAZY)
@JoinColumn(name = COLUMN_RAW_DATA_DELIVERY_ID)
private RawDataDelivery rawDataDelivery;
@Column(name = COLUMN_IDENTIFIER, nullable = false)
private String identifier;
@Lob
@Column(name = COLUMN_CONTENT, nullable = true)
private Blob content;
@Column(name = COLUMN_FILE_SIZE_IN_BYTES, nullable = false)
private long fileSizeInBytes;
// Getter/Setter
}
@Entity
@TypeDef(name = "jsonb", typeClass = JsonBinaryType.class)
@Table(name = RawDataRecord.TABLE_NAME, uniqueConstraints = ...)
public class RawDataRecord extends PersistentObjectWithCreationDate {
public static final String TABLE_NAME = "raw_data_records";
static final String COLUMN_RAW_DATA_FILE_ID = "raw_data_file_id";
static final String COLUMN_INDEX = "index";
static final String COLUMN_CONTENT = "content";
static final String COLUMN_HASHCODE = "hashcode";
static final String COLUMN_SCOPE = "scope";
@ManyToOne(optional = false)
@JoinColumn(name = COLUMN_RAW_DATA_FILE_ID)
private RawDataFile rawDataFile;
@Column(name = COLUMN_INDEX, nullable = false)
private long index;
@Lob
@Type(type = "jsonb")
@Column(name = COLUMN_CONTENT, nullable = false, columnDefinition = "jsonb")
private String content;
@Column(name = COLUMN_HASHCODE, nullable = false)
private String hashCode;
@Column(name = COLUMN_SCOPE, nullable = true)
private String scope;
}
What I am trying to do is build a route with Apache Camel that selects all deliveries having the flag scopesCalculated == false and calculates/updates the scope variable of all records attached to the files of these deliveries. This should happen in one database transaction. Once all scopes are updated I want to set the scopesCalculated flag to true and commit the changes to the database (in my case PostgreSQL).
What I have so far is this:
String r3RouteId = ...;
var dataSource3 = jpa(RawDataDelivery.class.getName())
.lockModeType(LockModeType.NONE)
.delay(60).timeUnit(TimeUnit.SECONDS)
.consumeDelete(false)
.query("select rdd from RawDataDelivery rdd where rdd.scopesCalculated is false and rdd.configuration.id = " + configuration.getId())
;
from(dataSource3)
.routeId(r3RouteId)
.routeDescription(configuration.getName())
.messageHistory()
.transacted()
.process(exchange -> {
RawDataDelivery rawDataDelivery = exchange.getIn().getBody(RawDataDelivery.class);
rawDataDelivery.setScopesCalculated(true);
})
.transform(new Expression() {
@Override
public <T> T evaluate(Exchange exchange, Class<T> type) {
RawDataDelivery rawDataDelivery = exchange.getIn().getBody(RawDataDelivery.class);
return (T)rawDataDelivery.getFiles();
}
})
.split(bodyAs(Iterator.class)).streaming()
.transform(new Expression() {
@Override
public <T> T evaluate(Exchange exchange, Class<T> type) {
RawDataFile rawDataFile = exchange.getIn().getBody(RawDataFile.class);
// rawDataRecordJpaRepository is an autowired interface by spring with the following method:
// @Lock(value = LockModeType.NONE)
// Stream<RawDataRecord> findByRawDataFile(RawDataFile rawDataFile);
// we may have many records per file (100k and more), so we don't want to keep them all in memory.
// instead we try to stream the resultset and aggregate them by 500 partitions for processing
return (T)rawDataRecordJpaRepository.findByRawDataFile(rawDataFile);
}
})
.split(bodyAs(Iterator.class)).streaming()
.aggregate(constant("all"), new GroupedBodyAggregationStrategy())
.completionSize(500)
.completionTimeout(TimeUnit.SECONDS.toMillis(5))
.process(exchange -> {
List<RawDataRecord> rawDataRecords = exchange.getIn().getBody(List.class);
for (RawDataRecord rawDataRecord : rawDataRecords) {
rawDataRecord.setScope("abc");
}
})
;
Basically this is working, but I have the problem that the records of the last partition are not updated. In my example I have 43782 records, but only 43500 are updated; 282 remain with scope == null.
I really don't understand Camel's JPA transaction and session management, and I can't find any examples of how to update JPA/Hibernate entities with Camel (without using the SQL component).
I already tried some solutions, but none of them worked. Most attempts end with "EntityManager/Session closed", "no transaction is in progress" or "Batch update failed. Expected result 1 but was 0", ...
I tried the following:
setting jpa(...).joinTransaction(false).advanced().sharedEntityManager(true)
using .enrich(jpa(RawDataRecord.class.getName()).query("select rec from RawDataRecord rec where rawDataFile = ${body}")) instead of .transform(...) with the JPA repository for the records
using the Hibernate session from the Camel headers to update/save/flush entities: "Session session = exchange.getIn().getHeader(JpaConstants.ENTITY_MANAGER, Session.class);" (see the sketch after this list)
trying to update via a new JPA endpoint at the end of the route:
.split(bodyAs(Iterator.class)).streaming()
.to(jpa(RawDataRecord.class.getName()).usePersist(false).flushOnSend(false))
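For reference, here is a minimal sketch of what the third attempt could look like inside the final processor (a hypothetical illustration, not working code from the post; it assumes the EntityManager that the JPA consumer put on the exchange under JpaConstants.ENTITY_MANAGER is still open and enlisted in the route's transaction at that point, which is exactly what the errors above suggest is not the case):
.process(exchange -> {
    // re-attach and flush the aggregated records through the EntityManager carried on the exchange
    EntityManager em = exchange.getIn().getHeader(JpaConstants.ENTITY_MANAGER, EntityManager.class);
    List<RawDataRecord> rawDataRecords = exchange.getIn().getBody(List.class);
    for (RawDataRecord rawDataRecord : rawDataRecords) {
        rawDataRecord.setScope("abc");
        em.merge(rawDataRecord); // re-attach in case the entity was detached while streaming
    }
    em.flush();
})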
Do you have any other ideas / recommendations?
Consider the following scenario:
#Entity("YEAR")
public class Year{
#Id
#GeneratedValue(strategy = GenerationType.IDENTITY)
#Column(name = "ID", unique = true, nullable = false)
public Long id;
#Column(name = "NAME", nullable = false, length = 10)
public Long name;
...
}
#Entity("FOO")
public class Foo {
#Id
#GeneratedValue(strategy = GenerationType.IDENTITY)
#Column(name = "ID", unique = true, nullable = false)
public Long id;
#Column(name = "FK_YEAR", nullable = false)
public Long yearId;
#Column(name = "NAME", nullable = false, length = 10)
public String name;
...
}
#Entity("FII")
public class Fii {
#Id
#GeneratedValue(strategy = GenerationType.IDENTITY)
#Column(name = "ID", unique = true, nullable = false)
public Long id;
#Column(name = "FK_YEAR", nullable = false)
public Long yearId;
#Column(name = "CODE", nullable = false, length = 10)
public String code;
...
}
#Entity("NTOM")
public class NtoM {
#Id
#GeneratedValue(strategy = GenerationType.IDENTITY)
#Column(name = "ID", unique = true, nullable = false)
public Long id;
#Column(name = "FK_FOO", nullable = false)
public Long fooId;
#Column(name = "FK_FII", nullable = false)
public Long fiiId;
#Column(name = "STATE", nullable = false)
public Boolean state;
#Column(name = "VALUES", length = 500)
public String values;
...
}
Resulting in an ERD like this:
I now do have a JpaRepository like this:
@Repository
public interface NtoMRepository extends JpaRepository<NtoM, Long>, JpaSpecificationExecutor<NtoM> {
String BASE_QUERY =
"SELECT"
// prevent JPA from returning null instead of id=0
+ " CASE WHEN ntom.ID IS NULL THEN 0 ELSE ntom.ID END AS ID ,"
+ " CASE WHEN ntom.STATE IS NULL THEN 0 ELSE ntom.STATE END AS STATE ,"
+ " ntom.VALUES,"
+ " fii.ID AS FK_FII,"
+ " foo.ID AS FK_FOO "
+ " FROM MYSCHEMA.FOO foo"
+ " INNER JOIN MYSCHEMA.FII fii ON fii.FK_YEAR = foo.FK_YEAR"
+ " OUTER JOIN MYSCHEMA.NTOM ntom ON ntom.FK_FII = fii.ID AND ntom.FK_FOO = foo.ID";
@Query(value = BASE_QUERY + " WHERE fii.ID = :fiiId", nativeQuery = true)
List<Option> findByFiiId(@Param("fiiId") Long fiiId);
@Query(value = BASE_QUERY + " WHERE foo.ID = :fooId", nativeQuery = true)
List<Option> findByFooId(@Param("fooId") Long fooId);
}
So the two queries compute the missing entries whenever I call them, which works out quite nicely.
How could I use toPredicate from https://spring.io/blog/2011/04/26/advanced-spring-data-jpa-specifications-and-querydsl/ to accomplish similar behavior, but with dynamic parameters?
I can't just use the CriteriaBuilder join, as the foreign keys are mapped only as basic attributes. I also want to keep the dynamic approach, because the controller endpoint can look like this:
@GetMapping
public List<NtoM> find(@RequestParam(value = "id", required = false, defaultValue = "0") Long id,
        @RequestParam(value = "fiiId", required = false, defaultValue = "0") Long fiiId,
        @RequestParam(value = "fooId", required = false, defaultValue = "0") Long fooId){
Specification<NtoM> spec = ... //build as AND construct of all parameters (if not null or empty add it)
// TODO instead of the SELECT * FROM myschema.ntom the custom query here!
return repo.findAll(spec);
}
How can I do this? I could also use the EntityManager and criteriaBuilder.createTupleQuery(), but that does not seem to work either (I can't join the tables, as there is no ManyToOne between them).
Why aren't there relationships in your domain model? This can be handled efficiently if you change your model to use relationships. Here is how I would approach the problem:
Start by creating the relationships.
#Entity("YEAR")
public class Year{
#Id
#GeneratedValue(strategy = GenerationType.IDENTITY)
#Column(name = "ID", unique = true, nullable = false)
public Long id;
#Column(name = "NAME", nullable = false, length = 10)
public Long name;
...
}
#Entity("FOO")
public class Foo {
#Id
#GeneratedValue(strategy = GenerationType.IDENTITY)
#Column(name = "ID", unique = true, nullable = false)
public Long id;
//#Column(name = "FK_YEAR", nullable = false)
//public Long yearId;
#ManyToOne(fetch = FetchType.LAZY)
#JoinColumn(name = "FK_YEAR", referencedColumnName = "ID")
public Year year;
#Column(name = "NAME", nullable = false, length = 10)
public String name;
...
}
#Entity("FII")
public class Fii {
#Id
#GeneratedValue(strategy = GenerationType.IDENTITY)
#Column(name = "ID", unique = true, nullable = false)
public Long id;
//#Column(name = "FK_YEAR", nullable = false)
//public Long yearId;
#ManyToOne(fetch = FetchType.LAZY)
#JoinColumn(name = "FK_YEAR", referencedColumnName = "ID")
public Year year;
#Column(name = "CODE", nullable = false, length = 10)
public String code;
...
}
#Entity("NTOM")
public class NtoM {
#Id
#GeneratedValue(strategy = GenerationType.IDENTITY)
#Column(name = "ID", unique = true, nullable = false)
public Long id;
//#Column(name = "FK_FOO", nullable = false)
//public Long fooId;
#ManyToOne(fetch = FetchType.LAZY)
#JoinColumn(name = "FK_FOO", referencedColumnName = "ID")
public Foo foo;
//#Column(name = "FK_FII", nullable = false)
//public Long fiiId;
#ManyToOne(fetch = FetchType.LAZY)
#JoinColumn(name = "FK_FII", referencedColumnName = "ID")
public Fii fii;
#Column(name = "STATE", nullable = false)
public Boolean state;
#Column(name = "VALUES", length = 500)
public String values;
...
}
Change the controller to collect the request parameters into a map.
Then delegate the data-retrieval logic to the service layer (you can also create a custom repository implementation).
@GetMapping
public List<NtoM> find(@RequestParam Map<String, String> requestParams) {
return service.findByRequestParams(requestParams);
}
In the service class
public List<NtoM> findByRequestParams(Map<String, String> requestParams) {
return repository.findAll(createSpec(requestParams));
}
private Specification<NtoM> createSpec(Map<String, String> requestParams) {
return (root, query, criteriaBuilder) -> {
List<Predicate> predicates = new ArrayList<>();
Join<NtoM, Fii> firstJoin = root.join("fii", JoinType.INNER);
Join<NtoM, Foo> secondJoin = root.join("foo", JoinType.LEFT);
String value = requestParams.get("id");
if(StringUtils.isNotBlank(value)) {
// filter on the NtoM id itself
Predicate id = criteriaBuilder.equal(root.get("id"), Long.parseLong(value));
predicates.add(id);
}
value = requestParams.get("fiiId");
if(StringUtils.isNotBlank(value)) {
// filter on the joined Fii's id
Predicate fii = criteriaBuilder.equal(firstJoin.get("id"), Long.parseLong(value));
predicates.add(fii);
}
value = requestParams.get("fooId");
if(StringUtils.isNotBlank(value)) {
// filter on the joined Foo's id
Predicate foo = criteriaBuilder.equal(secondJoin.get("id"), Long.parseLong(value));
predicates.add(foo);
}
// Later you can add new options without breaking the existing API,
// for example a LIKE search on values
value = requestParams.get("values");
if(StringUtils.isNotBlank(value)) {
Predicate likeValues = criteriaBuilder.like(root.get("values"), "%" + value + "%");
predicates.add(likeValues);
}
return criteriaBuilder.and(predicates.toArray(Predicate[]::new));
};
}
In Spring Boot 2 with JPA, I have the following two many-to-many entities.
1- Labor:
@Entity
public class Labor {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private long id;
@Column(length = 100, nullable = false)
private String name;
@Column(length = 50)
private String mobile;
private Date dateOfBirth;
private boolean male;
private boolean active;
private String brife;
@Column(length = 500)
private String specifcation;
@ManyToMany(cascade = CascadeType.ALL)
@JoinTable(name = "labor_tag",
joinColumns = @JoinColumn(name = "labor_id"),
inverseJoinColumns = @JoinColumn(name = "tag_id"))
private Set<Tag> tags = new HashSet<>();
}
and Tag table:
@Entity
public class Tag {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private long id;
@Column(length = 100, unique = true)
private String name;
private boolean active = true;
@ManyToMany(cascade = CascadeType.ALL, mappedBy = "tags")
@JsonIgnore
private Set<Labor> labors = new HashSet<>();
}
Then I defined a Labor repository to query Labors by a certain Tag ID, gender, or age range:
@Repository
public interface LaborRepository extends JpaRepository<Labor, Long> {
@Query("select l from Labor l join l.tags t where (:tid is null or t.id in :tid) and " +
"(:isMale is null or :isMale = TRUE) and " +
"((:startYear is null or :endYear is null or :startYear > :endYear) or year(l.dateOfBirth) >= :startYear and year(l.dateOfBirth) <= :endYear)")
Page<Labor> findLaborsByCondition(@Param("tid") final long[] tid,
@Param("isMale") final Boolean isMale,
@Param("startYear") final Integer startYear,
@Param("endYear") final Integer endYear,
final Pageable pageable);
}
When I use this repository in my controller, I find that the totalElements property of the returned Page counts the records in labor_tag (in this case 16 records), but what I actually want is for totalElements to count the Labors matching the given conditions. Does JPA's Pageable support such a query, or how can I find a workaround?
Thanks
After the join there will be duplicate Labor rows, and totalElements is the total number of rows returned by the query. So you should use distinct on Labor to get the count of distinct Labor entities:
#Query("select distinct l from Labor l join l.tags t where (:tid is null or t.id in :tid) and " +
"(:isMale is null or :isMale = TRUE) and " +
"((:startYear is null or :endYear is null or :startYear > :endYear) or year(l.dateOfBirth) >= :startYear and year(l.dateOfBirth) <= :endYear)")
I have the following many-to-many relationship between WorkDay (which holds the @ManyToMany mapping) and Event.
WorkDay entity
@Entity
@Table(name = "WORK_DAY", uniqueConstraints = { @UniqueConstraint(columnNames = { "WORKER_ID", "DAY_ID" }) })
@NamedQueries({
@NamedQuery(name = WorkDay.GET_WORK_DAYS_BY_MONTH, query = "select wt from WorkDay wt where wt.worker = :worker and to_char(wt.day.day, 'yyyyMM') = :month order by wt.day"),
@NamedQuery(name = WorkDay.GET_WORK_DAY, query = "select wt from WorkDay wt where wt.worker = :worker and wt.day = :day") })
public class WorkDay extends SuperClass {
private static final long serialVersionUID = 1L;
public static final String GET_WORK_DAYS_BY_MONTH = "WorkTimeDAO.getWorkDaysByMonth";
public static final String GET_WORK_DAY = "WorkTimeDAO.getWorkDay";
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "WORKER_ID", nullable = false)
private Worker worker;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "DAY_ID", nullable = false)
private Day day;
@Column(name = "COMING_TIME")
@Convert(converter = LocalDateTimeAttributeConverter.class)
private LocalDateTime comingTime;
@Column(name = "OUT_TIME")
@Convert(converter = LocalDateTimeAttributeConverter.class)
private LocalDateTime outTime;
@Enumerated(EnumType.STRING)
@Column(name = "STATE", length = 16, nullable = false)
private WorkDayState state = WorkDayState.NO_WORK;
@ManyToMany(fetch = FetchType.LAZY, cascade = CascadeType.ALL)
@JoinTable(name = "WORK_DAY_EVENT", joinColumns = {
@JoinColumn(name = "WORK_DAY_ID", nullable = false)}, inverseJoinColumns = {
@JoinColumn(name = "EVENT_ID", nullable = false)})
@OrderBy(value = "startTime desc")
private List<Event> events = new ArrayList<>();
protected WorkDay() {
}
public WorkDay(Worker worker, Day day) {
this.worker = worker;
this.day = day;
this.state = WorkDayState.NO_WORK;
}
}
Event entity
@Entity
@Table(name = "EVENT")
public class Event extends SuperClass {
@Column(name = "DAY", nullable = false)
@Convert(converter = LocalDateAttributeConverter.class)
private LocalDate day;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "TYPE_ID", nullable = false)
private EventType type;
@Column(name = "TITLE", nullable = false, length = 128)
private String title;
@Column(name = "DESCRIPTION", nullable = true, length = 512)
private String description;
@Column(name = "START_TIME", nullable = false)
@Convert(converter = LocalDateTimeAttributeConverter.class)
private LocalDateTime startTime;
@Column(name = "END_TIME", nullable = true)
@Convert(converter = LocalDateTimeAttributeConverter.class)
private LocalDateTime endTime;
@Enumerated(EnumType.STRING)
@Column(name = "STATE", nullable = false, length = 16)
private EventState state;
protected Event() {
}
}
The UI form is attached for clarity.
When I push the clock button with the run icon for the first time, it means "create event and start work day"; in the bean, the following methods are called:
public void startEvent() {
stopLastActiveEvent();
Event creationEvent = new Event(workDay.getDay().getDay(), selectedEventType, selectedEventType.getTitle(),
LocalDateTime.now());
String addEventMessage = workDay.addEvent(creationEvent);
if (Objects.equals(addEventMessage, "")) {
em.persist(creationEvent);
if (workDay.isNoWork()
&& !creationEvent.getType().getCategory().equals(EventCategory.NOT_INFLUENCE_ON_WORKED_TIME)) {
startWork();
}
em.merge(workDay);
} else {
Notification.warn("Невозможно создать событие", addEventMessage);
}
cleanAfterCreation();
}
public String addEvent(Event additionEvent) {
if (!additionEvent.getType().getCategory().equals(NOT_INFLUENCE_ON_WORKED_TIME)
&& isPossibleTimeBoundaryForEvent(additionEvent.getStartTime(), additionEvent.getEndTime())) {
events.add(additionEvent);
changeTimeBy(additionEvent);
} else {
return "Пересечение временых интервалов у событий";
}
Collections.sort(events, new EventComparator());
return "";
}
private void startWork() {
workDay.setComingTime(workDay.getLastWorkEvent().getStartTime());
workDay.setState(WorkDayState.WORKING);
}
In the log I see:
insert into event table
update work_day table
insert into work_day_event table
On the UI only the attached frame is updated, and everything looks fine: the current WorkDay object has one element in its events collection, and all the data is inserted into the DB. But if I then edit that event row...
The event row edit listener:
public void onRowEdit(RowEditEvent event) {
Event editableEvent = (Event) event.getObject();
LocalDateTime startTime = fixDate(editableEvent.getStartTime(), editableEvent.getDay());
LocalDateTime endTime = fixDate(editableEvent.getEndTime(), editableEvent.getDay());
if (editableEvent.getState().equals(END) && startTime.isAfter(endTime)) {
Notification.warn("Невозможно сохранить изменения", "Время окончания события больше времени начала");
refreshEvent(editableEvent);
return;
}
if (workDay.isPossibleTimeBoundaryForEvent(startTime, endTime)) {
editableEvent.setStartTime(startTime);
editableEvent.setEndTime(endTime);
workDay.changeTimeBy(editableEvent);
em.merge(workDay);
em.merge(editableEvent);
} else {
refreshEvent(editableEvent);
Notification.warn("Невозможно сохранить изменения", "Пересечение временых интервалов у событий");
}
}
...then a new row with the same work_day_id and event_id data is inserted into work_day_event. If I edit the row again, one more insert happens, and so on. As a result I have several identical rows in the work_day_event table. Why does this happen?
Link to the GitHub project repository (see the ver-1.1.0-many-to-many-problem branch).
Change CascadeType.ALL to CascadeType.MERGE for events in the WorkDay entity
Use this code
@ManyToMany(fetch = FetchType.LAZY, cascade = CascadeType.MERGE)
instead of
@ManyToMany(fetch = FetchType.LAZY, cascade = CascadeType.ALL)
Also, do not use an ArrayList; use a HashSet, because an ArrayList allows duplicates.
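Put together, a sketch of the revised mapping in WorkDay could look like this (an illustration of the advice above, assuming the rest of the code can be switched from List to Set; Event should then have a sensible equals()/hashCode(), for example based on its database id, so the Set can actually reject duplicates):
// Sketch: cascade only MERGE and hold the events in a Set so the same Event
// cannot end up in the collection (and thus in WORK_DAY_EVENT) twice.
@ManyToMany(fetch = FetchType.LAZY, cascade = CascadeType.MERGE)
@JoinTable(name = "WORK_DAY_EVENT", joinColumns = {
        @JoinColumn(name = "WORK_DAY_ID", nullable = false)}, inverseJoinColumns = {
        @JoinColumn(name = "EVENT_ID", nullable = false)})
private Set<Event> events = new HashSet<>();
If the descending ordering by startTime is still needed, it can be applied when reading the collection instead of with @OrderBy on a List.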
For more info about CascadeType, follow these tutorials:
Hibernate JPA Cascade Types
Cascading best practices
I think the simple solution is to remove the cascade on the many-to-many relationship and do the job manually. I see you are already doing it redundantly anyway, so try removing your CascadeType.ALL:
@ManyToMany(fetch = FetchType.LAZY, cascade = CascadeType.ALL)
How to persist @ManyToMany relation - duplicate entry or detached entity
For background:
I have built a module that captures a list of historical events that occur against an asset over its life, using JPA Specifications with spring-data-jpa and Hibernate to run dynamic queries through the JpaSpecificationExecutor interface. The HistoricalEvent JPA object below has a many-to-one to the asset the event is directly against, and a many-to-many to other assets the event is also associated with. I am trying to write a JPA Specification predicate that pulls all historical events for a given asset that the asset is either directly against or associated with, controlled by an includeAssociations flag in the predicate. When I execute the predicate with includeAssociations set to true I do not get the correct results: I would expect it to return, at a minimum, all the events the asset is directly against (as if includeAssociations were false), plus any events it is indirectly associated with. I need help figuring out why this predicate is not returning what I expect. Any help is much appreciated!
Here is my Historical Event JPA object:
@Entity
@Table(name = "LC_HIST_EVENT_TAB")
public class HistoricalEvent extends BaseEntity implements Comparable<HistoricalEvent>, Serializable
{
private static final long serialVersionUID = 1L;
@ManyToOne(targetEntity = Asset.class, cascade = CascadeType.ALL, fetch = FetchType.EAGER)
@JoinColumn(nullable = false, name = "ASSET_ID")
private Asset asset;
@ManyToMany(fetch = FetchType.LAZY, cascade = CascadeType.ALL, targetEntity = Asset.class)
@JoinTable(name = "LC_HIST_EVENT_ASSETS", joinColumns =
{
@JoinColumn(name = "HIST_EVENT_ID", referencedColumnName = "id")
}, inverseJoinColumns =
{
@JoinColumn(name = "ASSET_ID", referencedColumnName = "id")
}, uniqueConstraints =
{
@UniqueConstraint(columnNames =
{
"HIST_EVENT_ID", "ASSET_ID"
})
})
@BatchSize(size=10)
@OrderBy("partCatalogItem.partID, serialNumber ASC")
private Set<Asset> associatedAssets;
@Column(name = "START_DATE", nullable = true)
@Temporal(value = TemporalType.TIMESTAMP)
private Calendar startDate;
@Column(name = "END_DATE", nullable = true)
@Temporal(value = TemporalType.TIMESTAMP)
private Calendar endDate;
}
JPA Metamodel for Historical Event:
@StaticMetamodel(HistoricalEvent.class)
public class HistoricalEvent_ extends BaseEntity_
{
public static volatile SingularAttribute<HistoricalEvent, Asset> asset;
public static volatile SetAttribute<HistoricalEvent, Asset> associatedAssets;
public static volatile SingularAttribute<HistoricalEvent, Calendar> startDate;
public static volatile SingularAttribute<HistoricalEvent, Calendar> endDate;
public static volatile SingularAttribute<HistoricalEvent, String> type;
public static volatile SingularAttribute<HistoricalEvent, String> description;
public static volatile SingularAttribute<HistoricalEvent, HistoricalEvent> triggeringEvent;
public static volatile SetAttribute<HistoricalEvent, HistoricalEvent> associatedEvents;
public static volatile MapAttribute<HistoricalEvent, String, HistoricalEventMap> data;
}
Here is my Asset JPA Object:
@Entity
@Table(name = "LC_ASSET_TAB")
public class Asset extends BaseEntity implements Comparable<Asset>, Serializable
{
private static final long serialVersionUID = 1L;
@ManyToOne(fetch = FetchType.EAGER, cascade = CascadeType.ALL, targetEntity = PartCatalog.class)
@JoinColumn(name = "PART_CATALOG_ID", nullable = false)
private PartCatalog partCatalogItem;
@Column(name = "SERIAL_NO", nullable = false)
private String serialNumber;
@Column(name = "DATE_INTO_SERVICE", nullable = false)
@Temporal(value = TemporalType.TIMESTAMP)
private Calendar dateIntoService;
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY, mappedBy = "asset", targetEntity = AssetMap.class)
@MapKey(name = "fieldName")
@BatchSize(size=25)
private Map<String, AssetMap> data;
}
Asset Metamodel:
@StaticMetamodel(Asset.class)
public class Asset_ extends BaseEntity_
{
public static volatile SingularAttribute<Asset, PartCatalog> partCatalogItem;
public static volatile SingularAttribute<Asset, String> serialNumber;
public static volatile SingularAttribute<Asset, Calendar> dateIntoService;
public static volatile MapAttribute<Asset, String, AssetMap> data;
}
Here is my Part Catalog JPA object:
@Entity
@Table(name = "LC_PART_CATALOG_TAB")
public class PartCatalog extends BaseEntity implements Comparable<PartCatalog>, Serializable
{
private static final long serialVersionUID = 1L;
@Column(name = "PART_ID", length=100, nullable = false)
private String partID;
@Column(name = "NSN", length=100, nullable = true)
private String nsn;
@Column(name = "DESCRIPTION", length=250, nullable = false)
private String description;
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.EAGER, mappedBy = "partCatalogItem", targetEntity = PartCatalogMap.class)
@MapKey(name = "fieldName")
private Map<String, PartCatalogMap> data;
}
Part Catalog Metamodel:
@StaticMetamodel(PartCatalog.class)
public class PartCatalog_ extends BaseEntity_
{
public static volatile SingularAttribute<PartCatalog, String> partID;
public static volatile SingularAttribute<PartCatalog, String> nsn;
public static volatile SingularAttribute<PartCatalog, String> description;
public static volatile MapAttribute<PartCatalog, String, PartCatalogMap> data;
}
Specification Predicate for returning historical events by a given Part Number and Serial Number:
PROBLEM: If includeAssociations is false, it returns fine; however, as soon as it is true, it returns the wrong list of associations and never returns any of the events the asset is directly tied to, as it does when includeAssociations is false. This is where I need help writing the Criteria Builder query so it properly pulls the data.
These are the two JPQL queries I am trying to combine into the Predicate using the Criteria API:
Normal:
#Query("SELECT he FROM HistoricalEvent he WHERE he.asset.partCatalogItem.partID =:partID AND he.asset.serialNumber =:serialNumber " +
"AND he.startDate >:startDate AND he.endDate <:endDate")
Association:
#Query("SELECT he FROM HistoricalEvent he INNER JOIN he.associatedAssets associated WHERE associated.partCatalogItem.partID =:partID AND associated.serialNumber =:serialNumber " +
"AND he.startDate >:startDate AND he.endDate <:endDate");
/**
* Creates a specification used to find historical events by a given asset part number and serial
* parameter.
*
* @param partID - part identifier
* @param serialNumber - serial number
* @return Historical Event Specification
*/
public static Specification<HistoricalEvent> hasPartAndSerial(final String partID, final String serialNumber, final Boolean includeAssociations)
{
return new Specification<HistoricalEvent>() {
@Override
public Predicate toPredicate(Root<HistoricalEvent> historicalEventRoot,
CriteriaQuery<?> query, CriteriaBuilder cb) {
if (partID == null || partID.isEmpty())
{
return null;
}
if(serialNumber == null || serialNumber.isEmpty())
{
return null;
}
Path<Asset> assetOnEvent = historicalEventRoot.get(HistoricalEvent_.asset);
Path<PartCatalog> partCatalogItem = assetOnEvent.get(Asset_.partCatalogItem);
Expression<String> partIdToMatch = partCatalogItem.get(PartCatalog_.partID);
Expression<String> serialToMatch = assetOnEvent.get(Asset_.serialNumber);
if(includeAssociations)
{
SetJoin<HistoricalEvent, Asset> assetsAssociatedToEvent = historicalEventRoot.join(HistoricalEvent_.associatedAssets);
Path<PartCatalog> partCatalogItemFromAssociatedAsset = assetsAssociatedToEvent.get(Asset_.partCatalogItem);
Expression<String> partIdToMatchFromAssociatedAsset = partCatalogItemFromAssociatedAsset.get(PartCatalog_.partID);
Expression<String> serialToMatchFromAssociatedAsset = assetsAssociatedToEvent.get(Asset_.serialNumber);
return cb.or(cb.and(cb.equal(cb.lower(partIdToMatch), partID.toLowerCase()), cb.equal(cb.lower(serialToMatch), serialNumber.toLowerCase())),
cb.and(cb.equal(cb.lower(partIdToMatchFromAssociatedAsset), partID.toLowerCase()), cb.equal(cb.lower(serialToMatchFromAssociatedAsset), serialNumber.toLowerCase())));
}
else
{
return cb.and(cb.equal(cb.lower(partIdToMatch), partID.toLowerCase()), cb.equal(cb.lower(serialToMatch), serialNumber.toLowerCase()));
}
}
};
}
Finally I am calling this to find the historical events:
@Override
public Page<HistoricalEvent> getByCriteria(String type, String partID,
String serialNumber, Calendar startDate, Calendar endDate,
Boolean includeAssociations, Integer pageIndex, Integer recordsPerPage)
{
LOGGER.info("HistoricalEventDatabaseServiceImpl - getByCriteria() - Searching historical event repository for type of " + type + " , part id of " + partID +
" , serial number of " + serialNumber + " , start date of " + startDate + " , end date of " + endDate + ", include associations flag of " + includeAssociations
+ " , pageIndex " + pageIndex + " and records per page of " + recordsPerPage);
Page<HistoricalEvent> requestedPage = historicalEventRepository.findAll(Specifications
.where(HistoricalEventSpecifications.hasType(type))
.and(HistoricalEventSpecifications.greaterThanOrEqualToStartDate(startDate))
.and(HistoricalEventSpecifications.lessThanOrEqualToEndDate(endDate))
.and(HistoricalEventSpecifications.hasPartAndSerial(partID, serialNumber, includeAssociations)),
DatabaseServicePagingUtil.getHistoricalEventPagingSpecification(pageIndex, recordsPerPage));
LOGGER.info("HistoricalEventDatabaseServiceImpl - getByCriteria() - Found " + requestedPage.getTotalElements() + " that will comprise " + requestedPage.getTotalPages() + " pages of content.");
return requestedPage;
}
UPDATE: I have been able to get the specification working for the case where the historical event is either directly or indirectly associated, using the following:
Predicate predicate1 = cb.equal(cb.lower(partIdToMatch), partID.toLowerCase());
Predicate predicate2 = cb.equal(cb.lower(serialToMatch), serialNumber.toLowerCase());
Predicate predicate3 = cb.or(predicate1, predicate2);
Predicate predicate4 = cb.equal(cb.lower(partIdToMatchFromAssociatedAsset), partID.toLowerCase());
Predicate predicate5 = cb.equal(cb.lower(serialToMatchFromAssociatedAsset), serialNumber.toLowerCase());
Predicate predicate6 = cb.and(predicate4, predicate5);
Predicate predicate7 = cb.or(predicate3, predicate6);
When I return the predicate I only get results matching predicate6, not either one as I would expect. I want it to pull events where either predicate condition matches. Each predicate returns the right data on its own, but when I use cb.or it doesn't combine the results as I would expect. What am I missing?
You have to start by printing the generated query and the bound parameter values; just enable the relevant properties.
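For example, with Spring Boot and Hibernate 5 this is typically done with properties like the following (the logger categories are Hibernate's own; adjust if your versions differ):
# application.properties - log the generated SQL and the bound parameter values
spring.jpa.show-sql=true
logging.level.org.hibernate.SQL=DEBUG
logging.level.org.hibernate.type.descriptor.sql.BasicBinder=TRACE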
After that you have to analyze the query and run some tests with different combinations to see where your JPA Specification is failing.
There is no magic way to do that, and it's hard and painful :(
Good luck!