I have a parent which stores a list of children. When I update the children (add/edit/remove), is there a way to automatically decide which child to remove or edit based on the foreign key? Or do I have to manually check through all the children to see which ones are new or modified?
Parent Class
@Entity
@EntityListeners(PermitEntityListener.class)
public class Permit extends Identifiable {

    @OneToMany(fetch = FetchType.LAZY, cascade = CascadeType.ALL, mappedBy = "permit")
    private List<Coordinate> coordinates;
}
Child Class
@Entity
public class Coordinate extends Identifiable {

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "permit_id", referencedColumnName = "id")
    private Permit permit;

    private double lat;
    private double lon;
}
Parent's Controller
@PutMapping("")
public @ResponseBody ResponseEntity<?> update(@RequestBody Permit permit) {
    logger.debug("update() with body {} of id {}", permit, permit.getId());
    if (!repository.findById(permit.getId()).isPresent()) {
        return ResponseEntity.status(HttpStatus.BAD_REQUEST).body(null);
    }
    Permit returnedEntity = repository.save(permit);
    repository.flush();
    return ResponseEntity.ok(returnedEntity);
}
=EDIT=
Controller Create
@Override
@PostMapping("")
public @ResponseBody ResponseEntity<?> create(@RequestBody Permit permit) {
    logger.debug("create() with body {}", permit);
    if (permit == null || permit.getId() != null) {
        return ResponseEntity.status(HttpStatus.BAD_REQUEST).body(null);
    }
    List<Coordinate> coordinates = permit.getCoordinates();
    if (coordinates != null) {
        for (int x = 0; x < coordinates.size(); ++x) {
            Coordinate coordinate = coordinates.get(x);
            coordinate.setPermit(permit);
        }
    }
    Permit returnedEntity = repository.save(permit);
    repository.flush();
    return ResponseEntity.ok(returnedEntity);
}
Controller Update
@PutMapping("")
public @ResponseBody ResponseEntity<?> update(@RequestBody Permit permit) {
    logger.debug("update() with body {} of id {}", permit, permit.getId());
    if (!repository.findById(permit.getId()).isPresent()) {
        return ResponseEntity.status(HttpStatus.BAD_REQUEST).body(null);
    }
    List<Coordinate> repoCoordinate = coordinateRepository.findByPermitId(permit.getId());
    List<Long> coordinateIds = new ArrayList<Long>();
    for (Coordinate coordinate : permit.getCoordinates()) {
        coordinate.setPermit(permit);
        // if it is an existing coordinate, save the ID in coordinateIds
        if (coordinate.getId() != null) {
            coordinateIds.add(coordinate.getId());
        }
    }
    // loop through the coordinates in the repository to find which coordinates to remove
    for (Coordinate coordinate : repoCoordinate) {
        if (!(coordinateIds.contains(coordinate.getId()))) {
            coordinateRepository.deleteById(coordinate.getId());
        }
    }
    Permit returnedEntity = repository.save(permit);
    repository.flush();
    return ResponseEntity.ok(returnedEntity);
}
I have tested this and it works, but is there no simpler way of doing this?
You were close to the solution. The only thing you're missing is orphanRemoval=true on your one-to-many mapping:
@Entity
@EntityListeners(PermitEntityListener.class)
public class Permit extends Identifiable {

    @OneToMany(mappedBy = "permit", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<Coordinate> coordinates;
}
Flagging the mapping for orphan removal tells the underlying ORM to delete any child entity that no longer belongs to a parent entity. So when you remove a child element from the list, it will be deleted as soon as you save the parent.
Creating new elements and updating existing ones is driven by the CascadeType. Since you have CascadeType.ALL, every element in the list without an ID will be saved to the database and assigned a new ID when you save the parent entity, and every element that is already in the list and has an ID will be updated.
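With orphanRemoval in place, the update endpoint from the question can, in principle, be reduced to a plain save. A rough sketch (assuming the repository and mappings from the question, and that the permit back-reference is set on each coordinate as discussed below):
@PutMapping("")
public @ResponseBody ResponseEntity<?> update(@RequestBody Permit permit) {
    if (permit.getId() == null || !repository.findById(permit.getId()).isPresent()) {
        return ResponseEntity.status(HttpStatus.BAD_REQUEST).body(null);
    }
    // New coordinates (id == null) are inserted, existing ones are updated (CascadeType.ALL),
    // and coordinates missing from the incoming list are removed (orphanRemoval).
    Permit returnedEntity = repository.save(permit);
    repository.flush();
    return ResponseEntity.ok(returnedEntity);
}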
On a side note, you might need to update the setter for the coordinates list to look something like this:
public void setCoordinates(List<Coordinate> coordinates) {
    this.coordinates = coordinates;
    // Keep both sides of the relation in sync.
    this.coordinates.forEach(coordinate -> coordinate.setPermit(this));
}
Or simply use #JsonManagedReference and #JsonBackReference if you're working with JSON.
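For illustration, a sketch of where those Jackson annotations could go on the mappings above (assuming the coordinates are sent nested inside the permit JSON):
// Parent side (Permit):
@OneToMany(mappedBy = "permit", cascade = CascadeType.ALL, orphanRemoval = true)
@JsonManagedReference
private List<Coordinate> coordinates;

// Child side (Coordinate):
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "permit_id", referencedColumnName = "id")
@JsonBackReference
private Permit permit;
During deserialization Jackson then populates the permit back-reference on each coordinate, so the manual setPermit loop in the controller becomes unnecessary.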
I have a parent which stores a list of children.
Let's write the DDL for it.
TABLE parent (
    id integer PRIMARY KEY
)
TABLE child (
    id integer PRIMARY KEY,
    parent_id integer REFERENCES parent(id)
)
When I update the children (add/edit/remove), is there a way to automatically decide which child to remove or edit based on the foreign key?
Assuming you have a new child #5 bound to parent #2 and:
the FK in the DDL is correct,
the entities know the FK,
you are using the same JPA context,
the transaction is executed correctly,
then every call to parent.getChilds() must(!) return all the entities that existed before your transaction was executed, plus the same instance of the entity that you have just committed to the database.
Likewise, if you remove child #5 of parent #2 and the transaction executes successfully, parent.getChilds() must return all entities except child #5.
Special case:
If you remove parent #2 and you have cascade-delete in the DDL as well as in the Java code, all children must be removed from the database along with parent #2. In that case parent #2 is no longer bound to the JPA context, and neither are the children of parent #2.
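In pseudo-DDL, cascade-delete on the child table could look like this (a sketch extending the tables above):
TABLE child (
    id integer PRIMARY KEY,
    parent_id integer REFERENCES parent(id) ON DELETE CASCADE
)
On the Java side, the counterpart is cascade = CascadeType.REMOVE (or CascadeType.ALL) on the @OneToMany mapping.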
=Edit=
You could use merge. This will work for constructs like this:
POST {
"coordinates": [{
"lat":"51.33",
"lon":"22.44"
},{
"lat":"50.22",
"lon":"22.33"
}]
}
It will create one row in the "permit" table and two rows in the "coordinate" table, with both coordinates bound to the permit row. The result will include the generated IDs.
But: you will have to do the validation work yourself (check that the id is null, check that the coordinates do not refer to a different permit, ...)!
The removal of coordinates must be done using the DELETE method:
DELETE /permit/972/coordinate/3826648305
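A hypothetical controller method for such a route might look like this (the mapping path, parameter names and repository calls are assumptions for illustration, not taken from the question):
@DeleteMapping("/{permitId}/coordinate/{coordinateId}")
public ResponseEntity<?> deleteCoordinate(@PathVariable Long permitId,
                                          @PathVariable Long coordinateId) {
    Optional<Coordinate> coordinate = coordinateRepository.findById(coordinateId);
    // Only delete the coordinate if it exists and actually belongs to the given permit.
    if (!coordinate.isPresent() || !permitId.equals(coordinate.get().getPermit().getId())) {
        return ResponseEntity.status(HttpStatus.NOT_FOUND).body(null);
    }
    coordinateRepository.deleteById(coordinateId);
    return ResponseEntity.noContent().build();
}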
I've run into a problem while developing a Spring Boot application with the Criteria API.
I have a simple Employer entity, which contains a set of Job IDs (not entities; they are pulled out using a repository when needed). Employer and Job are in a many-to-many relationship. This mapping is only used for the purpose of finding Employers with no jobs.
@Entity
public class Employer {

    @ElementCollection
    @CollectionTable(
            name = "EMPLOYEE_JOBS",
            joinColumns = @JoinColumn(name = "EMP_ID"))
    @Column(name = "JOB_ID")
    private final Set<String> jobs = new HashSet<>(); // IDs of the jobs of an employer
}
Then I have a generic function which returns a predicate (Specification) for a given attributePath and command, for any IEntity implementation.
public <E extends IEntity> Specification<E> createPredicate(String attributePath, String command) {
    return (r, q, b) -> {
        // Walk the attribute path, e.g. "employerFinished.jobs" -> root.get("employerFinished").get("jobs")
        Path<?> currentPath = r;
        for (String attr : attributePath.split("\\.")) {
            currentPath = currentPath.get(attr);
        }
        if (Collection.class.isAssignableFrom(currentPath.getJavaType())) {
            // currentPath points to a PluralAttribute
            if (command.equalsIgnoreCase("empty")) {
                return b.isEmpty((Expression<Collection<?>>) currentPath);
            }
        }
        // Fallback for unsupported commands (omitted from the original snippet).
        throw new IllegalArgumentException("Unsupported command: " + command);
    };
}
If I want to get a list of all employers who currently have no job, I would like to be able to create the predicate as follows:
Specification<Employer> spec = createPredicate("jobs", "empty");
// or, if I want only `Work`s that were done by an employer with no job at this moment
Specification<Work> spec = createPredicate("employerFinished.jobs", "empty");
This unfortunately does not work and throws the following exception:
org.hibernate.hql.internal.ast.QuerySyntaxException:
unexpected end of subtree
[select generatedAlias0 from Employer as generatedAlias0
where generatedAlias0.jobs is empty]
Is there a workaround to make this work?
This bug in Hibernate has been known since September 2011 but sadly went unfixed for a long time. (Update: the bug is fixed as of 5.4.11.)
https://hibernate.atlassian.net/browse/HHH-6686
Luckily there is a very easy workaround. Instead of:
"where generatedAlias0.jobs is empty"
you can use
"where size(generatedAlias0.jobs) = 0"
This way the query will work as expected.
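Applied to the createPredicate method from the question, the workaround could look roughly like this (a sketch; the cast mirrors the one in the original code):
if (Collection.class.isAssignableFrom(currentPath.getJavaType())) {
    if (command.equalsIgnoreCase("empty")) {
        // Workaround for HHH-6686: compare the collection size to 0
        // instead of using isEmpty(), which generates the broken "is empty" HQL.
        return b.equal(b.size((Expression<Collection<?>>) currentPath), 0);
    }
}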
I am new to MongoDB and struggling to understand how document updates work.
I have a document called 'menu':
{
"someId":"id123",
"someProperty":"property123",
"list" : [{
"innerProperty":"property423"
}]
}
which maps to my entity:
@Document(collection = "menu")
public class Menu {

    @Id
    private String id;
    private String someId;
    private String someProperty;
    private List<SomeClass> list;
    // accessors
}
When I try to find and update this document like this, it does not update the document. It does find the menu, as it returns the original entity with its Id:
@Override
public Menu update(Menu menu) {
    Query query = new Query(
            Criteria.where("someId").is(menu.getSomeId()));
    Update update = Update.update("menu", menu);
    return mongoOperations.findAndModify(query, update,
            FindAndModifyOptions.options().returnNew(true), Menu.class);
}
But if I change it to this, it works:
@Override
public Menu update(Menu menu) {
    Query query = new Query(
            Criteria.where("someId").is(menu.getSomeId()));
    Update update = new Update().set("someProperty", menu.getSomeProperty())
            .set("list", menu.getList());
    return mongoOperations.findAndModify(query, update,
            FindAndModifyOptions.options().returnNew(true), Menu.class);
}
I don't really like this second method, where each element of the document is set individually; as you might imagine, I have a rather large document and this approach is prone to errors.
Why does the first method not work? And what could be a better approach to update the document?
Check out the docs for findAndModify - it returns the version of the document before the fields were modified. If you do a new find() straight after, you will see that your changes were actually saved to MongoDB.
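For example, a quick way to check this (a sketch reusing the mongoOperations field and the query from the question):
Menu reloaded = mongoOperations.findOne(
        new Query(Criteria.where("someId").is(menu.getSomeId())),
        Menu.class);
// 'reloaded' shows what is actually stored in MongoDB after the update.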
I am trying to make a List of all of the books in one Collection that are not present in another. My problem is that I need to compare based on book ID, so I can't just test whether a book from the first collection is contained in the second; I have to determine whether any book in the second collection has the same ID as a book in the first.
I have the below code to compare two collections of books and filter the first collection:
List<Book> parentBooks = listOfBooks1.stream()
        .filter(book -> !listOfBooks2.contains(book))
        .collect(Collectors.toList());
The code doesn't work correctly because I am comparing the objects themselves. I need to compare the objects based on the bookId instead of the whole book object. How should I change the code so it can do the comparison based on the bookId (book.getId())?
List<Book> books1 = ...;
List<Book> books2 = ...;
Set<Integer> ids = books2.stream()
.map(Book::getId)
.collect(Collectors.toSet());
List<Book> parentBooks = books1.stream()
.filter(book -> !ids.contains(book.getId()))
.collect(Collectors.toList());
The problem is complex, but it boils down to one thing: know your data. Are they immutable values, entities with an ID, are duplicate entries possible, etc.?
The code below works for immutable value objects (and allows duplicates).
It first tries to remove every entry of the before-list from a copy of the after-list.
What is left in the copy are the added elements. The entries of the before-list that could be removed from the copy are the unchanged ones.
The rest are the removed ones.
public class ListDiffer<T> {

    private List<T> addedList = new ArrayList<>();
    private List<T> unchangedList = new ArrayList<>();
    private List<T> removedList = new ArrayList<>();

    public ListDiffer(List<T> beforeList, List<T> afterList) {
        addedList.addAll(afterList); // Will contain only new elements once all elements of the before-list are removed.
        beforeList.forEach(e -> {
            boolean b = addedList.remove(e) ? unchangedList.add(e) : removedList.add(e);
        });
    }

    public List<T> getAddedList() {
        return addedList;
    }

    public List<T> getUnchangedList() {
        return unchangedList;
    }

    public List<T> getRemovedList() {
        return removedList;
    }
}
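Hypothetical usage with the book lists from the question (note that this relies on Book implementing equals/hashCode, e.g. based on its id; otherwise remove() falls back to reference equality):
ListDiffer<Book> differ = new ListDiffer<>(listOfBooks2, listOfBooks1);
List<Book> added = differ.getAddedList();         // books only in listOfBooks1
List<Book> unchanged = differ.getUnchangedList(); // books present in both lists
List<Book> removed = differ.getRemovedList();     // books only in listOfBooks2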
In my current project I have an entity which can be published to other systems. To keep track of the publications, the entity has a relation called "publications". I am using EclipseLink.
This entity bean also has a @PreUpdate-annotated method.
In order to keep the other systems' data up to date, I created an aspect that is executed around the call to the @PreUpdate method. Depending on which properties have changed, I need to remove some of the publications. Everything is working absolutely fine so far.
The problem I am having is that the portal-publishing component correctly sends delete commands and removes the publications from the entity's "publications" list. I can even see in the change set that JPA has noticed the "publications" property has changed. After the transaction is flushed, the cached entity correctly no longer has the deleted publications. Unfortunately the database still does, and when the system is restarted or the entity is loaded from the DB again, the publication metadata is back.
I tried almost everything. I even managed to get the deleted instances from the JPA change set in the aspect and tried to use the entityManager to delete them manually, but nothing actually worked. I seem to be unable to delete these relational entities. Currently I am thinking about using JDBC to delete them, but that would only be my last resort.
@Transactional
@Around("execution(* de.cware.services.truck.model.Truck.jpaPreUpdate(..))")
public Object truckPreUpdate(final ProceedingJoinPoint pjp) throws Throwable {
    if (alreadyExecutingMarker.get() != Boolean.TRUE) {
        alreadyExecutingMarker.set(Boolean.TRUE);
        final Truck truck = (Truck) pjp.getTarget();
        final JpaEntityManager jpaEntityManager = (JpaEntityManager) entityManager.getDelegate();
        // Not final: the change set is re-read further down after the copies have been deleted.
        UnitOfWorkChangeSet changeSet = jpaEntityManager.getUnitOfWork().getCurrentChanges();
        ObjectChangeSet objectChangeSet = changeSet.getObjectChangeSetForClone(truck);
        if (log.isDebugEnabled()) {
            log.debug("--------------------- Truck pre update check (" + truck.getId() + ") ---------------------");
        }

        ////////////////////////////////////////////////////////////////////////////////////////////////////////////
        // If the truck date has changed, revoke all publication copies.
        ////////////////////////////////////////////////////////////////////////////////////////////////////////////
        final ChangeRecord truckFreeDate = objectChangeSet.getChangesForAttributeNamed("lkwFreiDatum");
        if (truckFreeDate != null) {
            if (log.isDebugEnabled()) {
                log.debug("The date 'truckFreeDate' of truck with id '" + truck.getId() + "' has changed. " +
                        "Revoking all publications that are not marked as main applications");
            }
            for (final String portal : truck.getPublishedPortals()) {
                if (log.isDebugEnabled()) {
                    log.debug("- Revoking publications of copies to portal: " + portal);
                }
                portalService.deleteCopies(truck, portal);
                // Get any deleted portal references and use the entityManager to finally delete them.
                changeSet = jpaEntityManager.getUnitOfWork().getCurrentChanges();
                objectChangeSet = changeSet.getObjectChangeSetForClone(truck);
                final ChangeRecord publicationChanges = objectChangeSet.getChangesForAttributeNamed("publications");
                if (publicationChanges != null) {
                    if (publicationChanges instanceof CollectionChangeRecord) {
                        final CollectionChangeRecord collectionChanges =
                                (CollectionChangeRecord) publicationChanges;
                        @SuppressWarnings("unchecked")
                        final Collection<ObjectChangeSet> removedPublications =
                                (Collection<ObjectChangeSet>)
                                        collectionChanges.getRemoveObjectList().values();
                        for (final ObjectChangeSet removedPublication : removedPublications) {
                            final TruckPublication publication = (TruckPublication) ((org.eclipse.persistence.internal.sessions.ObjectChangeSet) removedPublication).getUnitOfWorkClone();
                            entityManager.remove(publication);
                        }
                    }
                }
            }
        }
    }
    // Proceed with the intercepted jpaPreUpdate() call.
    return pjp.proceed();
}
Chris
The issue is that PreUpdate is raised during the commit process, when the set of changes and the set of objects to delete have already been computed.
Ideally you would perform something like this in your application logic, not through a persistence event.
You could try executing a DeleteObjectQuery directly from your event (instead of using em.remove()); this may work, but in general it would be better to perform this logic in your application.
jpaEntityManager.getUnitOfWork().deleteObject(object);
Also note that getCurrentChanges() computes the changes; in a PreUpdate event the changes are already computed, so you should be able to use getUnitOfWorkChangeSet().
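If you want to try the DeleteObjectQuery route, a rough sketch could be the following (reusing the jpaEntityManager and publication variables from the aspect above; treat this as untested):
// Issue a DeleteObjectQuery for the removed publication instead of entityManager.remove().
DeleteObjectQuery deleteQuery = new DeleteObjectQuery();
deleteQuery.setObject(publication);
jpaEntityManager.getUnitOfWork().executeQuery(deleteQuery);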
The only solution I found was to create a new method that performs the delete and forces JPA to open a new transaction. Since I lose the change set this way, I had to find out manually which publications were deleted. I then simply call that helper method and the publications are correctly deleted, but I find this solution extremely ugly.
@Transactional
@Around("execution(* de.cware.services.truck.model.Truck.jpaPreUpdate(..))")
public Object truckPreUpdate(final ProceedingJoinPoint pjp) throws Throwable {
    if (alreadyExecutingMarker.get() != Boolean.TRUE) {
        alreadyExecutingMarker.set(Boolean.TRUE);
        final Truck truck = (Truck) pjp.getTarget();
        final JpaEntityManager jpaEntityManager = (JpaEntityManager) entityManager.getDelegate();
        final UnitOfWorkChangeSet changeSet = jpaEntityManager.getUnitOfWork().getCurrentChanges();
        final ObjectChangeSet objectChangeSet = changeSet.getObjectChangeSetForClone(truck);
        if (log.isDebugEnabled()) {
            log.debug("--------------------- Truck pre update check (" + truck.getId() + ") ---------------------");
        }

        ////////////////////////////////////////////////////////////////////////////////////////////////////////////
        // If the truck date has changed, revoke all publication copies.
        ////////////////////////////////////////////////////////////////////////////////////////////////////////////
        final ChangeRecord truckFreeDate = objectChangeSet.getChangesForAttributeNamed("lkwFreiDatum");
        if (truckFreeDate != null) {
            if (log.isDebugEnabled()) {
                log.debug("The date 'truckFreeDate' of truck with id '" + truck.getId() + "' has changed. " +
                        "Revoking all publications that are not marked as main applications");
            }
            removeCopyPublications(truck);
        }
    }
    // Proceed with the intercepted jpaPreUpdate() call.
    return pjp.proceed();
}

@Transactional(propagation = Propagation.REQUIRES_NEW)
protected void removeCopyPublications(Truck truck) {
    // Delete all not-main-publications.
    for (final String portal : truck.getPublishedPortals()) {
        if (log.isDebugEnabled()) {
            log.debug("- Revoking publications of copies to portal: " + portal);
        }
        // Remember the publications that existed before the delete call ...
        final Map<Integer, TruckPublication> oldPublications = new HashMap<Integer, TruckPublication>();
        for (final TruckPublication publication : truck.getPublications(portal)) {
            oldPublications.put(publication.getId(), publication);
        }
        portalService.deleteCopies(truck, portal);
        // ... and remove the ones that survived, so only the deleted publications remain in the map.
        for (final TruckPublication publication : truck.getPublications(portal)) {
            oldPublications.remove(publication.getId());
        }
        for (TruckPublication removedPublication : oldPublications.values()) {
            if (!entityManager.contains(removedPublication)) {
                removedPublication = entityManager.merge(removedPublication);
            }
            entityManager.remove(removedPublication);
            entityManager.flush();
        }
    }
}
Why doesn't my first version work?
I had a similar problem: I have a class and its children, and when I removed the children from the parent they were deleted from the DB. Then I attached new children using merge on the parent class (CascadeType.ALL) with JPA/EclipseLink, but the children were not created in the DB, only in the persistence context (JPA). I fixed this by doing the following:
1- I set shared-cache-mode to NONE in the persistence.xml file (see the snippet after the code below)
2- When I remove the children, I immediately execute this:
public void remove(T entity) {
    getEntityManager().remove(getEntityManager().merge(entity));
    getEntityManager().getEntityManagerFactory().getCache().evictAll();
}
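For step 1, the corresponding element in persistence.xml looks something like this (the persistence-unit name is just a placeholder):
<persistence-unit name="myPU">
    <!-- Disable the shared (second-level) cache so stale children are not served from it. -->
    <shared-cache-mode>NONE</shared-cache-mode>
</persistence-unit>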
And that's all. I hope this helps someone else.
CHECK REFERENCE
http://wiki.eclipse.org/EclipseLink/Examples/JPA/Caching
I've got this on the parent object:
@OneToMany(mappedBy = "idUser", cascade = CascadeType.MERGE)
public List<Directions> directions;
And in my controller I've got this:
public static void userUpdate(String apikey, JsonObject body) {
    if (validate(apikey)) {
        Long idUser = decode(apikey);
        User oldUser = User.findById(idUser);
        Map<String, User> userMap = new HashMap<String, User>();
        Type arrayListType = new TypeToken<Map<String, User>>(){}.getType();
        userMap = gson().fromJson(body, arrayListType);
        User user = userMap.get("user");
        oldUser.em().merge(user);
        oldUser.save();
    } else {
        forbidden();
    }
}
It updates the parent object, but when I change something on the child objects it doesn't update them, and there are no errors from Hibernate or Oracle either.
Does anyone know why it doesn't update the child objects?
Thanks all!
Updated with solution!
This is how it works for me; as @JB Nizet said, you've got to save the child objects too.
oldUser.em().merge(user);
oldUser.save();
for (Direction direction : oldUser.directions) {
    direction.save();
}
Another approach!
With this:
@OneToMany(cascade = CascadeType.ALL)
@JoinColumn(name = "SASHNBR", insertable = true, updatable = true)
public List<Direction> directions;
I've been able to just call oldUser.save() and get the child objects saved.
AFAIK, Play requires a call to save() on all the modified entities. So you probably need to iterate through the user's directions and save them as well:
for (Direction direction : user.getDirections()) {
direction.save();
}