Sync a @ManyToOne relationship after saveAndFlush with Spring Data JPA - spring-boot

I have this entity in a Spring Boot project. Having insertable/updatable set to false is useful in many cases, but it causes problems when saving the entity. Here's the code.
@Entity
@Table(name = "BOOK", schema = "DIST")
public class Book {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "ID", nullable = false)
    private Integer id;

    @ManyToOne
    @JoinColumn(name = "AUTHOR_ID", insertable = false, updatable = false)
    private Author author;

    @Column(name = "AUTHOR_ID")
    private Integer authorId;

    // getters and setters omitted
}
and this method to save a new book:
@Transactional
public void test() {
    Book book = new Book();
    book.setAuthorId(1);
    book = bookRepository.saveAndFlush(book);
    book.getAuthor().getName(); // author is null
}
The getAuthor() relationship is always null after saving. I'm wondering if there's a way to fetch it; in other words, I'd like to be able to force the @ManyToOne relationship to be populated, but I can't find a clean way to do it.
This approach doesn't work.
@Transactional
public void test() {
    Book book = new Book();
    book.setAuthorId(1);
    book = bookRepository.saveAndFlush(book);
    book = bookRepository.findById(book.getId()).orElseThrow();
    book.getAuthor().getName(); // author is still null
}
This one seems to be an option, but I don't like it.
@Transactional
public void test() {
    Book book = new Book();
    book.setAuthorId(1);
    book = bookRepository.saveAndFlush(book);
    entityManager.clear();
    book = bookRepository.findById(book.getId()).orElseThrow();
    book.getAuthor().getName(); // author is NOT null
}
Any hints?

I happen to be facing, and have been investigating, this same issue.
Our data model/use case also requires us to have the foreign key field explicitly defined as a number, as well as the referenced object itself, so we can easily access its properties.
It seems JPA expects us to physically populate the nested object before calling save. Most of the examples on the web only have the many-to-one field, not a separate numeric ID column field.
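For reference, a minimal sketch of what populating the nested object up front would look like, assuming an Author entity, an injected EntityManager, and a setAuthor setter on Book (all inferred, not shown in the question):
@Transactional
public void test() {
    Book book = new Book();
    book.setAuthorId(1);
    // populate the association itself with a lazy reference so the in-memory
    // entity matches what the row will hold after the insert
    book.setAuthor(entityManager.getReference(Author.class, 1));
    book = bookRepository.saveAndFlush(book);
    book.getAuthor().getName(); // author is set without re-reading the entity
}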
Adding the cache isolation annotation to Book:
@Entity
@Table(name = "BOOK", schema = "DIST")
@Cache(isolation = CacheIsolationType.ISOLATED)
public class Book {
appears to be a solution. In my opinion this is less clunky than explicitly calling EntityManager.clear(), but it still requires an explicit findById call after the save. In my scenario I already had the isolation defined to solve other issues, so adding the extra get is not a big deal. The code example you gave should work after adding the cache isolation annotation:
@Transactional
public void test() {
    Book book = new Book();
    book.setAuthorId(1);
    book = bookRepository.saveAndFlush(book);
    book = bookRepository.findById(book.getId()).orElseThrow();
    book.getAuthor().getName();
}
I haven't yet been able to find a better solution. I'll keep an eye on this question to see if anyone provides one.

You can try to refresh the object via the EntityManager.
example:
import javax.persistence.EntityManager;

@Transactional
public void test() {
    Book book = new Book();
    book.setAuthorId(1);
    book = bookRepository.saveAndFlush(book);
    // this will re-read the state of the given object
    // from the underlying database
    entityManager.refresh(book);
    book.getAuthor().getName(); // author is NOT null
}


Axon - State Stored Aggregates exception in test

Environment setup: Axon 4.4, H2 database (we are doing component testing as part of the CI).
The code looks something like this:
@Aggregate(repository = "ARepository")
@Entity
@DynamicUpdate
@Table(name = "A")
@Getter
@Setter
@NoArgsConstructor
@EqualsAndHashCode(onlyExplicitlyIncluded = true, callSuper = false)
@Log4j2
class A implements Serializable {

    @CommandHandler
    public void handle(final Command1 c1) {
        apply(EventBuilder.buildEvent(c1));
    }

    @EventSourcingHandler
    public void on(final Event1 e1) {
        // some updates to the model
        apply(new Event2());
    }

    @Id
    @AggregateIdentifier
    @EntityId
    @Column(name = "id", length = 40, nullable = false)
    private String id;

    @OneToMany(
            cascade = CascadeType.ALL,
            fetch = FetchType.LAZY,
            orphanRemoval = true,
            targetEntity = B.class,
            mappedBy = "id")
    @AggregateMember(eventForwardingMode = ForwardMatchingInstances.class)
    @JsonIgnoreProperties("id")
    private List<B> transactions = new ArrayList<>();
}
@Entity
@Table(name = "B")
@DynamicUpdate
@Getter
@Setter
@NoArgsConstructor
@EqualsAndHashCode(onlyExplicitlyIncluded = true, callSuper = false)
@Log4j2
class B implements Serializable {

    @Id
    @EntityId
    @Column(name = "id", nullable = false)
    @AggregateIdentifier
    private String id;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumns({@JoinColumn(name = "id", referencedColumnName = "id")})
    @JsonIgnoreProperties("transactions")
    private A a;

    @EventSourcingHandler
    public void on(final Event2 e2) {
        // some updates to the model
    }
}
I'm using a state-stored aggregate, but I keep getting this error randomly during Spring tests with embedded H2. The same issue does not occur with a PostgreSQL DB in non-embedded mode, but then we are not able to run that in the pipeline.
Error: "java.lang.IllegalStateException: The aggregate identifier has not been set. It must be set at the latest when applying the creation event"
I stepped through AnnotatedAggregate:
protected <P> EventMessage<P> createMessage(P payload, MetaData metaData) {
    if (lastKnownSequence != null) {
        String type = inspector.declaredType(rootType())
                               .orElse(rootType().getSimpleName());
        long seq = lastKnownSequence + 1;
        String id = identifierAsString();
        if (id == null) {
            Assert.state(seq == 0,
                    () -> "The aggregate identifier has not been set. It must be set at the latest when applying the creation event");
            return new LazyIdentifierDomainEventMessage<>(type, seq, payload, metaData);
        }
        return new GenericDomainEventMessage<>(type, identifierAsString(), seq, payload, metaData);
    }
    return new GenericEventMessage<>(payload, metaData);
}
The sequence for this gets set to 2, and hence it throws the exception instead of lazily initializing the aggregate.
What's the fix for this? Am I missing some configuration, or does this need a fix in the Axon code?
I believe the exception you are getting points to what you are missing, @Rohitdev. When an aggregate is being created in Axon, it at the very least assumes you will set the aggregate identifier, i.e. that you will fill in the @AggregateIdentifier annotated field present in your aggregate.
This is a mandatory validation: without an aggregate identifier, you are essentially missing the external reference to the aggregate. As a result, you would simply not be able to dispatch subsequent commands to this aggregate, as there would be no means to route them.
From the code snippets you've shared, there is nothing which indicates that the @AggregateIdentifier annotated String id fields in aggregate A or B are ever set. Not doing this, in combination with using Axon's test fixtures, leads to the exception you are getting.
When using a state-stored aggregate, know that you change the state of the aggregate inside the command handler. This means that, next to invoking AggregateLifecycle#apply(Object) in your command handler, you set the id field to the desired aggregate identifier.
There are two other main pointers to share based on the question.
There is no command handler inside your aggregate which creates the aggregate itself. You should either have a @CommandHandler annotated constructor in your aggregates, or use the @CreationPolicy annotation to define a regular method as the creation point of the aggregate (as mentioned here in the reference guide); a sketch follows after these pointers.
Lastly, your sample still uses @EventSourcingHandler annotated functions, which should be used when you have an event-sourced aggregate. It sounds like you have made a conscious decision against event sourcing, so I wouldn't use those annotations in your model either. Right now it will likely only confuse developers that a mix of state-stored and event-sourced aggregate logic is being used.
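As a rough illustration of the first pointer (CreateACommand and its getId() accessor are hypothetical names, not taken from the question), a creation command handler for a state-stored aggregate could look like this:
// assumes a static import of org.axonframework.modelling.command.AggregateLifecycle.apply
@Aggregate(repository = "ARepository")
@Entity
@Table(name = "A")
class A implements Serializable {

    @Id
    @AggregateIdentifier
    @Column(name = "id", length = 40, nullable = false)
    private String id;

    protected A() {
        // required by JPA (and used by Axon when loading the stored state)
    }

    @CommandHandler
    public A(final CreateACommand command) { // hypothetical creation command
        // set the aggregate identifier at the latest when the creation event is applied
        this.id = command.getId();
        apply(EventBuilder.buildEvent(command));
    }
}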
Finally, after debugging, we found out that in class B we were not setting the id in the handler for the update event:
@EventSourcingHandler
public void on(final Event2 e2) {
    this.id = e2.getId();
}
Once we did that the issue went away.

Why do I need @Transactional for saving a OneToOne mapped entity?

I have a simple, straightforward demo application with Spring Boot, Spring Data JPA and an H2 database.
I have built two entities which are mapped by a OneToOne relationship.
Post.java
@Entity
public class Post {

    @Id
    @GeneratedValue
    private Long id;

    private String title;

    @OneToOne(mappedBy = "post", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    private PostDetail postDetail;
}
PostDetail.java
@Entity
public class PostDetail {

    @Id
    @GeneratedValue
    private Long id;

    private String message;

    @OneToOne(fetch = FetchType.LAZY)
    @MapsId
    @JoinColumn(name = "id")
    private Post post;
}
I try to create and save a new Post. Then I create a new PostDetail, set the previously generated Post on it, and save it. In the first controller sample I don't have a @Transactional annotation; in the second sample I do annotate the method with @Transactional.
@RestController
public class TestController {

    @Autowired
    PostRepository postRepository;

    @Autowired
    PostDetailRepository postDetailRepository;

    @GetMapping("/test1")
    public String test1() {
        Post post = new Post();
        post.setId(2L);
        post.setTitle("Post 1");
        postRepository.save(post);

        PostDetail detail = new PostDetail();
        detail.setMessage("Detail 1");
        detail.setPost(post);
        postDetailRepository.save(detail);
        return "";
    }

    @Transactional
    @GetMapping("/test2")
    public String test2() {
        Post post = new Post();
        post.setId(2L);
        post.setTitle("Post 1");
        postRepository.save(post);

        PostDetail detail = new PostDetail();
        detail.setMessage("Detail 1");
        detail.setPost(post);
        postDetailRepository.save(detail);
        return "";
    }
}
Why do I get an org.hibernate.PersistentObjectException: detached entity passed to persist: com.example.demo.jpa.model.Post exception in the first sample but not in the second?
Can anyone explain why this happens?
You use a bidirectional @OneToOne association. As the Hibernate documentation states:
Whenever a bidirectional association is formed, the application developer must make sure both sides are in-sync at all times.
So, you should rewrite your test method in this way:
@GetMapping("/test1")
public String test1() {
    Post post = new Post();
    post.setId(2L);
    post.setTitle("Post 1");

    PostDetail detail = new PostDetail();
    detail.setMessage("Detail 1");

    // synchronization of both sides of the @OneToOne association
    detail.setPost(post);
    post.setPostDetail(detail);

    // thanks to CascadeType.ALL on Post.postDetail,
    // postDetail will be saved too
    postRepository.save(post);
    return "";
}
You shouldn't be saving those two entities separately; you should set the PostDetail inside the Post object and save only the Post. Hibernate will take care of saving the aggregated PostDetail.
That is why you are getting the PersistentObjectException, which you are able to work around by keeping everything inside the same transaction.
We do not always need a bidirectional mapping when we are mapping two entities; you can simply use a unidirectional one most of the time:
Post post = new Post();
post.setId(2L);
post.setTitle("Post 1");

PostDetail detail = new PostDetail();
detail.setMessage("Detail 1");
detail.setPost(post);

postRepository.save(post);
As you have CascadeType.ALL, Hibernate saves the Post first and then the PostDetail. As per the rules of transactional behavior, either the whole operation is done or none of it is done; we cannot have a situation where the Post is saved but the PostDetail is not. To avoid such ambiguity it is important to have the @Transactional annotation, at method level or maybe class level, as per your requirement.
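A minimal sketch of that advice, assuming the save logic is moved into a dedicated service (PostService and createPost are illustrative names, and Post is assumed to expose a setPostDetail setter):
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class PostService { // illustrative name

    private final PostRepository postRepository;

    public PostService(PostRepository postRepository) {
        this.postRepository = postRepository;
    }

    @Transactional // one transaction for the whole save, so cascading works on managed entities
    public Post createPost(String title, String message) {
        Post post = new Post();
        post.setTitle(title);

        PostDetail detail = new PostDetail();
        detail.setMessage(message);

        // keep both sides of the association in sync
        detail.setPost(post);
        post.setPostDetail(detail);

        // CascadeType.ALL on Post.postDetail persists the detail as well
        return postRepository.save(post);
    }
}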

Spring Boot many-to-many POST method not updating data

My User class looks like this:
@Data
@Entity
public class User {

    @Id
    Long userID;

    @ManyToMany(mappedBy = "admins")
    private List<ClassRoom> classRooms = new ArrayList<>();
}
And my ClassRoom class looks like this:
@Data
@Entity
public class ClassRoom {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    Long classRoomID;

    @ManyToMany
    @JoinTable(name = "classroom_user",
            joinColumns = @JoinColumn(name = "classroom_id"),
            inverseJoinColumns = @JoinColumn(name = "user_id"))
    private List<User> admins = new ArrayList<>();
}
And in my UserController class, I have:
@PostMapping("user/{id}/c")
User addClassRoom(@PathVariable Long id, @RequestBody ClassRoom newClassRoom) {
    logger.debug(repository.findById(id));
    return repository.findById(id)
            .map(user -> {
                user.getClassRooms().add(newClassRoom);
                user.setClassRooms(user.getClassRooms());
                return repository.save(user);
            })
            .orElseGet(() -> null);
}
I POST an empty JSON body ({}) and I see no change in my users. The ClassRoom (even an empty one) doesn't get added to the User.
What is the problem here? How can I resolve this?
user.getClassRooms().add(newClassRoom); suffices; user.setClassRooms(user.getClassRooms()); is not required.
You will have to perform a cascading save operation. List all the cascade types explicitly and don't use mappedBy; use the join-column annotations instead.
Can you paste the logs, please? Is Hibernate doing any insert into your table? Has the database schema been created in the DB correctly? One thing I recommend you do is add a custom table name on top of your User class, using an annotation like @Table(name = "users"). In most SQL dialects user is a reserved keyword, hence it is recommended to always map the User class a bit differently, so that Hibernate won't have any problems creating a table for that entity.
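Concretely, that suggestion amounts to something like this (just the question's entity with the extra annotation):
@Data
@Entity
@Table(name = "users") // "user" is a reserved word in many SQL dialects
public class User {

    @Id
    Long userID;

    @ManyToMany(mappedBy = "admins")
    private List<ClassRoom> classRooms = new ArrayList<>();
}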
IMO you must find the ClassRoom by its id from the repository; if it's new, you must create a new entity and save it first. Then assign it to the user and save the user.
The object you receive from the POST method was not created by the entity manager. After calling user.getClassRooms().add(newClassRoom); we must call userRepository.save(user); (a combined sketch follows below).
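Putting these suggestions together, a rough sketch of the controller method could look like this (classRoomRepository is an assumed repository for ClassRoom, not shown in the question):
@PostMapping("user/{id}/c")
User addClassRoom(@PathVariable Long id, @RequestBody ClassRoom newClassRoom) {
    return repository.findById(id)
            .map(user -> {
                // save the new classroom first so it becomes a managed entity
                ClassRoom classRoom = classRoomRepository.save(newClassRoom);
                // keep both sides in sync; ClassRoom.admins is the owning side (it has the @JoinTable)
                classRoom.getAdmins().add(user);
                user.getClassRooms().add(classRoom);
                classRoomRepository.save(classRoom);
                return repository.save(user);
            })
            .orElse(null);
}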

Cannot Query Neo4j Repositories

Hey everyone, I am fairly new to Neo4j and am having an issue querying my repositories.
My repository is the following:
public interface NodeOneRepository extends GraphRepository<NodeOne> {
    List<NodeOne> findByNodeTwoNodeThreeAndActiveTrue(NodeThree nodeThree);
}
My entities are the following:
@NodeEntity(label = "NodeOne")
public class NodeOne {

    @GraphId
    private Long id;

    private Boolean active = TRUE;

    @Relationship(type = "IS_ON")
    private NodeTwo nodeTwo;
}

@NodeEntity(label = "NodeTwo")
public class NodeTwo {

    @GraphId
    private Long id;

    @Relationship(type = "CONTAINS", direction = "INCOMING")
    private NodeThree nodeThree;

    @Relationship(type = "IS_ON", direction = "INCOMING")
    private List<NodeOne> nodeOnes = new ArrayList<>();
}

@NodeEntity(label = "NodeThree")
public class NodeThree {

    @GraphId
    private Long id;

    @Relationship(type = "CONTAINS")
    private List<NodeTwo> nodeTwos = new ArrayList<>();
}
Getters & Setters omitted. When I call the method I get an empty list. Is there something I am doing incorrectly?
You didn't describe exactly what you wanted to achieve, but I can see two problems:
Problem 1:
The current version of Spring Data Neo4j and OGM only allow nested finders, that is, finders that specify a relationship property, to one depth.
Supported
findByNodeTwoSomePropertyAndActiveTrue(String relatedNodePropertyValue)
Not Supported
findByNodeTwoNodeThree //Nesting relationships in finders is not supported
Problem 2:
Derived finders allow matching on properties and nested properties, not on a whole instance of a class.
You can probably achieve what you would like using a custom query.
@Query("custom query here")
List<NodeOne> findByNodeTwoNodeThreeAndActiveTrue(NodeThree nodeThree);
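As an illustration only (untested, assuming SDN 4.x positional parameters and passing the graph id of NodeThree as a Long; findActiveByNodeThreeId is a made-up name), the custom query could look roughly like this:
public interface NodeOneRepository extends GraphRepository<NodeOne> {

    // traverse NodeThree -CONTAINS-> NodeTwo <-IS_ON- NodeOne and filter on active
    @Query("MATCH (n3:NodeThree)-[:CONTAINS]->(n2:NodeTwo)<-[:IS_ON]-(n1:NodeOne) "
            + "WHERE id(n3) = {0} AND n1.active = true "
            + "RETURN n1")
    List<NodeOne> findActiveByNodeThreeId(Long nodeThreeId);
}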
If you need help to write a custom query, you can post another question or join the neo4j-users public slack channel.

Spring JPA one-to-many denormalized count field

I have two entities, Books and Comments, in a one-to-many relationship (one book can have many comments). I want to be able to list books together with the number of comments on each book. I want it denormalized, meaning the Books entity will have a counter holding the number of comments for that book, updated every time a comment is entered (just playing with the concept; no need to discuss the need for denormalizing here).
I think (correct me if I am wrong) this could easily be done with a trigger in the database (whenever a new comment is created, update the counter in the books table for the corresponding bookId), but for the sake of learning I want to do it through JPA, if that makes sense.
What I have so far (some annotations omitted, just the general info):
Books entity:
@Entity
public class Books {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String title;
    private String author;
    private Long numComments;

    // getters and setters...
}
Comments entity:
@Entity
public class Comments {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String comment;
    private Long authorId;
    private Long bookId;

    // getters and setters...
}
Books repository (I added a query here to perform the update):
/**
 * Spring Data JPA repository for the Books entity.
 */
public interface BooksRepository extends JpaRepository<Books, Long> {

    @Modifying
    @Query("UPDATE Books v SET v.numComments = v.numComments + 1 WHERE v.id = :bookId")
    int updateCounter(@Param("bookId") Long bookId);
}
And now the question: what next? I think I can update the Books entity by annotating a method of the Comments entity with @PostPersist, but I have been unsuccessful so far. I can imagine something like this:
@PostPersist // this method lives in the Comments entity
protected void updateBooks() {
    // likely some call to the repository here that updates the count
    // in Books with the info we have from the current entity
}
Any idea how to do this? Are there best practices for this kind of denormalization in JPA? Is it better to use database triggers?
Spring does not manage your entity classes. Your idea is possible, but you must inject BooksRepository into the entity class; otherwise you get a NullPointerException, because Spring does not manage entity classes and your BooksRepository is never initialized. Also read the post "Bean injection inside a JPA @Entity" and annotate the entity class with @Configurable.
Then try this:
@PostPersist
protected void updateBooks() {
    // inside the Comments entity; booksRepository has to be injected (e.g. via @Configurable)
    int updatedRows = booksRepository.updateCounter(this.getBookId());
    System.out.println(updatedRows); // number of rows updated, printed to the console
}
But a better approach is in the service class: call updateCounter after inserting the comment.
For example, in your CommentService, when you insert a comment, call your updateCounter afterwards:
if (comment.getBookId() != null) { // simple check
    commentRepository.save(comment);
    booksRepository.updateCounter(comment.getBookId());
}
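A fuller sketch of that service-level approach, assuming constructor-injected repositories (CommentService, addComment and CommentsRepository are illustrative names):
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class CommentService { // illustrative name

    private final CommentsRepository commentRepository; // assumed repository for the Comments entity
    private final BooksRepository booksRepository;

    public CommentService(CommentsRepository commentRepository, BooksRepository booksRepository) {
        this.commentRepository = commentRepository;
        this.booksRepository = booksRepository;
    }

    @Transactional // keeps the insert and the counter update atomic; @Modifying queries need a transaction anyway
    public Comments addComment(Comments comment) {
        Comments saved = commentRepository.save(comment);
        if (saved.getBookId() != null) { // simple check
            booksRepository.updateCounter(saved.getBookId());
        }
        return saved;
    }
}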
