What is the correct way to save relationships in Spring Data Neo4J?

Using Spring Boot and Neo4J, I've created two @NodeEntity classes: User and Right. In my model, when you create a relationship between a User and a Right, I call it a Privilege.
I cannot save the @RelationshipEntity, Privilege (from within either of the @NodeEntity classes or the RelationshipEntity itself).
Example Code
User.java (backed by interface UserRepository extends GraphRepository<User>)
@NodeEntity
public class User {
    @Autowired Neo4jTemplate template;

    @GraphId Long id;
    String fullName;
    @Indexed(unique=true) String email;

    @Fetch @RelatedTo(type="HAS_RIGHT")
    Set<Right> rights;

    public void addRight(Right r) {
        Privilege p = new Privilege(this, r);
        template.save(p); // This always throws a NullPointerException
    }

    /*** Getters and Setters ***/
}
Right.java (backed by interface RightRepository extends GraphRepository<Right>)
@NodeEntity
public class Right {
    @GraphId Long id;
    String name;

    /*** Getters and Setters ***/
}
Privilege.java (Not backed by a repository interface) - PROBLEM CLASS
@RelationshipEntity(type="HAS_RIGHT")
public class Privilege {
    @Autowired
    Neo4jTemplate template; // This is always null

    @GraphId Long id;
    @StartNode User user;
    @EndNode Right right;

    public Privilege() {}

    public Privilege(User user, Right right) {
        this.user = user;
        this.right = right;
    }

    public void save() {
        template.save(this); // Always throws a NullPointerException
    }
}
In my test case I can call (this works):
User user = userRepository.findByEmail("admin@noxgroup.co.za");
Right adminRight = rightRepository.findByName("ADMINISTRATOR");
Privilege adminPrivilege = new Privilege(user, adminRight);
template.save(adminPrivilege);
But I'd prefer to call (this does not work):
User user = userRepository.findByEmail("admin@noxgroup.co.za");
user.addRight(rightRepository.findByName("ADMINISTRATOR"));
But I also can't access template from within either of the NodeEntity classes.

You can create the relationship by using Neo4jTemplate or Neo4jOperations. Change to constructor injection:
@Autowired
public Constructor(Neo4jOperations operations) {
    this.neo4jOperations = operations;
}
Once you have both nodes from the repositories, do something like:
Relation relation = neo4jOperations.createRelationshipBetween(user, right, Relation.class, "RELATION_NAME", true);
neo4jOperations.save(relation);
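Putting the suggestion together with the entities from the question, a minimal sketch of a Spring-managed service that creates the HAS_RIGHT relationship (the PrivilegeService class and grant method names are assumed for illustration, not part of the question):

@Service
public class PrivilegeService {

    private final Neo4jOperations neo4jOperations;

    @Autowired
    public PrivilegeService(Neo4jOperations neo4jOperations) {
        this.neo4jOperations = neo4jOperations;
    }

    // Creates (user)-[:HAS_RIGHT]->(right) and maps it onto the Privilege @RelationshipEntity.
    public Privilege grant(User user, Right right) {
        Privilege privilege = neo4jOperations.createRelationshipBetween(user, right, Privilege.class, "HAS_RIGHT", false);
        return neo4jOperations.save(privilege);
    }
}

This keeps the Neo4jOperations dependency in a Spring-managed bean rather than inside the entities themselves.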

My mistake was that I was instantiating the class manually, so it was not Spring-managed. Adding the @Component annotation to the class and asking Spring for the instance resolved the problem.
@Component // This is the line that saved the day!
@RelationshipEntity(type="HAS_RIGHT")
public class Privilege {
    @Autowired
    Neo4jOperations operations;

    @GraphId Long id;
    @StartNode User user;
    @EndNode Right right;

    public Privilege() {}

    public void createRelationship(User user, Right right) {
        this.user = user;
        this.right = right;
        this.save();
    }

    public void save() {
        operations.save(this);
    }

    /*** Getters and Setters ***/
}
Then to instantiate it:
...
@Autowired ApplicationContext applicationContext;
...
Privilege privilege = applicationContext.getBean(Privilege.class);
privilege.createRelationship(user, ADMINISTRATOR);

Related

CascadeType Merge is ignored when Persist is set

Hi all,
I'm having a hard time solving the following Spring JPA problem.
Let's say I have the following simple data model (two entities with a unidirectional relationship between them):
@Accessors(chain = true) @Getter @Setter @NoArgsConstructor @AllArgsConstructor
@MappedSuperclass
public class AbstractEntity {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    @Version
    private Long version;
}

@Accessors(chain = true) @Getter @Setter @NoArgsConstructor @AllArgsConstructor
@Entity
public class Entity1 extends AbstractEntity {
    private String name;
}

@Accessors(chain = true) @Getter @Setter @NoArgsConstructor @AllArgsConstructor
@Entity
public class Entity2 extends AbstractEntity {
    private String name;

    @ManyToOne(cascade = {ALL})
    private Entity1 entity1;
}
and the following plumbing to store them:
public interface Entity1Dao extends JpaRepository<Entity1, Long>, JpaSpecificationExecutor<Entity1> {
    Entity1 findByName(String name);
}

public interface Entity2Dao extends JpaRepository<Entity2, Long>, JpaSpecificationExecutor<Entity2> {
    Entity2 findByName(String name);
}

@Service
public class StoreService {
    @Autowired
    Entity1Dao dao1;

    @Autowired
    Entity2Dao dao2;

    @Transactional
    public Entity1 saveEntity1(Entity1 e) {
        return dao1.save(e);
    }

    @Transactional
    public Entity2 saveEntity2(Entity2 e) {
        return dao2.save(e);
    }

    public Entity1 loadEntity1ByName(String name) {
        return dao1.findByName(name);
    }
}

@SpringBootApplication
public class JpaDemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(JpaDemoApplication.class, args);
    }
}
And the following test:
@SpringBootTest
@TestMethodOrder(value = MethodOrderer.OrderAnnotation.class)
class JpaDemoApplicationTests {
    @Autowired
    StoreService store;

    @Test
    @Order(1)
    void contextLoads() {
        assertThat(store).isNotNull();
    }

    @Test
    @Order(2)
    void insertEntity1() {
        store.saveEntity1(new Entity1("test entity1"));
        Entity1 saved = store.loadEntity1ByName("test entity1");
        assertThat(saved).isNotNull().hasNoNullFieldsOrProperties();
    }

    @Test
    @Order(4)
    void insertEntity2WithNewEntity1() {
        store.saveEntity2(new Entity2("with new entity1", new Entity1("new entity1")));
    }

    @Test
    @Order(5)
    void insertEntity2WithExistingEntity1() {
        store.saveEntity2(new Entity2("with saved entity1", store.loadEntity1ByName("test entity1")));
    }
}
The last test (i.e. insertEntity2WithExistingEntity1) fails with the following exception:
org.hibernate.PersistentObjectException: detached entity passed to persist: com.example.jpaDemo.Entity1
If I change the CascadeType in Entity2 to MERGE, that test passes but insertEntity2WithNewEntity1 fails with the following exception:
org.hibernate.TransientPropertyValueException: object references an unsaved transient instance - save the transient instance before flushing : com.example.jpaDemo.Entity2.entity1 -> com.example.jpaDemo.Entity1
I've tried multiple combinations of cascading types, but it seems that as soon as PERSIST is used, the last test fails (and ALL includes PERSIST).
I would have expected that if MERGE and PERSIST are both set, they would both be active, but from the tests it looks like MERGE is ignored when PERSIST is set.
Any clues, tips, or hints at what I'm doing wrong so that both tests pass?
EDIT
The tests are supposed to mimic the behaviour of a REST endpoint receiving and saving a JSON representation of an Entity2.
The JSON for the third test would be:
{ name: "with new entity1", entity1: { name: "new entity1"}}
The JSON for the fourth would be:
{ name: "with saved entity1", entity1: { id: 1, version: 0, name: "test entity1"}}
JPA should persist the entity1 in the third test because its id is null, but should merge the one in the fourth test because its id is not null.
However, I am unable to get both to work; it's either one or the other.
EDIT 2
I've modified Entity1 slightly to have a reference to the list of Entity2 instances associated with it, annotated with @OneToMany and the same cascading type as in Entity2, and the behaviour is the same.
When I set the cascading type to MERGE and only MERGE, I'm able to save a new entity that has a reference to an existing one, but I can't save a new entity with a reference to a new one.
When I set the cascading type to PERSIST (i.e. PERSIST on its own, PERSIST and MERGE, or ALL), it's the opposite: I can save a new entity with a reference to another new entity, but I can't save a new entity with a reference to an already existing one.
So it seems that when PERSIST is set, it overrides the behaviour of MERGE. That, to me, is a bug. Is it not?
I've uploaded the source to github in case you want to experiment or take a look at it yourself. https://github.com/willix71/persistVsMerge.git
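One pattern that addresses the REST scenario described in the EDIT is to re-attach the referenced entity in the service before saving, so that PERSIST cascading never sees a detached instance. A minimal sketch against the StoreService above (the getEntity1/setEntity1 accessors come from Lombok's @Getter/@Setter and are assumed here, and the exception is just illustrative):

@Transactional
public Entity2 saveEntity2(Entity2 e) {
    Entity1 ref = e.getEntity1();
    if (ref != null && ref.getId() != null) {
        // The incoming Entity1 carries an id, so it already exists: replace the detached
        // instance with a managed one before the cascade is applied.
        e.setEntity1(dao1.findById(ref.getId())
                .orElseThrow(() -> new IllegalArgumentException("Unknown Entity1 id " + ref.getId())));
    }
    return dao2.save(e);
}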
You need to add @Transactional on your last test. The entity loaded is detached because there is no surrounding transaction, so you can't persist it.
@Test
@Order(5)
@Transactional
void insertEntity2WithExistingEntity1() {
    store.saveEntity2(new Entity2("with saved entity1", store.loadEntity1ByName("test entity1")));
}
I'm not sure if this is still relevant, but the code below works as I would expect. Removing "cascade = CascadeType.PERSIST" makes the persist test fail with "object references an unsaved transient instance".
I also noticed in your GitHub repo that you are attempting to cascade both from parent to child and from child to parent. I think this is the root cause of your issues.
Entities:
@Entity
@Table(name = "users")
@Getter
@Setter
@NoArgsConstructor
public class User {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    UUID id;

    @ManyToOne(cascade = CascadeType.PERSIST)
    Address address;
}

@Entity
@Getter
@Setter
@NoArgsConstructor
public class Address {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    Long id;

    @OneToMany(mappedBy = "address")
    List<User> user;
}
Repositories:
public interface UserRepository extends JpaRepository<User, UUID> {
}

public interface AddressRepository extends JpaRepository<Address, Long> {
}
Tests:
@DataJpaTest
@Import(DataSourceConfig.class)
class UserRepositoryTest {
    @Autowired
    private UserRepository userRepository;

    @Autowired
    private AddressRepository addressRepository;

    @Test
    void testMerge() {
        var address = new Address();
        addressRepository.save(address);

        var user = new User();
        user.setAddress(address);
        userRepository.save(user);

        assertThat(userRepository.findAll()).contains(user);
        assertThat(addressRepository.findAll()).contains(address);
    }

    @Test
    void testPersist() {
        var address = new Address();
        var user = new User();
        user.setAddress(address);
        userRepository.save(user);

        assertThat(userRepository.findAll()).contains(user);
        assertThat(addressRepository.findAll()).contains(address);
    }
}

Populate @Transient field within Spring Data JPA Repository

Is it possible to populate a transient field in an entity class with the Spring Data REST API somehow (by a projection or something similar), so that the value appears in the JSON response? For example, I need to populate the info field with a value obtained from a second datasource (I have a Spring repository bean for that datasource and need to inject it into something like an interceptor that fills the field).
@Entity
public class User {
    @Id
    private Long id;

    @Transient
    private String info;

    // getters & setters
}

public interface UserRepository extends JpaRepository<User, Long> {
}
I found a solution using PostLoadEventListener. It is Hibernate-specific, so not exactly what I was looking for, but it works. I think there should be a more general, Spring-style solution.
@Component
public class UserInterceptor implements PostLoadEventListener {
    @Autowired
    private SecondRepository repo;

    @Autowired
    @Qualifier("prmiaryEntityManagerFactory")
    private EntityManagerFactory entityManagerFactory;

    @PostConstruct
    private void init() {
        HibernateEntityManagerFactory hibernateEntityManagerFactory = (HibernateEntityManagerFactory) this.entityManagerFactory;
        SessionFactoryImpl sessionFactoryImpl = (SessionFactoryImpl) hibernateEntityManagerFactory.getSessionFactory();
        EventListenerRegistry registry = sessionFactoryImpl.getServiceRegistry().getService(EventListenerRegistry.class);
        registry.appendListeners(EventType.POST_LOAD, this);
    }

    @Override
    public void onPostLoad(PostLoadEvent event) {
        final Object entity = event.getEntity();
        if (entity != null && entity instanceof User) {
            User user = (User) entity;
            // populate using another repo bean
            Info s = repo.findOne(user.getInfoId());
            user.setInfo(s.getName());
        }
    }
}
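Since the question mentions projections: with Spring Data REST on the classpath, an open projection can also pull the value from another bean via SpEL, which keeps the entity itself untouched. A sketch, assuming a hypothetical infoResolver bean that looks the value up in the second datasource:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.data.rest.core.config.Projection;

// Hypothetical open projection; "infoResolver" and resolveInfo() are assumed, not from the question.
@Projection(name = "withInfo", types = User.class)
public interface UserWithInfo {

    Long getId();

    // SpEL is evaluated per exported User ("target" is the entity instance).
    @Value("#{@infoResolver.resolveInfo(target)}")
    String getInfo();
}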

Spring Boot + JPA + Lazy bi-directional collection testing

I'm new to JPA and Spring Data and I've run into a problem with testing my application. I have two entities in my Spring Boot application:
@Getter
@Setter
@Entity
public class User extends SomeBasicEntity {
    @NotNull
    @Column(unique = true)
    private String login;

    @NotNull
    private String password;

    @OneToMany(fetch = FetchType.LAZY, mappedBy = "user", cascade = CascadeType.ALL)
    @JsonManagedReference
    private List<Role> roles;
}

@Getter
@Setter
@Entity
@Table(name = "ROLE")
public class Role extends SomeBasicEntity {
    @NotNull
    private String name;

    @ManyToOne
    @NotNull
    @JsonBackReference
    private User user;
}
I have implemented dedicated JPA repositories for them (as JpaRepository).
I have also implemented a transactional facade for managing them:
@Service
@Transactional
public class Facade {
    @Autowired
    private UserRepository userRepository;

    @Autowired
    private RoleRepository roleRepository;

    public User addNewUser(User user) {
        return userRepository.save(user);
    }

    public Role addNewRole(Role role, User user) {
        role.setUser(user);
        return roleRepository.save(role);
    }

    public User findUserByLogin(String login) {
        User user = userRepository.findByLogin(login);
        if (user != null) {
            return user;
        } else {
            throw new FacadeRuntimeException("User " + login + " does not exist");
        }
    }
}
Finally, I've created a RestController that uses the facade (I've hardcoded the login value just for test purposes):
@RestController
@RequestMapping(value = "/users")
public class UserController {
    @Autowired
    private Facade facade;

    @RequestMapping(value = "/login", method = RequestMethod.GET)
    public User getUserByLogin() {
        User user = facade.findUserByLogin("jahu");
        return user;
    }
}
And now when I run the application, use the facade methods to create a user and a role (that code is irrelevant, so I don't paste it here), and check the REST response at localhost/users/login, there is a properly serialized User object WITH the list of roles (with the single role I created, actually).
Question 1: Why are the roles present in the JSON object even though I didn't explicitly call getRoles on the user object? With the lazy fetch type I believe the collection should be retrieved only when the getter is called (maybe the JSON serializer is calling the method?).
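A small sketch of what presumably happens during rendering: Jackson serializes through the getters, and Spring Boot's open-entity-manager-in-view support keeps the persistence context open for the whole web request by default, so the lazy collection can still be initialized at that point. The illustration class below is made up for this explanation:

import com.fasterxml.jackson.databind.ObjectMapper;

public class SerializationIllustration {

    // Roughly what the message converter does with the controller's return value:
    // writeValueAsString() walks the bean's getters, so user.getRoles() is invoked
    // implicitly, triggering the lazy load while the session is still open.
    public static String toJson(User user) throws Exception {
        return new ObjectMapper().writeValueAsString(user);
    }
}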
Nevertheless, I have a second problem: I want to test my facade with JUnit (on an H2 database), but in the test method the roles on the User object are always null (even if I explicitly call the getter):
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = FacadeApplication.class)
@Transactional
public class FacadeTest {
    private static final String USER_LOGIN = "jahu";
    private static final String ROLE_NAME = "role name";

    @Autowired
    Facade sut;

    @Before
    public void setUp() throws Exception {
        User user = new User();
        user.setLogin(USER_LOGIN);
        user.setPassword("jahu");
        user = sut.addNewUser(user);

        Role role = new Role();
        role.setName(ROLE_NAME);
        sut.addNewRole(role, user);
    }

    @Test
    public void shouldFindUserRoles() {
        User user = sut.findUserByLogin(USER_LOGIN);
        assertThat(user).isNotNull();
        assertThat(user.getLogin()).isEqualTo(USER_LOGIN);
        List<Role> roles = user.getRoles(); // HERE I call the getter
        assertThat(roles).isNotNull().isNotEmpty();
    }
}
Question 2: Why can I not access the roles in the test context even though I am using the same method as in the RestController (where the roles are always present)? Without this I am not able to fully test my app, and it concerns me very much.
Note: entity names are just for description of the problem, so please don't suggest that I should remodel my entities in order to assign Role to User.
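One plausible explanation for Question 2 is that inside the single test transaction the User returned by findByLogin is the very instance created in setUp(), whose in-memory roles list was never set (the relationship was only set on the Role side). A minimal sketch of how the test could force a fresh load, assuming an injected EntityManager that is not part of the original test:

@PersistenceContext
private EntityManager em;

@Test
public void shouldFindUserRolesAfterClear() {
    // Push the inserts from setUp() to the database, then detach everything so the
    // next query rebuilds the User (and its lazy roles) from the database rows.
    em.flush();
    em.clear();

    User user = sut.findUserByLogin(USER_LOGIN);
    assertThat(user.getRoles()).isNotNull().isNotEmpty();
}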

Generic DAO design pattern with inheritance: is this a good design?

I just want some comments on what I've learned from dozens of samples about the Generic DAO design pattern. I added an inheritance hierarchy between POJO classes, DAO interfaces, and DAO implementations; please see the code below.
Legend:
DAOs (from parent to children)
DAO implementations (from parent to children)
POJO classes (from parent to children)
The Data Access Objects (Interfaces)
The GenericDAO interface
public interface GenericDAO<T> {
... some crud operations common to all objets
}
The PersonDAO interface
public interface PersonDAO<T extends Person> extends GenericDAO<T> {
... some operations unique to a person
}
The StudentDAO interface
public interface StudentDAO extends PersonDAO<Student> {
... some operations unique to a student
}
The Implementations
The GenericDAO Implementation
#Repository("genericDAO")
public class GenericDAOImpl<T extends Person> implements GenericDAO<T> {
private Class<T> type;
#SuppressWarnings("unchecked")
public GenericDAOImpl() {
this.type = (Class<T>) GenericTypeResolver.resolveTypeArgument(getClass(), GenericDAO.class);
System.out.println(type);
}
#Resource(name = "sessionFactory")
protected SessionFactory sessionFactory;
#Transactional
#Override
public Integer save(T entity) {
return (Integer) sessionFactory.getCurrentSession().save(entity);
}
#SuppressWarnings("unchecked")
#Transactional
#Override
public T get(Integer id) {
return (T) sessionFactory.getCurrentSession().get(type, id);
}
}
The PersonDAO implementation
#Repository ("personDAO")
public class PersonDAOImpl<T extends Person> extends GenericDAOImpl<T> implements PersonDAO<T> {
.. implemented methods for person
}
The StudentDAO implementation
#Repository("studentDAO")
public class StudentDAOImpl extends PersonDAOImpl<Student> implements StudentDAO {
.. implemented methods for student
}
The POJO Classes (Hibernate Annotated)
The Person Class (Parent Abstract Class)
@MappedSuperclass
public abstract class Person {
    @Id
    @GeneratedValue
    @Column(name = "id")
    private int id;

    @Column(name = "name")
    private String name;

    @Column(name = "age")
    private int age;

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    public int getAge() {
        return age;
    }

    public void setAge(int age) {
        this.age = age;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
The concrete class (Student)
@Entity
@Table(name = "STUDENT")
public class Student extends Person {
    @Column(name = "school")
    private String school;

    public Student() {
    }

    public Student(String school) {
        this.school = school;
    }

    public String getSchool() {
        return school;
    }

    public void setSchool(String school) {
        this.school = school;
    }
}
I've been thinking for days about how to construct a design pattern between the POJOs and DAO objects, until I came up with this design based on everything I've learned from different resources around the web. The idea is that the inheritance of the DAOs and DAO implementations mirrors the inheritance of the POJOs.
Is this good practice, reflecting the hierarchy of the POJOs in the DAOs?
Am I doing something wrong here with my design? I have a complete program that saves and retrieves my objects from the database without any problem.
I'm open to any suggestions or corrections. Thank you in advance!
Not a comment on the design, but... have you considered using Spring Data JPA? It allows you to write your repository interfaces, including custom finder methods, and Spring provides the implementation automatically.
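For example, the whole StudentDAO / StudentDAOImpl pair could collapse into something like the sketch below (the interface and finder names are illustrative, assuming Student stays a JPA entity with an int id):

import java.util.List;
import org.springframework.data.jpa.repository.JpaRepository;

// Spring Data JPA generates the implementation at runtime; no *Impl class is needed.
public interface StudentRepository extends JpaRepository<Student, Integer> {

    // Derived query: SELECT s FROM Student s WHERE s.school = ?1
    List<Student> findBySchool(String school);
}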

Spring Data Neo4J @Indexed(unique = true) not working

I'm new to Neo4J and have what is probably an easy question.
There are @NodeEntity classes in my application; a property (name) is annotated with @Indexed(unique = true) to achieve uniqueness, like I would in JPA with @Column(unique = true).
My problem is that when I persist an entity with a name that already exists in my graph, it works fine anyway.
But I expected some kind of exception here...?!
Here's an overview of my basic code:
@NodeEntity
public abstract class BaseEntity implements Identifiable
{
    @GraphId
    private Long entityId;
    ...
}

public class Role extends BaseEntity
{
    @Indexed(unique = true)
    private String name;
    ...
}
public interface RoleRepository extends GraphRepository<Role>
{
Role findByName(String name);
}
@Service
public class RoleServiceImpl extends BaseEntityServiceImpl<Role> implements RoleService
{
    private RoleRepository repository;

    @Override
    @Transactional
    public Role save(final Role entity) {
        return getRepository().save(entity);
    }
}
And this is my test:
@Test
public void testNameUniqueIndex() {
    final List<Role> roles = Lists.newLinkedList(service.findAll());
    final String existingName = roles.get(0).getName();

    Role newRole = new Role.Builder(existingName).build();
    newRole = service.save(newRole);
}
That's the point where I expect something to go wrong!
How can I ensure the uniqueness of a property without checking it myself?
P.S.: I'm using neo4j 1.8.M07, spring-data-neo4j 2.1.0.BUILD-SNAPSHOT and Spring 3.1.2.RELEASE.
I walked into the same trap... as long as you create new entities, you will not see the exception - the last save() action wins the battle.
Unfortunately, the DataIntegrityViolationException will only be raised when updating an existing entity!
A detailed description of that behaviour can be found here:
http://static.springsource.org/spring-data/data-graph/snapshot-site/reference/html/#d5e1035
If you are using SDN 3.2.0+, use the failOnDuplicate attribute:
public class Role extends BaseEntity
{
    @Indexed(unique = true, failOnDuplicate = true)
    private String name;
    ...
}
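With that attribute in place, the original test could be turned into an expectation that the duplicate save fails. A rough sketch, assuming (as the first answer suggests) that the violation surfaces as a DataIntegrityViolationException:

@Test(expected = DataIntegrityViolationException.class)
public void testNameUniqueIndexFails() {
    final List<Role> roles = Lists.newLinkedList(service.findAll());
    final String existingName = roles.get(0).getName();

    // Saving a second Role with an already-indexed name should now be rejected
    // instead of silently overwriting the existing node.
    service.save(new Role.Builder(existingName).build());
}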
