I'm migrating a legacy application from Spring Core 4 to Spring Boot 2.5.2.
The application is using spring-data-rest (SDR) alongside spring-data-mongodb to handle our entities.
The legacy code was overriding SDR configuration by extending the RepositoryRestMvcConfiguration and overriding the bean definition for persistentEntityJackson2Module to remove serializerModifier and deserializerModifier.
@EnableWebMvc
@EnableSpringDataWebSupport
@Configuration
class RepositoryConfiguration extends RepositoryRestMvcConfiguration {
    ...
    ...
    @Bean
    @Override
    protected Module persistentEntityJackson2Module() {
        // Remove the existing SerializerModifier/DeserializerModifier because Spring Data REST
        // expects linked resources to be in href form. Our platform is not tailored for that yet.
        return ConverterHelper.configureSimpleModule((SimpleModule) super.persistentEntityJackson2Module())
                .setDeserializerModifier(null)
                .setSerializerModifier(null);
    }
}
This was done to avoid having to process a DBRef as an href link when posting entities: we pass the plain POJO instead of the href and persist it manually before the owning entity.
Following the migration there is no way to apply the same override, but to avoid altering all our creation processes we would like to keep passing the POJO even for DBRef properties.
Here is an example of what was working before.
We have the entity we want to persist:
public class EntityWithDbRefRelation {
    ....
    @Valid
    @CreateOnTheFly // Custom annotation to create the dbRefEntity before persisting the current entity
    @DBRef
    private MyDbRefEntity myDbRefEntity;
}
the DBRef entity:
public class MyDbRefEntity {
...
private String name;
}
and the JSON POST request we are making:
POST base-api/entityWithDbRefRelations
{
...
"myDbRefEntity": {
"name": "My own dbRef entity"
}
}
In our database this request creates the myDbRefEntity and then creates the target entityWithDbRefRelation with a DBRef linked to the other entity.
Following the migration, the DBRef is never created: when deserializing the JSON into the persisted entity, the myDbRefEntity property is ignored because SDR expects an href instead of a complex object.
I see 3 solutions:
1. Modify all our processes to first create the DBRef through one request, then create our entity with the link to the DBRef. Very costly, as we have a lot of services creating entities through this backend, but compliant with SDR.
2. Define our own REST MVC controllers to do the operations, bypassing the SDR mapping mechanism.
3. Add AOP into the RepositoryRestMvcConfiguration around persistentEntityJackson2Module to set the serializerModifier and deserializerModifier to null (a rough sketch of this idea follows below). I would really prefer to avoid this solution, as Spring Boot must have removed this way of configuring it on purpose, and it could break when migrating to a newer version.
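For what it's worth, here is a rough, untested sketch of what option 3 could look like without AOP, as a BeanPostProcessor that strips the modifiers after SDR builds the module. It assumes the module is still exposed as a bean named persistentEntityJackson2Module and still extends SimpleModule, which may not hold in newer versions:

import com.fasterxml.jackson.databind.module.SimpleModule;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class PersistentEntityModuleTweak {

    // Sketch only: intercept the SDR Jackson module after creation and drop its
    // Ser/DeserializerModifier, mimicking the legacy override. The bean name and
    // type are assumptions based on the pre-migration behaviour described above.
    @Bean
    static BeanPostProcessor persistentEntityJackson2ModulePostProcessor() {
        return new BeanPostProcessor() {
            @Override
            public Object postProcessAfterInitialization(Object bean, String beanName) {
                if ("persistentEntityJackson2Module".equals(beanName) && bean instanceof SimpleModule) {
                    ((SimpleModule) bean).setSerializerModifier(null);
                    ((SimpleModule) bean).setDeserializerModifier(null);
                }
                return bean;
            }
        };
    }
}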
Does anyone know a way to keep treating the property as a complex object instead of an href link, other than my 3 points above?
Tell me if you need more information and thanks in advance for your help!
I need to expose reactive endpoints, i.e. Flux/Mono, in Spring + Java, but I don't want to use an Entity class because the definition of the Entity class may keep changing and we may dynamically need to register new Entity classes.
Is there any way to implement Spring reactive endpoints without an Entity class?
I am using Spring + Java and MongoDB.
The Spring framework relies on entities, whether reactive or not. This basically doesn't affect you, because you need knowledge of the document to reference a key value anyway. What is not in the entity will not be set on it; if the element does not exist in the document but the field does in the entity, it is set to null.
If you use Kotlin, I recommend using nullable types (the "?" suffix) for values that are not guaranteed to be non-null.
Side note: how would you do anything useful if you don't know what you're storing?
I have a solution as follows:
You can use ReactiveMongoTemplate. For example:
@Autowired
private ReactiveMongoTemplate mongoTemplate;

public Flux<Document> findAll() {
    return mongoTemplate.findAll(Document.class, "employee");
}

public Mono<Document> save(Document data) {
    return mongoTemplate.save(data, "employee");
}
So instead of passing any Entity class, you can use Document.class (a minimal controller sketch follows below).
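For illustration, here is a minimal sketch of how those methods could be exposed as reactive endpoints in a WebFlux controller; the controller name and the /employees path are made up, and the "employee" collection comes from the snippet above:

import org.bson.Document;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

@RestController
@RequestMapping("/employees")
public class EmployeeController {

    private final ReactiveMongoTemplate mongoTemplate;

    public EmployeeController(ReactiveMongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    // Stream all documents from the "employee" collection without a dedicated entity class
    @GetMapping
    public Flux<Document> findAll() {
        return mongoTemplate.findAll(Document.class, "employee");
    }

    // Persist an arbitrary JSON payload as a raw BSON Document
    // (Jackson binds the JSON body into the Map-based Document)
    @PostMapping
    public Mono<Document> save(@RequestBody Document data) {
        return mongoTemplate.save(data, "employee");
    }
}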
I started a project on Spring Boot using a REST web service. When I shared it with my team they made some comments:
GET methods need to be grouped, e.g. get/users & get/users/{id} will become get/users/{id}
remove the PUT method and just use POST, e.g. post/users/0 to add | post/users/{id} to update
make a helper class for JdbcTemplate and call it in the repository classes to centralize the code
Please help me sort this out, I'm quite confused. Thank you!
GET methods need to be grouped, e.g. get/users & get/users/{id} will become get/users/{id}
I do not agree with this. GET /users should return List<User> and GET /users/{id} should return the User that matches {id}.
remove the PUT method and just use POST, e.g. post/users/0 to add | post/users/{id} to update
POST should be used when you create a new resource. POST is not idempotent; each time you call POST, a new resource is created.
e.g. calling POST /users will create a new User every time.
PUT, on the other hand, works like an upsert: create if the resource is not present, update/replace if it is. PUT is idempotent and doesn't change the resource's state even if it's called multiple times. A small sketch of both mappings follows below.
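As a minimal sketch of that distinction, assuming a plain Spring MVC controller (the User class, UserDao, and its saveUser upsert method are illustrative, not part of the question):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/users")
public class UserController {

    @Autowired
    private UserDao userDao; // hypothetical DAO, in the spirit of the DAOs shown below

    // POST creates a new resource on every call (not idempotent)
    @PostMapping
    public ResponseEntity<User> create(@RequestBody User user) {
        userDao.addUser(user);
        return ResponseEntity.status(HttpStatus.CREATED).body(user);
    }

    // PUT replaces (or creates) the resource identified by {id}; repeating the
    // same call leaves the resource in the same state (idempotent)
    @PutMapping("/{id}")
    public ResponseEntity<User> upsert(@PathVariable String id, @RequestBody User user) {
        userDao.saveUser(id, user); // hypothetical upsert method
        return ResponseEntity.ok(user);
    }
}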
make a helper class for JdbcTemplate and call it in the repository classes to centralize the code
Helper classes help to separate concerns and achieve single responsibility.
However, JdbcTemplate is already a ready-to-use abstraction over JDBC, so I don't see much point in creating a helper for it. You can create a Data Access Object (DAO) or Repository which has-a JdbcTemplate, like the two DAOs shown below.
public class UserDao {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    public User findUserById(String id) {}

    public void addUser(User user) {}
}

// -------

public class BooksDao {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    public List<Book> getAllBooksByType(String type) {}

    public Book getBookByName(String name) {}
}
Now your DAO objects can be called from a Controller, or, if you need to modify data before/after the DB operation, it is best to have a Service layer between the Controller and the DAO (a small sketch follows below).
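For illustration, a minimal sketch of such a service layer, assuming the UserDao above (the before/after hooks are just placeholders):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class UserService {

    @Autowired
    private UserDao userDao;

    // A place for logic that runs before/after the DB operation
    public User getUser(String id) {
        User user = userDao.findUserById(id);
        // e.g. post-process or enrich the user here before returning it
        return user;
    }

    public void registerUser(User user) {
        // e.g. validate or normalize the user here before persisting it
        userDao.addUser(user);
    }
}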
Don't bother too much about recommendations or rules. Stick to the basic OOP concepts; those are really easy to understand and implement.
Always:
Encapsulate data variables and methods working on those variables together
Make sure your class has a Single Responsibility
Write smaller and testable methods (if you can't write tests to cover your method, then something is wrong with your method)
Always keep the concerns separate
Make sure your objects are loosely coupled (you are already using Spring, so just use Spring's autowiring).
I have a User class and I want to authorize access such that only a user gets to see what he is entitled to.
This was easily achievable using Spring Security in conjunction with Spring Data REST, where in the JPA repository I did the following:
public interface UserRepository extends JpaRepository<User, Integer> {

    @PreAuthorize("hasRole('LOGGED_IN') and principal.user.id == #id")
    User findOne(@Param("id") Integer id);
}
In this way, when a user visits Spring Data REST scaffolded URLs like
/users/{id}
/users/{id}/userPosts
only the user logged in with that {id} gets to see them and everyone else gets a 401, as I wanted.
My problem is that I have one projection which is a public view of each user. I am creating it using Spring Data REST projections as below, and I want it to be accessible for every {id}:
@Projection(name = "details", types = User.class)
public interface UserDetailsProjection {
    ..
}
So /users/{id1}?projection=details as well as /users/{id2}?projection=details should give 200 OK and show data even though the user is logged in as {id1}.
I began implementing this by marking the projection with @PreAuthorize("permitAll"), but that won't work since the repository has the stricter security check. Can we have this functionality where, for a projection, we can relax security?
I am using the latest Spring Data REST and Spring Security distributions.
Seems reasonable to add a custom controller for this use-case.
Please also consider:
Evaluate access in projections using @Value annotations
Add another entity for the same database data but with different field set for read-only operations, e.g. using inheritance (be careful with caching, etc.) - depends on your data storage type
Modify model to split User entity into two different entities (profile, account) since they seem to have different access and possibly even operations
You can also add a ResourceProcessor<UserSummaryProjection> to evaluate access programmatically and replace resource content (projection) with a DTO
Example of evaluating access in projections with @Value annotations:
@Projection(types = User.class, name = "summary")
public interface UserSummaryProjection {

    @Value("#{@userSecurity.canReadEmail(target) ? target.email : null}")
    String getEmail();
}
Adding Spring Security code in the data access layer is not a good idea. I would suggest you add the @PreAuthorize annotation to the controller/service method. Since you have a query parameter, ?projection=details, you can have a separate controller/service method for the details projection.
Add the following to your details projection method (a fuller sketch follows below):
@RequestMapping("/url", params = {"projection"})
@PreAuthorize("hasRole('LOGGED_IN') and principal.user.id == #id")
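A rough sketch of what such a pair of controller methods could look like; the paths, UserService, and DTO types are invented for the example, and the details variant deliberately drops the id-matching check:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.security.access.prepost.PreAuthorize;
import org.springframework.web.bind.annotation.*;

@RestController
public class UserProjectionController {

    @Autowired
    private UserService userService; // hypothetical service returning view DTOs

    // Full view: only the logged-in user whose id matches may read it
    @PreAuthorize("hasRole('LOGGED_IN') and principal.user.id == #id")
    @GetMapping("/users/{id}")
    public UserDto getUser(@PathVariable Integer id) {
        return userService.findFullView(id);
    }

    // Public "details" projection: any logged-in user may read it
    @PreAuthorize("hasRole('LOGGED_IN')")
    @GetMapping(value = "/users/{id}", params = "projection=details")
    public UserDetailsDto getUserDetails(@PathVariable Integer id) {
        return userService.findDetailsView(id);
    }
}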
In Spring we've got the @ExposesResourceFor annotation, which can link our resource with other resources. Thanks to this, our value objects (representations) can know nothing about the actual resources.
Is there a way to do this in JAX-RS? I'm using Dropwizard with Jersey and Jackson, and all I see is the @InjectLinks annotation, which I can use in a value object like this:
public class UserGroup {

    @JsonProperty
    public String name;

    @InjectLinks(GroupsResource.class)
    public URI myResource;

    public UserGroup(String name) {
        this.name = name;
    }
}
But unfortunately my value objects should know nothing about resources, so I'm asking: can I do such linking at the level of resources, like the spring-hateoas linking in controllers mentioned above?
With @InjectLinks, you don't have to declare the links in your model class. You can create a "wrapper" representation class, as shown in the declarative-linking sample from the Jersey examples (though this solution is not really at the resource-class level as you wish).
Another possible solution (rather than declarative linking) is to use the JAX-RS 2.0 Link class and do the linking programmatically (with no ties to the Jersey implementation/annotations). You can either add the links to your response headers or add Link objects to your model classes (or use the wrapper class for this as well, so as not to invade your model classes).
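As a rough sketch of the programmatic, header-based approach (the resource class and path are invented for the example; only the javax.ws.rs.core.Link and Response APIs are standard JAX-RS 2.0):

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.Link;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.UriInfo;

@Path("/groups")
public class GroupsResource {

    @Context
    private UriInfo uriInfo;

    @GET
    @Path("{name}")
    public Response getGroup(@PathParam("name") String name) {
        UserGroup group = new UserGroup(name); // value object stays link-free

        // Build the link on the resource side and attach it as a Link header
        Link self = Link.fromUriBuilder(uriInfo.getAbsolutePathBuilder())
                        .rel("self")
                        .build();

        return Response.ok(group).links(self).build();
    }
}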
Some Resources
Declarative Hyperlinking
Using Link for headers
I have started to use SDN 3.0.0.M1 with Neo4j 2.0 (via the REST interface) and I want to use an existing graph.db with existing data.
I have no problem finding nodes created through SDN via hrRepository.save(myObject); but I can't fetch any existing node (not created through SDN) via hrRepository.findAll() or any other method, despite having manually added a __type__ property to these existing nodes.
I use a very simple repository to test that:
@Component
public interface HrRepository extends GraphRepository<Hr> {

    Hr findByName(String name);

    @Query("match (hr:hr) return hr")
    EndResult<Hr> GetAllHrByLabels();
}
And the named query GetAllHrByLabels works perfectly.
Is there an existing way to use the standard methods (findAll(), findByName()) on existing data without redefining the Cypher query?
I recently ran into the same problem when upgrading from SDN 2.x to 3.0. I was able to get it working by first following the steps in this article: http://maxdemarzi.com/2013/06/26/neo4j-2-0-is-coming/ to create and enable Neo4j Labels on the existing data.
From there, though, I had to get things working for SDN 3. As you encountered, to do this, you need to set the metadata correctly. Here's how to do that:
Consider a @NodeEntity called Person that inherits from AbstractNodeEntity (imports and extraneous code removed for brevity):
AbstractNodeEntity:
@NodeEntity
public abstract class AbstractNodeEntity {
    @GraphId private Long id;
}
Person:
@NodeEntity
@TypeAlias("Person") // <== This line added for SDN 3.0
public class Person extends AbstractNodeEntity {
    public String name;
}
As you know, in SDN 2.x a __type__ property is created automatically that stores the class name used by SDN to instantiate the node entity when it's read from Neo4j. This is still true, although in SDN 3.0 it's now specified using the @TypeAlias annotation, as seen in the example above. SDN 3.0 also adds new metadata in the form of Neo4j labels representing the class hierarchy, where the label for the node's own class is prefixed with an underscore (_).
For existing data, you can add these labels in Cypher (I just used the new web-based browser utility in Neo4j 2.0.1) like this:
MATCH (n {__type__:'Person'}) SET n:`_Person`:`AbstractNodeEntity`;
Just wash/rinse/repeat for the other @NodeEntity types you have.
There is also a Neo4j Label that gets created called SDN_LABEL_STRATEGY but it isn't applied to any nodes, at least in my data. SDN 3 must have created it automatically, as I didn't do so manually.
Hope this helps...
-Chris
Using SDN over REST is probably not the best idea performance-wise, just so you know.
Data not created with SDN won't have the necessary meta information.
You will have to iterate over the nodes manually and call
template.postEntityCreation(Node, Class);
on each of them to add the type information, where Class is your SDN-annotated entity class.
Something like:
for (Node n : template.query("match (n) where n.type = 'Hr' return n").to(Node.class)) {
    template.postEntityCreation(n, Hr.class);
}