I'm trying to use GWT + Spring + Hibernate.
When launching the application I get this error:
com.google.gwt.user.client.rpc.SerializationException: Type 'org.hibernate.collection.PersistentBag' was not included in the set of types which can be serialized by this SerializationPolicy or its Class object could not be loaded. For security purposes, this type will not be serialized.: instance = [com.asso.shared.model.Activite#64d6357a]
After using the following method on the lists from my persistence classes:
public static <T> ArrayList<T> makeGWTSafe(List<T> list) {
    if (list instanceof ArrayList) {
        return (ArrayList<T>) list;
    } else {
        ArrayList<T> newList = new ArrayList<T>();
        newList.addAll(list);
        return newList;
    }
}
I still get this:
com.google.gwt.user.client.rpc.SerializationException: Type 'org.hibernate.collection.PersistentBag' was not included in the set of types which can be serialized by this SerializationPolicy or its Class object could not be loaded. For security purposes, this type will not be serialized.: instance = [com.asso.shared.model.Personne#75a2fb58]
==========================================
I have searched other questions but I can't find any solution.
How can I solve this serialization problem?
I'm using List in my persistence classes.
You need to send DTO objects to the client side (instead of the original ones backed by Hibernate). The problem is that your Personne object is actually a Hibernate proxy. Each time you call a method on it, Hibernate may do some work behind the scenes (fetch collections from the DB, for example). There is no simple way to serialize such objects.
Hibernate entities:
// Hibernate entity
public class Personne {
    private String name;
    private List<Address> addresses;
}

// Hibernate entity
public class Address {
}
Corresponding DTO objects:
public class PersonneDto {
    private String name;
    private List<AddressDto> addresses;
}

public class AddressDto {
}
Instead of sending Personne to the client side, you need to create a new PersonneDto object, copy the state into it, and then send it to the UI. Personne cannot be used on the client side because Personne.getAddresses() will in most cases hit the DB to fetch data (which is impossible to do in client-side JS). So each Personne must be replaced by a PersonneDto on the client side. As a downside, you need to maintain an additional layer of DTO objects and the corresponding code to transform entities into DTOs. There are other approaches to this problem; see this article for more details.
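For illustration, a conversion helper might look like the sketch below. It assumes conventional getters and setters on the classes above and a matching toDto method for Address (both assumptions, not code from the question):

public static PersonneDto toDto(Personne personne) {
    PersonneDto dto = new PersonneDto();
    dto.setName(personne.getName());                      // plain fields are copied directly
    ArrayList<AddressDto> addresses = new ArrayList<AddressDto>();
    for (Address address : personne.getAddresses()) {     // runs on the server, so Hibernate can still fetch the collection
        addresses.add(toDto(address));                    // assumed Address -> AddressDto counterpart
    }
    dto.setAddresses(addresses);                          // plain ArrayList, safe for GWT-RPC
    return dto;
}

Call this on the server side, while the Hibernate session is still open, before returning from your RPC service method.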
I'm using spring-data-mongodb at the moment, so this question is primarily in the context of MongoDB, but I suspect it applies to repository code in general.
Out of the box, when using a MongoRepository<T, ID> interface (or any other Repository<T, ID> descendant), the entity type T is expected to be the document type (the type that defines the document schema).
As a result, injecting such a repository into a service component means the repository leaks database schema information into the service tier (highly pseudo-code):
class MyModel {
    UUID id;
}

@Document
class MyDocument {
    @Id
    String id;
}

interface MyRepository extends MongoRepository<MyDocument, String> {
}

class MyService {
    MyRepository repository;

    MyModel getById(UUID id) {
        var documentId = convert(id, ...);
        var matchingDocument = repository.findById(documentId).orElse(...);
        var model = convert(matchingDocument, ...);
        return model;
    }
}

Whilst ideally I'd want to do this:
class MyModel {
    UUID id;
}

@Document
class MyDocument {
    @Id
    String id;
}

@Configuration
class MyMagicConversionConfig {
    ...
}

class MyDocumentToModelConverter implements Converter<MyDocument, MyModel> {
    ...
}

class MyModelToDocumentConverter implements Converter<MyModel, MyDocument> {
    ...
}

// Note that the model and the model's ID type are used in the repository declaration
interface MyRepository extends MongoRepository<MyModel, UUID> {
}

class MyService {
    MyRepository repository;

    MyModel getById(UUID id) {
        // Repository now returns the model because it was converted upstream
        // by the mongo persistence layer.
        var matchingModel = repository.findById(id).orElse(...);
        return matchingModel;
    }
}
Defining this conversion once seems significantly more practical than having to do it consistently throughout the service code, so I suspect I'm just missing something.
But of course this requires some way to inform the mongo mapping layer of the conversion that has to be applied to move between MyModel and MyDocument, and to use the latter as its actual source of mapping metadata (e.g., @Document, @Id, etc.).
I've been fiddling with custom converters, but I just can't seem to make the MongoDB mapping component do the above.
My two questions are:
Is it currently possible to define custom converters or implement callbacks that allow me to define and implement this model <-> document conversion once and abstract it away from my service tier?
If not, what is the idiomatic way to clean this up so that the service layer can stay blissfully unaware of how, or with what schema, an entity is persisted? A lot of Spring Boot codebases appear to be fine with using the type that defines the database schema as their model, but that seems suboptimal. Suggestions welcome!
Thanks!
I think you're blowing things a bit out of proportion. The service layer is not aware of the schema. It is aware of the types returned by the repository. How the properties of those are mapped onto the schema depends on the object-document mapping. This, by default, uses the property name, as that's the most straightforward thing to do. That translation can be customized either using annotations on the document type or by registering a FieldNamingStrategy with Spring Data MongoDB.
Spring Data MongoDB's object-document mapping subsystem provides a lot of customization hooks that allow transforming arbitrary MongoDB documents into entities. The types which the repositories return are your domain objects that - again, only by default - are mapped onto a MongoDB document 1:1, simply because that's the most reasonable thing to do in the first place.
If really in doubt, you can manually implement individual repository methods that use the MongoTemplate API, which lets you explicitly define the type the data should be projected into.
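For example, a manually implemented repository fragment could query the document collection with MongoTemplate but project the results into the model type. This is only a sketch: the MyRepositoryCustom fragment, the "myDocuments" collection name, and the convert helper are assumptions, not API from the question:

interface MyRepositoryCustom {
    Optional<MyModel> findModelById(UUID id);
}

class MyRepositoryImpl implements MyRepositoryCustom {

    private final MongoTemplate mongoTemplate;

    MyRepositoryImpl(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    @Override
    public Optional<MyModel> findModelById(UUID id) {
        String documentId = convert(id); // assumed UUID -> String conversion
        Query query = Query.query(Criteria.where("_id").is(documentId));
        // The target type is stated explicitly: documents from the collection
        // are projected into MyModel rather than MyDocument.
        return Optional.ofNullable(mongoTemplate.findOne(query, MyModel.class, "myDocuments"));
    }
}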
You can use something like MapStruct or write your own Singleton Mapper.
Then create default methods in your repository:
interface DogRepository extends MongoRepository<DogDocument, String> {

    DogDocument findById(String id);

    default DogModel dogById(String id) {
        return DogMapper.INSTANCE.toModel(
                findById(id)
        );
    }
}
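For reference, the DogMapper used above could be a MapStruct interface along these lines (a sketch; the field-by-field mapping code is generated by MapStruct at build time):

@Mapper
public interface DogMapper {

    DogMapper INSTANCE = Mappers.getMapper(DogMapper.class); // generated singleton

    DogModel toModel(DogDocument document);   // fields with matching names are copied automatically
}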
I have a question concerning the representation model processors of Spring HATEOAS. We are experimenting with processing models before serializing them to the client. Our use case is to enrich the imageUrl field of UserModel objects at runtime, as we have to build the URL from values in a config bean (the AWS S3 bucket URL differs between our DEV and PROD setups).
@Data
public class UserModel {
    // ...
    private String imageUrl;
}
Therefore, we create a UserProcessor to implement this:
public class UserProcessor implements RepresentationModelProcessor<EntityModel<UserModel>> {

    private final ConfigAccessor configAccessor;

    public UserProcessor(ConfigAccessor configAccessor) {
        this.configAccessor = configAccessor;
    }

    @Override
    public EntityModel<UserModel> process(EntityModel<UserModel> model) {
        if (model.getContent() != null) {
            // do the enrichment and set the "imageUrl" field
        }
        return model;
    }
}
This works perfectly if we have a controller method like this:
@ResponseBody
@GetMapping("/me")
public EntityModel<UserModel> getCurrentUser(@AuthenticationPrincipal Principal principal) {
    UserModel user = ... // get user model
    return EntityModel.of(user);
}
However, we are now struggling with the enrichment whenever a UserModel is referenced in another model class, e.g., the BookModel:
@Data
public class BookModel {
    private String isbn;
    // ...
    private EntityModel<UserModel> user; // or "private UserModel user;"
}
A controller method returning EntityModel<BookModel> only applies the processor for that type, but not for the types it references. It seems the processors are not applied recursively.
Is this intentional or are we doing something wrong?
Thanks for any input and help,
Michael
I encountered the same issue and resolved it by manually assembling resources. In your case, that means implementing a RepresentationModelAssembler for the BookModel and then manually invoking the processor on the UserModel object inside the book.
Make the outer resource a representation model
First, make BookModel extend RepresentationModel so that you can manually add links and assemble inner resources (which you want for the EntityModel<UserModel> object):
@Data
public class BookModel extends RepresentationModel<BookModel> { ... }
Write a model assembler
Now write the assembler that takes your book entity and transforms it into a representation model, or a collection of these models. Here you implement what EntityModel.of(...) otherwise does for you automagically.
@Component
public class BookModelAssembler implements RepresentationModelAssembler<Book, BookModel> {

    @Autowired
    private UserProcessor userProcessor;

    @Override
    public BookModel toModel(Book entity) {
        var bookModel = new BookModel(entity); // map fields from entity to model

        // Transform the user entity to an entity model of user
        var user = entity.getUser();
        EntityModel<UserModel> userModel = EntityModel.of(user);
        userModel = userProcessor.process(userModel);
        bookModel.setUserModel(userModel);

        return bookModel;
    }
}
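A controller can then return the assembled model directly. A sketch, assuming a standard bookRepository and the assembler injected as bookModelAssembler:

@GetMapping("/books/{id}")
public BookModel getBook(@PathVariable Long id) {
    Book book = bookRepository.findById(id).orElseThrow();
    return bookModelAssembler.toModel(book);   // the user processor runs inside toModel
}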
I might be going out on a limb, but I suppose the reason for this is that processors are only invoked when an MVC endpoint returns a type with a registered processor, so processors for embedded types are never triggered. My reasoning is based on the docs for RepresentationModelProcessor, which state that a processor processes representation models returned from Spring MVC controllers.
Right now, I have an @Entity, say Car, that has a certain set of attributes. This gets persisted into a database.
Now, in the @RestController, if I want to accept a Car parameter minus certain properties, how do I do that? Right now, I am creating a different class called CarInput that is the same as Car minus those properties.
Again, the same thing for the REST API response: if I want to return a Car but with a certain field removed, right now I create a CarResponse model.
Is there a cleaner way to do this?
I'd make the case that your external representation and your internal storage should hardly ever be identical. Sure, there'll be significant overlap, but your database and your API should be as independent from each other as possible.
I'd say it's a good practice to have separate domain models for the model and view layer (read: two different Car classes, in different packages). You can use a mapping framework like Dozer or MapStruct to map back and forth between these different entity types.
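For instance, a MapStruct mapper between the two Car classes might look like this (a sketch; the package names are assumptions):

@Mapper(componentModel = "spring")   // generated implementation is exposed as a Spring bean
public interface CarMapper {

    com.example.api.Car toApiModel(com.example.persistence.Car entity);

    com.example.persistence.Car toEntity(com.example.api.Car apiModel);
}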
There are two common approaches to this problem.
Use @JsonIgnore on the fields/getters that you want to exclude. However, this can lead to annotation hell or generally hard-to-read code.

@JsonIgnore
private String password;

Create a DTO class that data is deserialized from or serialized to. That is, when a user makes a POST request with a car definition, it is deserialized by Spring into a CarDto, which you then map in the service layer to a Car object that you can save to the database. Similarly, a Car object is mapped to a CarDto when the user asks for data.
@GetMapping("/{userId}")
UserDto getUser(@PathVariable Long userId) {
    return userService.getUser(userId);
}

@PostMapping
UserDto addUser(@RequestBody UserDto userDto) {
    return userService.createUser(userDto);
}
This one, on the other hand, can lead to a situation where you sometimes use the DTO and sometimes the class itself. Because of that, consider mapping to/from CarDto only in the controller layer (unlike in the example above).
Also, it's good to avoid declaring two classes in one file; it makes it hard to find the desired class afterwards.
You can still avoid using a DTO class.
When you POST a Car object to the controller, you can check the wanted properties and operate on them.
For selecting which fields to return in the response, you can use JSON views.
Entity:

public class Car {

    private String color;

    @JsonView(Views.Public.class)
    private Integer weight;

    // getters, setters
}
Controller:

@RestController
public class CarController {

    @Autowired
    private CarRepository carRepository;

    @GetMapping("/{id}")
    @JsonView(Views.Public.class)
    public Car get(@PathVariable Long id) {
        return carRepository.findOne(id);
    }

    @PostMapping
    public void update(@RequestBody Car car) {
        // only properties we want to update
        if (car.getColor() != null) {
            // save in database or other operations
        }
    }
}
View:

public class Views {
    public static class Public {
    }
}
This way, the controller's "get" method sends only the "weight" property to the client, and the "update" method operates only on selected properties.
Say I have a class structure as follows; it is pretty basic inheritance:

class Manager extends Person {
    private String name;

    Manager() {
    }
}

class Clerk extends Person {
    private String salary;
}
In Spring Data, if I store these in Mongo, is it possible to configure it to map to the correct class when I do a getById? I assume I will have to store some class info?
What I don't want is to have to create separate repository classes if I can avoid it; I also don't know what the object will be when I do a getById.
If you are using spring-data-mongodb's MongoRepository to write data to your database according to your entity model, a _class field is added to document roots and to complex property types (see this section). This field stores the fully qualified name of the Java class and allows disambiguation when mapping from a MongoDB document to the Spring data model.
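For example, a stored Manager might look roughly like this (an illustrative document only; the package name and values are assumptions):

{
  "_id": ObjectId("..."),
  "name": "Alice",
  "_class": "com.example.Manager"
}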
However, if you only use MongoRepository to read from your database, you need to tell Spring Data explicitly how to map your entities. You will need to Override Mapping with Explicit Converters.
PersonReadConverter.class
public class PersonReadConverter implements Converter<Document, Person> {

    @Override
    public Person convert(Document source) {
        if (source.get("attribute_specific_to_Clerk") != null) {
            Clerk clerk = new Clerk();
            // Set attributes using setters or a defined constructor
            return clerk;
        } else {
            Manager manager = new Manager();
            // Set attributes using setters or a defined constructor
            return manager;
        }
    }
}
Then, you have to Register Spring Converters with the MongoConverter.
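In a Spring Boot application, a minimal registration could look like this (sketch):

@Configuration
public class MongoConversionConfig {

    @Bean
    public MongoCustomConversions mongoCustomConversions() {
        // Register the read converter so the MongoConverter uses it
        // when mapping documents to Person.
        return new MongoCustomConversions(List.of(new PersonReadConverter()));
    }
}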
You can find an example of my own at: Spring Data Mongo - How to map inherited POJO entities?
Currently I have a requirement where our backend Spring REST API receives data in an encrypted JSON format (a few fields are encrypted and a few fields are plain text),
applies the decryption, then applies some business logic to the data, and finally stores the data in a database.
This decryption logic is repeated in multiple service implementation methods,
so we decided to isolate the decryption logic from the actual business logic.
I am using Spring AOP to decrypt the data, and after decryption the same
object is passed to the service layer methods.
But my service layer methods take different types of objects as arguments.
Ex:
processEmployee(EmployeeRequest request)
processStudent(StudentRequest request)
I was looking for a way to dynamically change
the data on the fields of the same object (e.g., EmployeeRequest, StudentRequest).
I have tried the following approach and am stuck at the 4th step:
1. Introduced a new annotation.
2. Annotated the fields that hold encrypted data.
3. Retrieved all the annotated fields.
4. For each field, apply the decryption logic and
inject the decrypted data back into the same field.
I was looking for an API to achieve the 4th step.
Is there any API available to dynamically execute
methods on the same object,
or any reference? Please point me to it.
My suggestion is not to use the same POJO for both the encrypted and the decrypted data. It makes future usage confusing (if I receive an EmployeeRequest instance, is it decrypted or not?), and it also limits the types (your encrypted and decrypted data must have the same type).
Now, for the implementation, you have two choices:
Using explicit ConversionService
Register a converter:
@Component
public class EmployeeRequestConverter implements Converter<EmployeeRequest, EmployeeRequest> {

    @Override
    public EmployeeRequest convert(EmployeeRequest source) {
        // Apply your decryption logic here, then return the decrypted request
        return source;
    }
}
Make similar converters for the other request objects.
Now in your controller:
public class MyController {
private ConversionService conversionService;
private MyService myService;
#RequestMapping(...)
public void aRequest(#RequestBody EmployeeRequest request) {
myService.execute(conversionService.convert(request, EmployeeRequest.class));
}
}
Using reflection.
Precondition: you have an @Encrypted annotation on the encrypted fields.
Unlike the first solution, you do not create an explicit converter for each type of request.
@Service
public class DecryptionService {

    public <T> T decrypt(T input) {
        Field[] fields = input.getClass().getDeclaredFields();
        for (Field field : fields) {
            Encrypted encrypted = field.getAnnotation(Encrypted.class);
            if (encrypted != null) {
                try {
                    field.setAccessible(true);
                    Object val = field.get(input);
                    // Based on the @Encrypted annotation and val, do the decryption
                    Object decryptedVal = ...;
                    field.set(input, decryptedVal);
                } catch (Exception ex) {
                    // handle or log reflection errors
                }
            }
        }
        return input;
    }
}
Now you can apply this service in your controllers.
You might want to cache the Class.getDeclaredFields() results and the mapping between Class<?> -> @Encrypted fields for performance.
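A simple per-class cache of the annotated fields might look like this (sketch):

private final Map<Class<?>, List<Field>> encryptedFieldCache = new ConcurrentHashMap<>();

private List<Field> encryptedFields(Class<?> type) {
    // Reflectively scan each class only once; later calls hit the cache.
    return encryptedFieldCache.computeIfAbsent(type, t ->
            Arrays.stream(t.getDeclaredFields())
                  .filter(f -> f.isAnnotationPresent(Encrypted.class))
                  .collect(Collectors.toList()));
}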