I'm using Spring 3 + JPA 2 (Hibernate implementation) with Spring MVC's content negotiation resolver based on the JSON media type, and the following exception stack trace is thrown while using Spring Framework's
org.springframework.web.servlet.view.json.MappingJacksonJsonView
org.codehaus.jackson.map.JsonMappingException: No serializer found for class org.hibernate.proxy.pojo.javassist.JavassistLazyInitializer and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationConfig.Feature.FAIL_ON_EMPTY_BEANS) ) (through reference chain:..... )
at org.codehaus.jackson.map.ser.StdSerializerProvider$1.serialize(StdSerializerProvider.java:62)
at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:268)
at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:146)
at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:118)
at org.codehaus.jackson.map.ser.ContainerSerializers$IndexedListSerializer.serializeContents(ContainerSerializers.java:236)
at org.codehaus.jackson.map.ser.ContainerSerializers$IndexedListSerializer.serializeContents(ContainerSerializers.java:189)
at org.codehaus.jackson.map.ser.ContainerSerializers$AsArraySerializer.serialize(ContainerSerializers.java:111)
at org.codehaus.jackson.map.ser.StdSerializerProvider._serializeValue(StdSerializerProvider.java:296)
at org.codehaus.jackson.map.ser.StdSerializerProvider.serializeValue(StdSerializerProvider.java:224)
at org.codehaus.jackson.map.ObjectMapper.writeValue(ObjectMapper.java:925)
at org.springframewor
The solution provided in the thread below (using JsonAutoDetect) doesn't work for me:
Strange Jackson exception being thrown when serializing Hibernate object
Providing JsonAutoDetect and explicitly putting a JsonProperty annotation on the getter methods is not working either: it still tries to serialize all the attributes. I have managed to exclude certain attributes of the JPA entity class, the ones that are recursive or parent/child relations in nature, with @JsonIgnore. I've also tried detaching the entity, but no luck.
From my entity class, all Jackson needs to do is serialize three simple String attributes; the rest of the attributes are marked with @JsonIgnore.
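For illustration only, a minimal sketch of the shape described above; the class and field names are invented. The three plain String attributes stay visible to Jackson, and the parent/child style association that would otherwise surface as a Javassist proxy is excluded with @JsonIgnore:

import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.ManyToOne;

import org.codehaus.jackson.annotate.JsonIgnore;

@Entity
public class Customer {

    @Id
    private Long id;

    // the three simple String attributes Jackson should serialize
    private String firstName;
    private String lastName;
    private String email;

    // recursive parent/child relation, excluded so Jackson never touches the lazy proxy
    @JsonIgnore
    @ManyToOne(fetch = FetchType.LAZY)
    private Customer parent;

    // getters and setters omitted
}

If the proxy still leaks through some non-ignored association, another option is the one named in the exception itself, disabling SerializationConfig.Feature.FAIL_ON_EMPTY_BEANS on the ObjectMapper used by MappingJacksonJsonView, although that only hides empty proxies rather than serializing them.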
Please let me know if any of you have faced a similar issue and resolved it.
Related
I'm migrating a legacy application from Spring Core 4 to Spring Boot 2.5.2.
The application is using spring-data-rest (SDR) alongside spring-data-mongodb to handle our entities.
The legacy code was overriding SDR configuration by extending the RepositoryRestMvcConfiguration and overriding the bean definition for persistentEntityJackson2Module to remove serializerModifier and deserializerModifier.
@EnableWebMvc
@EnableSpringDataWebSupport
@Configuration
class RepositoryConfiguration extends RepositoryRestMvcConfiguration {

    ...
    ...

    @Bean
    @Override
    protected Module persistentEntityJackson2Module() {
        // Remove the existing serializer/deserializer modifiers because Spring Data REST
        // expects linked resources to be in href form. Our platform is not tailored for it yet.
        return ConverterHelper.configureSimpleModule((SimpleModule) super.persistentEntityJackson2Module())
                .setDeserializerModifier(null)
                .setSerializerModifier(null);
    }
}
This was done to avoid having to handle DBRef properties as href links when posting entities: we pass the plain POJO instead of the href and persist it manually before the owning entity.
Following the migration there is no longer a way to set the same overridden configuration, but to avoid altering all our creation processes we would like to keep passing the plain POJO even for DBRef properties.
Here is an example of what was working before.
We have the entity we want to persist:
public class EntityWithDbRefRelation {

    ....

    @Valid
    @CreateOnTheFly // Custom annotation to create the dbrefEntity before persisting the current entity
    @DBRef
    private MyDbRefEntity myDbRefEntity;
}
the DbRefEntity:
public class MyDbRefEntity {

    ...

    private String name;
}
and the JSON POST request we are doing:
POST base-api/entityWithDbRefRelations
{
    ...
    "myDbRefEntity": {
        "name": "My own dbRef entity"
    }
}
In our database this request creates the myDbRefEntity and then creates the target entityWithDbRefRelation with a DBRef linked to the other entity.
Following the migration, the DBRef is never created: when deserializing the JSON into the persistent entity, myDbRefEntity is ignored because SDR expects an href instead of a complex object.
I see 3 solutions:
1. Modify all our processes to first create the DBRef entity through one request, then create our entity with a link to it. This is compliant with SDR but very costly, as we have a lot of services creating entities through this backend.
2. Define our own REST MVC controllers for these operations, bypassing the SDR mapping mechanism.
3. Add AOP around persistentEntityJackson2Module in RepositoryRestMvcConfiguration to set the serializerModifier and deserializerModifier to null (a rough sketch of this idea follows the list). I really prefer to avoid this solution, as Spring Boot must have removed this way of configuring it on purpose, and it could break when migrating to a newer version.
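For what it's worth, here is a rough, untested sketch of the idea behind option 3 without resorting to AOP: a BeanPostProcessor that intercepts the module bean and clears both modifiers. The bean name persistentEntityJackson2Module and the cast to SimpleModule are taken from the legacy override above; whether nulling the modifiers is still enough in current Spring Data REST versions is exactly the open question here.

import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.stereotype.Component;

import com.fasterxml.jackson.databind.module.SimpleModule;

@Component
public class PersistentEntityModuleCustomizer implements BeanPostProcessor {

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) {
        // "persistentEntityJackson2Module" is the bean name used by RepositoryRestMvcConfiguration;
        // clearing both modifiers mirrors the legacy override so DBRef properties
        // would be read as plain objects instead of href links.
        if ("persistentEntityJackson2Module".equals(beanName) && bean instanceof SimpleModule) {
            ((SimpleModule) bean)
                    .setSerializerModifier(null)
                    .setDeserializerModifier(null);
        }
        return bean;
    }
}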
Does anyone know a way to keep treating the property as a complex object instead of an href link, other than my 3 previous points?
Tell me if you need more information and thanks in advance for your help!
I have a controller with a method like
@PostMapping(value = "/{reader}")
public String addToReadingList(@PathVariable("reader") String reader, Book book) {
    book.setReader(reader);
    readingListRepository.save(book);
    return "redirect:/readingList/{reader}";
}
When I run a static code analysis with SonarQube I get a vulnerability report stating that
Replace this persistent entity with a simple POJO or DTO object
But if I use a DTO (which has exactly the same fields as the entity class), then I get another error:
1 duplicated blocks of code must be removed
What should be the right solution?
Thanks in advance.
Enric
You should build a new, separate class which represents your entity ("Book") as a Plain Old Java Object (POJO) or Data Transfer Object (DTO). This rule is important if you use JSF or another stateful technology: if your entity is attached to an open JPA session, changes to it may be written back to your database (e.g. if you call a setter in JSF on a stateful bean).
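As a minimal sketch of that idea, assuming Book exposes setTitle and setAuthor setters (invented names; only setReader appears in the question): a BookDto carries the form fields, and the controller copies them into a persistent entity it creates itself.

public class BookDto {

    private String title;
    private String author;

    public String getTitle() { return title; }
    public void setTitle(String title) { this.title = title; }

    public String getAuthor() { return author; }
    public void setAuthor(String author) { this.author = author; }
}

@PostMapping(value = "/{reader}")
public String addToReadingList(@PathVariable("reader") String reader, BookDto dto) {
    // The persistent entity never leaves the server side, so crafted request
    // parameters cannot populate unexpected entity fields.
    Book book = new Book();
    book.setTitle(dto.getTitle());
    book.setAuthor(dto.getAuthor());
    book.setReader(reader);
    readingListRepository.save(book);
    return "redirect:/readingList/{reader}";
}

The DTO intentionally duplicates a couple of entity fields; that duplication is the trade-off this Sonar rule asks for, and the duplication finding can be reviewed as a false positive or the copying delegated to a mapper such as Dozer.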
For my projects I ignore this Sonar rule for two reasons:
1. I always use REST, and REST maps my Java class into JSON, which can be seen as a DTO.
2. REST is stateless (no server session), so no database transaction will remain open after the transformation to JSON.
The following information is taken from the official SonarSource documentation:
On one side, Spring MVC automatically bind request parameters to beans declared as arguments of methods annotated with @RequestMapping. Because of this automatic binding feature, it’s possible to feed some unexpected fields on the arguments of the @RequestMapping annotated methods.

On the other end, persistent objects (@Entity or @Document) are linked to the underlying database and updated automatically by a persistence framework, such as Hibernate, JPA or Spring Data MongoDB.

These two facts combined together can lead to malicious attack: if a persistent object is used as an argument of a method annotated with @RequestMapping, it’s possible from a specially crafted user input, to change the content of unexpected fields into the database.

For this reason, using @Entity or @Document objects as arguments of methods annotated with @RequestMapping should be avoided.

In addition to @RequestMapping, this rule also considers the annotations introduced in Spring Framework 4.3: @GetMapping, @PostMapping, @PutMapping, @DeleteMapping, @PatchMapping.
See More Here
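As an aside, and not part of the quoted documentation: where a DTO is not practical, the underlying mass-assignment risk can also be reduced by restricting what Spring MVC is allowed to bind, for instance with @InitBinder. The field names below are assumptions, and this does not remove the Sonar finding itself.

// imports belong at the top of the controller class
import org.springframework.web.bind.WebDataBinder;
import org.springframework.web.bind.annotation.InitBinder;

// Inside the controller that still binds the Book entity directly
@InitBinder
public void restrictBookBinding(WebDataBinder binder) {
    // Prevent request parameters from populating fields that must only be set server-side
    binder.setDisallowedFields("id", "reader");
}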
I'm using the Spring LDAP (docs) library in a Grails application. I have a class annotated with the @Entry annotation, so it is mapped to an LDAP server. This all works quite beautifully.
However, when I add the Grails @Validateable annotation (to enable validating the LDAP class similarly to Grails domain classes) and attempt to retrieve data from LDAP (i.e. a findAll operation on the LdapUserRepo, or similar), I get the following exception:
Message: Missing converter from class java.lang.String to interface org.springframework.validation.Errors, this is needed for field errors on Entry class com.ldap.portal.LdapUser
Basically, it seems like the AST transformation performed by the @Validateable annotation is producing extra fields (namely the errors field) on the LdapUser object. It appears that Spring LDAP, in processing the @Entry logic, assumes a default mapping for the added errors field (probably interpreting it as a string attribute on the LDAP object). When it gets nothing from the LDAP server, it attempts to set the field of type ValidationErrors to a value of type String -- an empty string.
I did some looking on GitHub and found this code that seems relevant and may support my theory.
My question is: is this behavior expected for annotations, and how can one prevent fields added by one annotation from being inappropriately processed by another annotation?
At present the best workaround I've come up with for my specific issue is to add an errors field to my LdapUser object and mark it as transient (so that LDAP ignores it):
@Transient
ValidationErrors errors
I'm struggling with a problem and cannot find a proper solution, or even the cause, in the Hibernate docs, the sources, or on Stack Overflow.
I have a Spring/Hibernate application with DAO-Service-RPC layers, where the DAO provides Hibernate entities and the Service provides DTOs for RPC. Therefore I'm converting (mapping with Dozer) entities to DTOs in the service methods, and mapping DTOs back to entities there as well.
The mapping is as follows (not the full method; checks omitted):
@Transactional
public void updateAuthor(Author author) {
    AuthorEntity existingEntity = this.authorDao.findById(author.getId());
    this.authorDao.detach(existingEntity);
    this.authorAssembler.toEntity(author, existingEntity, null);
    this.authorDao.merge(existingEntity);
}
I have unit test classes annotated with @Transactional to avoid test data bleeding. Now I've realized that there is something going on in the service that I don't understand.
When my test class is annotated with @Transactional, calling detach() seems to work (i.e. Hibernate does not report org.hibernate.NonUniqueObjectException: A different object with the same identifier value was already associated with the session), but the @Version number on the entity is not incremented properly (as if the parent (unit test) TX were still holding on).
When I remove the annotation from the test class, the mapping throws org.hibernate.LazyInitializationException: failed to lazily initialize a collection, which makes sense on its own, because the entity is detached from the session. What I don't understand is why this exception is NOT thrown when the test class is annotated. As I understand it, the entity is detached from the current session in both cases.
My other question is, assuming the entity is behaving correctly, how to avoid missing such errors in unit tests while still avoiding test data bleeding, since this type of error seems to manifest only with an unannotated test class.
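For reference, a stripped-down sketch of the kind of test involved; the class, bean and method names here are made up, only updateAuthor comes from the snippet above. Toggling the class-level @Transactional is what changes the behaviour described.

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml")
@Transactional // present: no LazyInitializationException, but @Version not bumped;
               // absent: toEntity() fails on lazy collections of the detached entity
public class AuthorServiceTest {

    @Autowired
    private AuthorService authorService;

    @Test
    public void updateAuthorIncrementsVersion() {
        Author author = authorService.findById(1L); // hypothetical lookup method
        author.setName("Updated name");
        authorService.updateAuthor(author);
        // assertions on the version number omitted
    }
}

With the class-level annotation present, the Spring test framework opens a transaction around the whole test and rolls it back afterwards, so the service method participates in that outer transaction rather than running in one of its own.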
Thank you!
(JUnit4, Spring 4.0.2.RELEASE, Hibernate 4.3.1.Final)
I use Spring Roo + JPA + Hibernate and I would like to implement cross-validation (validation of several fields at the same time) in my application.
I am not sure how to go about implementing it. Can anyone please advise me and/or direct me to relevant documentation?
Have a look at Hibernate Validator, which allows entity validation (using annotations).
http://www.hibernate.org/subprojects/validator.html
In short, you declare your field constraints by placing Hibernate Validator / Bean Validation annotations above them (e.g. @Min(10)) and use the following piece of code to find any invalid fields:
ValidatorFactory factory = Validation.byDefaultProvider().configure().traversableResolver(new CustomTraversableResolver()).buildValidatorFactory();
Validator validator = factory.getValidator();
Set<ConstraintViolation<BaseValidationObject>> constraintViolations = validator.validate(myEntityToValidate);
If you need to validate specific relationships between entities, you can write custom validators to fit that need.
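As a minimal sketch of that last point, here is a class-level constraint that validates two fields of the same object at once; the annotation, entity and field names are invented for illustration.

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.Date;

import javax.validation.Constraint;
import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;
import javax.validation.Payload;

// Class-level constraint: placed on the entity, so the validator sees all fields at once
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Constraint(validatedBy = DateRangeValidator.class)
@interface ValidDateRange {
    String message() default "endDate must not be before startDate";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};
}

// Hypothetical entity holding the two fields being cross-validated
@ValidDateRange
class Booking {
    Date startDate;
    Date endDate;
}

class DateRangeValidator implements ConstraintValidator<ValidDateRange, Booking> {

    @Override
    public void initialize(ValidDateRange constraintAnnotation) {
        // no configuration needed
    }

    @Override
    public boolean isValid(Booking booking, ConstraintValidatorContext context) {
        if (booking == null || booking.startDate == null || booking.endDate == null) {
            return true; // leave null checks to @NotNull on the individual fields
        }
        return !booking.endDate.before(booking.startDate);
    }
}

Running the validator.validate(...) call shown above on such an instance would then report a violation whenever both dates are set and the end date precedes the start date.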