How to get @TypeAlias working when reading a document after upgrading to Spring Boot 2.1.3, when it was working in 1.4.5

I am currently using Spring Data MongoDB, and my configuration class extends AbstractMongoConfiguration:
@Configuration
@EnableMongoRepositories(basePackages = "com.mycompany")
@EnableMongoAuditing
public class MongoConfig extends AbstractMongoConfiguration
{
I override the getMappingBasePackage() method to set the package to scan, like this:
@Override
protected String getMappingBasePackage()
{
return "com.mycompany";
}
I have been debugging through the code and noticed some interesting things:
There are two places where I get a java.lang.InstantiationError. Both occur when I am trying to read in a document from Mongo that has a reference to an abstract class (ParentClass). It tries to instantiate the abstract class instead of finding the @TypeAlias annotation I have added to the child classes.
This is what my ParentClass looks like:
@Document
@JsonTypeInfo(use=JsonTypeInfo.Id.NAME, include=JsonTypeInfo.As.EXISTING_PROPERTY, visible=true, property="type")
@JsonSubTypes({
@Type(value=Child1.class, name="JSON_TYPE_CHILD1"),
@Type(value=Child2.class, name="JSON_TYPE_CHILD2"),
@Type(value=Child3.class, name="JSON_TYPE_CHILD3")
})
public abstract class ParentClass
{
...
My child classes look like this:
@Document
@JsonTypeName("JSON_TYPE_CHILD1")
@TypeAlias("ALIAS_TYPE_CHILD1")
public class Child1 extends ParentClass
{
...
This is what the JSON (simplified) that I am trying to read in looks like:
{
"_id" : ObjectId("5c86d31388f13344f4098c64"),
"listOfWrapperClass" : [
{
"parentClass" : {
"type" : "JSON_TYPE_CHILD1",
"prop1" : 50.0,
"prop2" : 50.0,
"_class" : "ALIAS_TYPE_CHILD1"
},
"isReportOutOfDate" : false
}
],
"_class" : "com.mycompany.domain.job.Job"
}
When I debug through Spring Data, the problem occurs in DefaultTypeMapper:
private TypeInformation<?> getFromCacheOrCreate(Alias alias) {
Optional<TypeInformation<?>> typeInformation = typeCache.get(alias);
if (typeInformation == null) {
typeInformation = typeCache.computeIfAbsent(alias, getAlias);
}
return typeInformation.orElse(null);
}
It loads the wrapper class fine, but when it gets to the child class, the alias is set to "ALIAS_TYPE_CHILD1" as it should be, yet the typeCache contains the following values:
{
NONE=Optional.empty,
ALIAS_TYPE_CHILD1=Optional.empty,
com.mycompany.domain.job.Job=Optional[com.mycompany.domain.job.Job]
}
Because the key "ALIAS_TYPE_CHILD1" maps to Optional.empty, the code doesn't resolve the correct target type to load, and it therefore falls back to the rawType, which is ParentClass. That blows up because an abstract class can't be instantiated. Here is the stack trace:
Caused by: java.lang.InstantiationError: com.mycompany.domain.job.base.ParentClass
at com.mycompany.domain.job.base.ParentClass_Instantiator_q3kytg.newInstance(Unknown Source)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator$EntityInstantiatorAdapter.createInstance(ClassGeneratingEntityInstantiator.java:226)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator.createInstance(ClassGeneratingEntityInstantiator.java:84)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:272)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:245)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readValue(MappingMongoConverter.java:1491)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1389)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readProperties(MappingMongoConverter.java:378)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.populateProperties(MappingMongoConverter.java:295)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:275)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:245)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readCollectionOrArray(MappingMongoConverter.java:1038)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readValue(MappingMongoConverter.java:1489)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1389)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readProperties(MappingMongoConverter.java:378)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.populateProperties(MappingMongoConverter.java:295)
at ...
The weird thing is that if I first insert a new document that has @TypeAlias("ALIAS_TYPE_CHILD1"), the typeCache mentioned above is populated correctly, like so:
{
NONE=Optional.empty,
ALIAS_TYPE_CHILD1=Optional[com.mycompany.domain.job.base.Child1],
com.mycompany.domain.job.Job=Optional[com.mycompany.domain.job.Job]
}
When I do a findOne right after the insert, I can read the document without error, because Child1 is used to instantiate the POJO instead of ParentClass. If I try to read first, it doesn't matter whether I insert afterwards or not, because the typeCache gets the wrong value and uses it until the server is restarted.
My guess is there was a change in configuration or in a default setting. I was able to work through all the other upgrade issues, but this one has me baffled. I would be shocked if this were an actual issue in Spring Data, because somebody would surely have run into it by now; I can't be the only one trying to use @TypeAlias with spring-data-mongodb. Not to mention this all works great with the previous version of Spring Boot I used (1.4.5, which uses spring-data-mongodb 1.9.8.RELEASE).
Any thoughts or advice on what to try next are welcome. I am simply at a loss for what to do next.

The problem was that the typeCache wasn't getting populated on server startup in the first place. This is because protected String getMappingBasePackage() is now deprecated. Use protected Collection<String> getMappingBasePackages() instead, and everything works great.
Overriding this method solves the issue:
@Override
protected Collection<String> getMappingBasePackages()
{
return Arrays.asList("com.mycompany");
}
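Pulled together, the fixed configuration might look like the sketch below. The connection details and database name are placeholders for illustration, since the question doesn't show them:

```java
@Configuration
@EnableMongoRepositories(basePackages = "com.mycompany")
@EnableMongoAuditing
public class MongoConfig extends AbstractMongoConfiguration
{
    // Connection details are assumptions, not taken from the question.
    @Override
    public MongoClient mongoClient()
    {
        return new MongoClient("localhost", 27017);
    }

    @Override
    protected String getDatabaseName()
    {
        return "mydb";
    }

    // Replaces the deprecated getMappingBasePackage(): entity scanning now
    // finds the @TypeAlias-annotated child classes at startup, so the type
    // cache is populated before the first read.
    @Override
    protected Collection<String> getMappingBasePackages()
    {
        return Arrays.asList("com.mycompany");
    }
}
```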

Related

Quarkus extension using a repository based on PanacheMongoRepository

I'm currently working on a Quarkus extension which is basically a filter that uses a PanacheMongoRepository. Here is a code snippet (this is in the runtime part of the extension):
@Provider
@Priority(Priorities.AUTHORIZATION)
@AuthorizationSecured
public class AuthorizationFilter implements ContainerRequestFilter {
// Some injection here
@Inject
UserRepository userRepository;
@Override
public void filter(ContainerRequestContext requestContext) throws IOException {
// Some business logic here...
UserEntity userEntity = userRepository.findByName(name);
// Some business logic here...
}
}
The repository:
@ApplicationScoped
public class UserRepository implements PanacheMongoRepository<UserEntity> {
public UserEntity findByName(String name) {
return find("some query...", name).firstResult();
}
}
When the repository is called, I get the following exception:
org.jboss.resteasy.spi.UnhandledException: java.lang.IllegalStateException: This method is normally automatically overridden in subclasses...
java.lang.IllegalStateException: This method is normally automatically overridden in subclasses
at io.quarkus.mongodb.panache.common.runtime.MongoOperations.implementationInjectionMissing(MongoOperations.java:765)
at io.quarkus.mongodb.panache.PanacheMongoRepositoryBase.find(PanacheMongoRepositoryBase.java:119)
The processor:
class AuthorizeProcessor {
private static final String FEATURE = "authorize";
@BuildStep
FeatureBuildItem feature() {
return new FeatureBuildItem(FEATURE);
}
@BuildStep(onlyIf = IsAuthorizeEnabled.class)
void registerAuthorizeFilter(
BuildProducer<AdditionalBeanBuildItem> additionalBeanProducer,
BuildProducer<ResteasyJaxrsProviderBuildItem> resteasyJaxrsProviderProducer
) {
additionalBeanProducer.produce(new AdditionalBeanBuildItem(UserRepository.class));
additionalBeanProducer.produce(new AdditionalBeanBuildItem(AuthorizationFilter.class));
resteasyJaxrsProviderProducer.produce(new ResteasyJaxrsProviderBuildItem(AuthorizationFilter.class.getName()));
}
}
Any idea?
Thanks for your help :)
MongoDB with Panache (and likewise Hibernate with Panache) uses bytecode enhancement at build time. When this enhancement doesn't occur, it leads to the exception you mentioned at runtime: java.lang.IllegalStateException: This method is normally automatically overridden in subclasses.
This can only happen when the repository or entity is not in the Jandex index. Jandex is used to index all the code of your application, to avoid using reflection and classpath scanning to discover classes. If your entity/repository is not in the index, it's not part of your application (we automatically index the application's classes), so it must be inside an external JAR.
Usually this is solved by adding the Jandex plugin to index the code of the external JAR (in fact there are multiple ways to do this; see How to Generate a Jandex Index).
An extension suffers from the same issue, as extensions are not indexed by default. But from an extension you can index the needed classes via a build step, which is easier and avoids polluting the index with classes that are not needed.
This can be done by producing a new AdditionalIndexedClassesBuildItem(UserRepository.class.getName()) inside a build step.
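Based on that suggestion, the build step might look roughly like this (the method name is illustrative, not from the answer):

```java
@BuildStep(onlyIf = IsAuthorizeEnabled.class)
void indexRepositoryClasses(BuildProducer<AdditionalIndexedClassesBuildItem> additionalIndexedClasses) {
    // Add the repository to the Jandex index so Panache's build-time
    // bytecode enhancement can find and enhance it.
    additionalIndexedClasses.produce(
            new AdditionalIndexedClassesBuildItem(UserRepository.class.getName()));
}
```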

Spring Data Mongo projection ignores and overrides the values on save

Let me explain my problem with Spring Data Mongo. I have the following interface, in which I declared a custom query with a projection that ignores the index field. This example is only for illustration; in real life I will ignore a bunch of fields.
public interface MyDomainRepo extends MongoRepository<MyDomain, String> {
@Query(fields="{ index: 0 }")
MyDomain findByCode(String code);
}
In my MongoDB instance, the MyDomain document has the following info: MyDomain(code="mycode", info=null, index=19). So when I use findByCode from MyDomainRepo, I get MyDomain(code="mycode", info=null, index=null). So far so good, because this is the expected behaviour. The problem happens when I decide to save the findByCode return value.
For instance, in the following example, I took the findByCode return value and set the info property to myinfo, giving the object below.
MyDomain(code="mycode", info="myinfo", index=null)
So I used save from MyDomainRepo. The index was ignored by the projection as expected, but when I save the object back, with or without an update, Spring Data Mongo overrides the index property with null, and consequently my record in the MongoDB instance is overwritten too. The following is my MongoDB JSON:
{
"_id": "5f061f9011b7cb497d4d2708",
"info": "myinfo",
"_class": "io.springmongo.models.MyDomain"
}
Is there a way to tell Spring Data Mongo to simply ignore the null fields on saving?
Save is a replace operation, and you won't be able to tell it to patch only some fields; it will replace the document with whatever you send.
Your option is to use the extension mechanism provided by Spring Data repositories to define custom repository methods:
public interface MyDomainRepositoryCustom {
void updateNonNull(MyDomain myDomain);
}
public class MyDomainRepositoryImpl implements MyDomainRepositoryCustom {
private final MongoTemplate mongoTemplate;
@Autowired
public MyDomainRepositoryImpl(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
@Override
public void updateNonNull(MyDomain myDomain) {
// Populate only the fields you want to patch
Update update = Update.update("key1", "value1")
.set("key2", "value2");
// You can use Update.fromDocument(Document object, String... exclude) to
// create the update as well, but then you need to make use of MongoConverter
// to convert your domain object to a Document.
// Build a query matching the id (getCode() is assumed to be the id accessor)
Query queryToMatchId = Query.query(Criteria.where("_id").is(myDomain.getCode()));
mongoTemplate.updateFirst(queryToMatchId, update, MyDomain.class);
}
}
public interface MyDomainRepository extends MongoRepository<..., ...>,
MyDomainRepositoryCustom {
}

Spring's Couchbase JPA repository with abstract class fails to find entity

We are developing a project in Spring Boot that uses Couchbase, and I have the following classes:
public abstract class Content {
...
}
public class Film extends Content {
...
}
public class Serie extends Content {
...
}
Then I have the following JPA repository:
public interface ContentJpaRepository extends ReactiveCouchbaseSortingRepository<Content> {
}
Then, when I save a content (film or serie), the content is successfully saved; however, the _class field gets the simple class name (instead of the fully qualified class name).
Then, when doing:
repository.findById(id);
The repository fails, as it can't deserialize the JSON document into the expected entity. How could I achieve that?
Thank you very much
Using a generic repository is currently not supported by Spring Data Couchbase, as the _class attribute will refer to the abstract class instead of its implementations.
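One workaround consistent with that limitation, though not spelled out in the answer, is to declare a repository per concrete subtype; a sketch (the String ID type parameter is an assumption, since the question's repository shows only one type parameter):

```java
public interface FilmRepository extends ReactiveCouchbaseSortingRepository<Film, String> {
}

public interface SerieRepository extends ReactiveCouchbaseSortingRepository<Serie, String> {
}
```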

How to change Hateoas output format in spring application?

I am currently working on a Spring application that offers a REST interface with which CRUD operations on entities of various kinds can be performed. These entities are stored in repositories, and thus a major part of the REST interface is automatically generated by Spring. When I execute a GET request on such an entity type (e.g. /devices), the result looks as follows:
{
"_embedded":{
"devices":[
{
"macAddress": "...",
"ipAddress": "...",
"name": "Device_1",
"id":"5c866db2f8ea1203bc3518e8",
"_links":{
"self":{
...
},
"device":{
...
}
}
}, ...
]
},
"_links":{
...
},
"page":{
"size":20,
"totalElements":11,
"totalPages":1,
"number":0
}
}
Now I need to implement a similar interface manually, because additional checks are required. I have made use of the spring-hateoas features for this purpose. However, I am unable to achieve the same output structure as the one automatically generated by Spring. The corresponding code in my controller class (annotated with @RestController) looks as follows:
@GetMapping("/devices")
public Resources<Device> getDevices() {
List<Device> deviceList = getDeviceListFromRepository();
Link selfRelLink = ControllerLinkBuilder.linkTo(
ControllerLinkBuilder.methodOn(RestDeviceController.class)
.getDevices())
.withSelfRel();
Resources<Device> resources = new Resources<>(deviceList);
resources.add(selfRelLink);
return resources;
}
The configuration (excerpt) looks as follows:
@Configuration
@EnableWebMvc
@EnableSpringDataWebSupport
@EnableHypermediaSupport(type = EnableHypermediaSupport.HypermediaType.HAL)
public class WebServletConfiguration extends WebMvcConfigurerAdapter implements ApplicationContextAware {
...
@Override
public void configureContentNegotiation(ContentNegotiationConfigurer c) {
c.defaultContentType(MediaTypes.HAL_JSON);
}
...
}
However, this is the output of a request:
{
"links":[
{
"rel":"self",
"href":"..."
}
],
"content":[
{
"id":"5c866db2f8ea1203bc3518e8",
"name":"Device_1",
"macAddress": "...",
"ipAddress":"..."
}
]
}
As you can see, instead of an _embedded key there is a content key, and the links key is missing the leading underscore. These are the main issues I have with this output; more detailed differences compared to the output above are not that important to me. I would like to unify the output generated by my application, but I am unable to achieve the output format of the mappings that are automatically generated by Spring. I also tried to wrap the resources object into another resource object (like return new Resource<...>(resources)), but this did not work either.
Do you have any hints for me about what I am doing wrong here? I am quite new to Spring & Co, so please tell me if you need more information about a certain thing. Any help is highly appreciated. Thanks in advance!
Finally I was able to find a solution: the strange output format shown in the question was generated because of the Accept header application/json sent by the client. After adding
@Override
public void configureContentNegotiation(ContentNegotiationConfigurer configurer) {
configurer.ignoreAcceptHeader(true);
configurer.defaultContentType(MediaTypes.HAL_JSON);
}
to the class WebServletConfiguration, which extends WebMvcConfigurerAdapter, everything works as expected and the output format is now HAL-like. A quite easy fix, but it took me weeks to figure this out. Maybe this answer will help somebody else in the future.

Why is this method in a Spring Data repository considered a query method?

We have implemented an application that should be able to use either JPA, Couchbase or MongoDB (for now; this may increase in the future). We successfully implemented JPA and Couchbase by separating the repositories for each, e.g. the JPA ones come from org.company.repository.jpa while the Couchbase ones come from org.company.repository.cb. All repository interfaces extend a common repository found in org.company.repository. We are now targeting MongoDB by creating a new package, org.company.repository.mongo. However, we are encountering this error:
No property updateLastUsedDate found for type TokenHistory!
Here are our codes:
@Document
public class TokenHistory extends BaseEntity {
private String subject;
private Date lastUpdate;
// Getters and setters here...
}
Under org.company.repository.TokenHistoryRepository.java
@NoRepositoryBean
public interface TokenHistoryRepository<ID extends Serializable> extends TokenHistoryRepositoryCustom, BaseEntityRepository<TokenHistory, ID> {
// No problem here. Handled by Spring Data
TokenHistory findBySubject(@Param("subject") String subject);
}
// The custom method
interface TokenHistoryRepositoryCustom {
void updateLastUsedDate(@Param("subject") String subject);
}
Under org.company.repository.mongo.TokenHistoryMongoRepository.java
@RepositoryRestResource(path = "/token-history")
public interface TokenHistoryMongoRepository extends TokenHistoryRepository<String> {
TokenHistory findBySubject(@Param("subject") String subject);
}
class TokenHistoryMongoRepositoryCustomImpl {
public void updateLastUsedDate(String subject) {
//TODO implement this
}
}
And for Mongo Configuration
@Configuration
@Profile("mongo")
@EnableMongoRepositories(basePackages = {
"org.company.repository.mongo"
}, repositoryImplementationPostfix = "CustomImpl",
repositoryBaseClass = BaseEntityRepositoryMongoImpl.class
)
public class MongoConfig {
}
The setup is the same for both JPA and Couchbase, but we didn't encounter that error with them. They were able to use the inner class with the "CustomImpl" postfix, which should be the case based on the documentation.
Is there a problem in my setup or configuration for MongoDB?
Your TokenHistoryMongoRepositoryCustomImpl doesn't actually implement the TokenHistoryRepositoryCustom interface, which means there's no way for Spring Data to find out that updateLastUsedDate(…) in that class is intended as an implementation of the interface method. Hence, it's considered a query method, which then triggers query derivation.
I highly doubt that this works for the other stores as claimed, since the code inspecting query methods is shared in DefaultRepositoryInformation.
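In other words, the fix implied by the answer is to make the implementation class declare the custom interface, so Spring Data matches the method instead of deriving a query; a minimal sketch:

```java
class TokenHistoryMongoRepositoryCustomImpl implements TokenHistoryRepositoryCustom {
    // Declaring the interface lets Spring Data recognize this method as the
    // custom implementation rather than attempting query derivation on it.
    @Override
    public void updateLastUsedDate(String subject) {
        // TODO: implement the actual update
    }
}
```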
