Can anybody explain how the MappingMongoConverter (Spring's default implementation of the MongoConverter interface) works for cases where the mapping between POJO and Document isn't trivial? Example cases: a POJO has an additional field that can't be found in the Document, or the Document has a structure that doesn't fit the POJO, and so on.
The official Spring documentation seems to lack this information.
Example code:
while (cursor.hasNext()) {
    DBObject obj = cursor.next();
    Foo foo = mongoTemplate.getConverter().read(Foo.class, obj);
    returnList.add(foo);
}
The documentation is lacking, so I had to dive into the source. I'll share my findings. The tricky part is the BSON-to-POJO conversion:
The first thing it does is look for a @PersistenceConstructor annotation on a constructor. If no preferred constructor is set, the no-arg constructor is used. Mapping via the no-arg constructor is simple enough. For mapping via the preferred constructor, all constructor parameters have to be present in the BSON; if a parameter cannot be found, a MappingException is thrown. The BSON document may, however, contain extra fields that don't map to any constructor parameter; those fields are simply ignored.
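To make that concrete, here is a minimal sketch of a POJO read via a preferred constructor (the field names are made up for illustration; only the annotation and the constructor shape matter):

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;

public class Foo {

    @Id
    private final String id;
    private final String someField;

    // Marked as the preferred constructor: both "id" and "someField" must be
    // present in the BSON document, otherwise a MappingException is thrown.
    // Extra fields in the document that match no parameter are ignored.
    @PersistenceConstructor
    public Foo(String id, String someField) {
        this.id = id;
        this.someField = someField;
    }

    public String getId() { return id; }
    public String getSomeField() { return someField; }
}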
This is the code that I have:
@Component
@Configuration
@PropertySource("application.properties")
public class Program {

    @Value("${app.title}")
    private String appTitle;

    public Program() {
        System.out.println(appTitle);
    }
}
The application.properties has
app.title=The Program
The output is null instead of The Program.
So, what am I missing? I have tried several examples; none worked.
Since appTitle is an autowired field, it is not set until after the object has been constructed. This is why the value is still null in your example. The bean construction process in this scenario is as follows:
1. The Program constructor is called, creating a new Program instance.
2. The appTitle field is set on the newly constructed bean to the resolved value of ${app.title}.
The ideal fix for this depends on your goals. If you truly need the value within the constructor, you can pass it in as an autowired constructor parameter. The value will then be available within the constructor:
@Component
@Configuration
@PropertySource("application.properties")
public class Program {

    public Program(@Value("${app.title}") String appTitle) {
        System.out.println(appTitle);
    }
}
If you don't need it in the constructor itself, but need it for the proper initialization of the bean, you could alternatively use the @javax.annotation.PostConstruct annotation to make use of it after the object's construction but before it is made available for use elsewhere:
@Component
@Configuration
@PropertySource("application.properties")
public class Program {

    @Value("${app.title}")
    private String appTitle;

    @PostConstruct
    public void printAppTitle() {
        System.out.println(appTitle);
    }
}
Finally, if you don't need the value at construction time, but need it during the life of the bean, what you have will work; it just won't be available within the body of the constructor itself:
@Component
@Configuration
@PropertySource("application.properties")
public class Program {

    @Value("${app.title}")
    private String appTitle;
}
Nothing wrong, just don't do it in a constructor...
Other answers on this question are written assuming the goal is creating a Spring-managed bean that uses the given property in its creation. However, based on your comments in another answer, it looks like the question you want answered is how to access an externalized property (one provided by #Value) within a no-argument constructor. This is based on your expectation that a Java inversion of control (IoC) container such as Spring should allow accessing externalized properties (and presumably other dependencies) within a no-argument constructor. That being the case, this answer will address the specific question of accessing the property within a no-argument constructor.
While there are certainly ways this goal could be achieved, none of them would be idiomatic usage of the Spring framework. As you discovered, autowired fields (i.e. fields initialized using setter injection) cannot be accessed within the constructor.
There are two parts to explaining why this is. First, why does it work the way it does, programmatically? Second, why was it designed the way it was?
The setter-based dependency injection section of the Spring docs addresses the first question:
Setter-based DI is accomplished by the container calling setter methods on your beans after invoking a no-argument constructor or a no-argument static factory method to instantiate your bean.
In this case, it means that first the object is created using the no-argument constructor. Second, once the object is constructed, the appTitle is initialized on the constructed bean. Since the field isn't initialized until after the object is constructed, it will have its default value of null within the constructor.
The second question is why Spring is designed this way, rather than somehow having access to the property within the constructor. The constructor-based or setter-based DI? sidebar within the Spring documentation makes it clear that constructor arguments are in fact the idiomatic approach when dealing with mandatory dependencies in general.
Since you can mix constructor-based and setter-based DI, it is a good rule of thumb to use constructors for mandatory dependencies and setter methods or configuration methods for optional dependencies. [...]
The Spring team generally advocates constructor injection, as it lets you implement application components as immutable objects and ensures that required dependencies are not null. Furthermore, constructor-injected components are always returned to the client (calling) code in a fully initialized state. [...]
Setter injection should primarily only be used for optional dependencies that can be assigned reasonable default values within the class. [...]
A property needed to construct the object certainly would be categorized as a mandatory dependency. Therefore, idiomatic Spring usage would be to pass in this required value in the constructor.
So in summary, trying to access an application property within a no-argument constructor is not supported by the Spring framework, and in fact runs contrary to the recommended use of the framework.
I'm learning how to use MapStruct in a Spring Boot and Kotlin project.
I've got a generated DTO (ThessaurusDTO) that has a List<List<String>> and I need this mapped into a List<String> on my model (Vocab).
It makes sense that MapStruct can't map this automatically, but I know for a fact that the outer list will always be of size 1. I have no control over the API the DTO model belongs to.
I found in the documentation that I can define a default method implementation within the interface, which would loosely translate to a normal function in Kotlin
My mapper interface:
@Mapper
interface VocabMapper {

    @Mappings(
        // ...
    )
    fun thessaurusToVocab(thessaurusDTO: ThessaurusDTO): Vocab

    fun metaSyns(nestedList: List<List<String>>): List<String>
        = nestedList.flatten()
}
When I try to do a build I get the following error:
VocabMapper.java:16: error: Can't map collection element "java.util.List<java.lang.String>" to "java.lang.String ". Consider to declare/implement a mapping method: "java.lang.String map(java.util.List<java.lang.String> value)".
It looks like MapStruct is still trying to do the mapping automatically while ignoring my custom implementation. Am I missing something trivial here?
I found in the documentation that I can define a default method implementation within the interface, which would loosely translate to a normal function in Kotlin
From my understanding of what I found online, Kotlin does not properly translate an interface function into a default method in Java, but actually generates a class that implements the interface.
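To see why MapStruct misses the method, it helps to look at what the Kotlin compiler emits for an interface function with a body, sketched here in rough Java terms (simplified; the actual generated bytecode differs in details):

import java.util.ArrayList;
import java.util.List;

// Roughly what the Kotlin compiler emits without -Xjvm-default (simplified):
// the interface method stays abstract in the bytecode, and the body is moved
// to a synthetic holder class, so MapStruct sees no implementation to reuse.
interface VocabMapper {
    Vocab thessaurusToVocab(ThessaurusDTO thessaurusDTO);

    List<String> metaSyns(List<List<String>> nestedList); // still abstract!

    // Synthetic holder generated by the Kotlin compiler
    final class DefaultImpls {
        public static List<String> metaSyns(VocabMapper self, List<List<String>> nestedList) {
            List<String> flattened = new ArrayList<>();
            for (List<String> inner : nestedList) {
                flattened.addAll(inner);
            }
            return flattened;
        }
    }
}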
If that's the problem, you can annotate metaSyns with @JvmDefault:
Specifies that a JVM default method should be generated for non-abstract Kotlin interface member.
Usages of this annotation require an explicit compilation argument to be specified: either -Xjvm-default=enable or -Xjvm-default=compatibility.
See the link for the difference, but you probably need -Xjvm-default=enable.
I seem to have fixed this by relying on an abstract-class-based implementation instead of using an interface (see the sketch after the link below).
From my understanding of what I found online, Kotlin does not properly translate an interface function into a default method in Java, but actually generates a class that implements the interface.
https://github.com/mapstruct/mapstruct/issues/1577
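For reference, the abstract-class variant looks roughly like this; it is sketched in Java (the Kotlin equivalent uses an abstract class the same way), with the DTO and model types taken from the question:

import java.util.List;
import java.util.stream.Collectors;
import org.mapstruct.Mapper;

// Because the mapper is an abstract class, metaSyns() is compiled as a
// regular concrete method rather than a Kotlin interface function, so
// MapStruct finds it and uses it for the List<List<String>> -> List<String> step.
@Mapper
public abstract class VocabMapper {

    public abstract Vocab thessaurusToVocab(ThessaurusDTO thessaurusDTO);

    protected List<String> metaSyns(List<List<String>> nestedList) {
        return nestedList.stream()
                .flatMap(List::stream)
                .collect(Collectors.toList());
    }
}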
Is there an easy way to convert a JSON payload to a Java object using a custom ObjectMapper (Jackson), or do I have to provide a custom type converter? I know that I could use a processor, but somehow it would be nice to use the input and output types of the stream definition.
In the second case: Am I even able to provide a custom type converter for application/json to Java?
The documentation states: "The customMessageConverters are added after the standard converters in the order defined. So it is generally easier to add converters for new media types than to replace existing converters."
I bet that there is an existing "application/json" converter, but at first glance I could not find further information on whether it is even possible to replace existing converters.
Thanks!
Peter
If you look at streams.xml, you can see the relevant configuration. The configured lists are used to construct a CompositeMessageConverter, which visits every MessageConverter in list order until it finds one that can do the conversion and returns a non-null result. A CompositeConverter instance is created for each module instance that is configured for conversion (i.e., defines an inputType or outputType value) by filtering the list of candidate message converters, which all inherit AbstractFromMessageConverter. The list is pared down to those that respond true to public boolean supportsTargetMimeType(MimeType mimeType) (where mimeType is the value of the input/outputType). The CompositeMessageConverter is injected into the corresponding MessageChannel and converts the payload.
There are a couple of things you can do. You can override the xd.messageConverters bean definition. For example, you can replace JsonToPojoMessageConverter and PojoToJsonMessageConverter with your own subclasses. You can also insert your own implementations in the list before the above converters and have your implementation match only specific domain objects for which you need a custom JSON mapper.
Another possibility is to define your own mime type and provide converters for that mime type as customMessageConverters. In any case, follow these guidelines for extending Spring XD.
I use Jersey and have a method in my Resource class which has multiple parameters. These parameters are filled using @FormParam, but the problem is that the types of the parameters are custom Java types, not primitives or String. I want to convert the values of the parameters from JSON to custom Java types. If I use @Consumes(MediaType.APPLICATION_JSON), then I cannot use multiple parameters, and if I remove it, the parameters cannot be converted from JSON to their Java instances.
#POST #Path("/add")
#Consumes(MediaType.APPLICATION_FORM_URLENCODED)
#Produces(MediaType.APPLICATION_JSON)
public String add(#FormParam("source") BookEntity source, #FormParam("author") AuthorEntity a) throws JsonGenerationException, JsonMappingException, IOException, TransformationException
{
...
}
If I change the parameter types to String and then use Jackson deserialization, I can deserialize the JSON parameters to Java instances, but I want to do this for other methods too and have it done automatically.
I tried to use the approach used in Custom Java type for consuming request parameters but I cannot make it work.
You can use a custom type mapper.
See this answer
Anyway, by default Jersey tries to map the received JSON object representation using JAXB. Obviously you must annotate your objects.
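If you are on JAX-RS 2.x, one generic way to plug in Jackson for form parameters is a ParamConverterProvider. Below is a rough sketch, assuming the BookEntity and AuthorEntity types from the question and a shared ObjectMapper; it is one possible approach, not the only one:

import java.io.IOException;
import java.lang.annotation.Annotation;
import java.lang.reflect.Type;
import javax.ws.rs.ext.ParamConverter;
import javax.ws.rs.ext.ParamConverterProvider;
import javax.ws.rs.ext.Provider;
import com.fasterxml.jackson.databind.ObjectMapper;

@Provider
public class JsonParamConverterProvider implements ParamConverterProvider {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public <T> ParamConverter<T> getConverter(Class<T> rawType, Type genericType, Annotation[] annotations) {
        // Only handle our own entity types; leave Strings and primitives to Jersey.
        if (!BookEntity.class.isAssignableFrom(rawType) && !AuthorEntity.class.isAssignableFrom(rawType)) {
            return null;
        }
        return new ParamConverter<T>() {
            @Override
            public T fromString(String value) {
                try {
                    // Each form parameter value is expected to be a JSON string.
                    return mapper.readValue(value, rawType);
                } catch (IOException e) {
                    throw new IllegalArgumentException("Cannot parse JSON parameter", e);
                }
            }

            @Override
            public String toString(T value) {
                try {
                    return mapper.writeValueAsString(value);
                } catch (IOException e) {
                    throw new IllegalArgumentException("Cannot serialize parameter to JSON", e);
                }
            }
        };
    }
}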
I just started developing some app in Java with spring-data-mongodb and came across some issue that I haven't been able to solve:
I have a couple of document beans like this:
@Document(collection="myBeanBar")
public class BarImpl implements Bar {
    String id;
    Foo foo;
    // More fields and methods ...
}
@Document
public class FooImpl implements Foo {
    String id;
    String someField;
    // some more fields and methods ...
}
And I have a repository class with a method that simply invokes a find similar to this:
public List<? extends Bar> findByFooField(final String fieldValue) {
    Query query = Query.query(Criteria.where("foo.someField").is(fieldValue));
    return getMongoOperations().find(query, BarImpl.class);
}
Saving a Bar works just fine, it would save it in mongo along with the "_class" attribute for both Foo and Bar. However, finding by some attribute in Foo would throw an exception like this:
Exception in thread "main" java.lang.IllegalArgumentException: No property someField found on test.Foo!
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentPropertyPath(AbstractMappingContext.java:225)
at org.springframework.data.mongodb.core.convert.QueryMapper.getPath(QueryMapper.java:202)
at org.springframework.data.mongodb.core.convert.QueryMapper.getTargetProperty(QueryMapper.java:190)
at org.springframework.data.mongodb.core.convert.QueryMapper.getMappedObject(QueryMapper.java:86)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1336)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1322)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:495)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:486)
Which, after some digging, makes some sense, since nowhere in the query is the sub-document's concrete type specified, and the entity information of Bar says the type of foo is Foo (not FooImpl), which in turn cannot have properties because it is an interface.
My question is: Is there a way to specify it or work-around this issue without declaring the sub-document type as a concrete type?
I've been googling it for a couple of days and looking at the documentation, the API, and the source code, but I cannot find a clear way to do it. I'd really appreciate your help.
Thank you very much.
I had a similar problem: I have a class that implements an interface, and when I use findAll I get the error:
org.springframework.data.mapping.model.MappingInstantiationException: Could not instantiate bean class [test.MetaClasse]: Specified class is an interface.
After debugging the Spring Data code, I realized that the mapper uses @TypeAlias to discover the type it has to instantiate, so I just put @TypeAlias("FullClassName") on my implementations of test.MetaClasse and it worked!
I tested with your situation and it will work!
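Applied to the classes from this question, that would look something like the following sketch (using the fully qualified class name as the alias, as described above):

import org.springframework.data.annotation.TypeAlias;
import org.springframework.data.mongodb.core.mapping.Document;

// The alias is written to the "_class" field and tells the mapper which
// concrete class to instantiate when the declared type is the Foo interface.
@Document
@TypeAlias("test.FooImpl")
public class FooImpl implements Foo {
    String id;
    String someField;
    // some more fields and methods ...
}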
As mentioned in this comment, the solution of putting the full class name in the type alias is imperfect, as it can make refactoring cumbersome.
Instead you can just configure type mappings and make it work automagically. Here's how:
First you'll need to annotate BarImpl and FooImpl with @TypeAlias. It doesn't have to be a full class name; it could be anything else, for example @TypeAlias("bar_impl") and @TypeAlias("foo_impl") respectively.
Then we’re going to need the reflections library. Pick the latest version for the build tool of your choice here.
For example with Gradle:
implementation("org.reflections:reflections:0.10.2")
Now we’re going to need a small extension to DefaultMongoTypeMapper to make it easy to configure and instantiate. Here’s how it would look in Kotlin:
class ReflectiveMongoTypeMapper(
    private val reflections: Reflections = Reflections("com.example")
) : DefaultMongoTypeMapper(
    DEFAULT_TYPE_KEY,
    listOf(
        ConfigurableTypeInformationMapper(
            // Build a map of annotated class -> alias value. Note: getAnnotation is
            // assumed here to be a static import of
            // org.springframework.core.annotation.AnnotationUtils.getAnnotation.
            reflections.getTypesAnnotatedWith(TypeAlias::class.java).associateWith { clazz ->
                getAnnotation(clazz, TypeAlias::class.java)!!.value
            }
        ),
        SimpleTypeInformationMapper(),
    )
)
where com.example is either your base package or the package with MongoDB models.
This way we will find all classes annotated with #TypeAlias and register alias to type mappings.
Next we'll need to adjust the app's mongo configuration a bit. The configuration has to extend AbstractMongoClientConfiguration and we need to override method mappingMongoConverter to make use of the mapper we created before. It should look like this:
override fun mappingMongoConverter(
databaseFactory: MongoDatabaseFactory,
customConversions: MongoCustomConversions,
mappingContext: MongoMappingContext,
) = super.mappingMongoConverter(databaseFactory, customConversions, mappingContext).apply {
setTypeMapper(ReflectiveMongoTypeMapper())
}
Done!
Now all alias-to-type mappings will be registered automatically on context startup, and all your polymorphic fields will work just fine.
You can check the full code example on GitHub.
Also, here's a blog post where you can read about the root cause of this issue as well as check other ways to solve it (in case you don't want to rely on reflection): https://blog.monosoul.dev/2022/09/16/spring-data-mongodb-polymorphic-fields/