Spring deserializes YAML as LinkedHashMap instead of ArrayList - spring

I observe a difference between the Spring deserialization and a custom deserialization using a simple ObjectMapper.
application.yml:
props:
  array:
    - name: foo1
      bar: bar1
    - name: foo2
      bar: bar2
In the YAML above, when we deserialize "props" as a generic Map (and not a user-defined object), we observe that:
props becomes an instance of LinkedHashMap
array becomes an instance of LinkedHashMap, which is incorrect
If we read the properties above with a new ObjectMapper(new YAMLFactory()), then we have the expected behaviour, where array is an instance of ArrayList.
We also get the expected result if we map the properties to a real object, with a List as the target for the array property.
I fail to understand the Spring deserialization mechanism; it looks like Spring does not use the ObjectMapper bean to deserialize (overriding the bean changes nothing).
Which class should I use/override to change the deserialization?
I looked at Spring core converters, without success...
BTW, in my mind this looks like a Spring bug. (Spring 5.3.21, Java 17)
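For reference, the plain-ObjectMapper side of that comparison might look roughly like this (a sketch; the file location and class name are assumptions):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;

import java.io.File;
import java.util.Map;

public class YamlReadDemo {
    @SuppressWarnings("unchecked")
    public static void main(String[] args) throws Exception {
        // Plain Jackson read of the YAML above: "array" comes back as an ArrayList.
        ObjectMapper yamlMapper = new ObjectMapper(new YAMLFactory());
        Map<String, Object> root = yamlMapper.readValue(
                new File("src/main/resources/application.yml"), Map.class);
        Map<String, Object> props = (Map<String, Object>) root.get("props");
        System.out.println(props.get("array").getClass()); // java.util.ArrayList
    }
}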

I don't know a good solution that overrides Spring's deserialization.
But I created a setter method in my @ConfigurationProperties class for the generic Map props.
In this method I manually convert the relevant maps to lists when all the keys in the map form the sequence "0", "1", "2", ...
And don't forget to use recursion if you can have nested complex structures.
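A rough sketch of that approach (assuming, as described above, that Spring binds each YAML sequence into a map keyed "0", "1", "2", ...; the class and property names here are made up):

import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

import org.springframework.boot.context.properties.ConfigurationProperties;

@ConfigurationProperties
public class PropsHolder {

    private Map<String, Object> props = new LinkedHashMap<>();

    @SuppressWarnings("unchecked")
    public void setProps(Map<String, Object> props) {
        // Rebuild lists by hand before storing the bound value.
        this.props = (Map<String, Object>) normalize(props);
    }

    public Map<String, Object> getProps() {
        return props;
    }

    // Recursively turn maps whose keys are "0", "1", "2", ... back into lists.
    private static Object normalize(Object value) {
        if (!(value instanceof Map)) {
            return value;
        }
        Map<String, Object> copy = new LinkedHashMap<>();
        for (Map.Entry<?, ?> entry : ((Map<?, ?>) value).entrySet()) {
            copy.put(String.valueOf(entry.getKey()), normalize(entry.getValue()));
        }
        if (isIndexSequence(copy)) {
            List<Object> list = new ArrayList<>();
            for (int i = 0; i < copy.size(); i++) {
                list.add(copy.get(String.valueOf(i)));
            }
            return list;
        }
        return copy;
    }

    private static boolean isIndexSequence(Map<String, Object> map) {
        if (map.isEmpty()) {
            return false;
        }
        for (int i = 0; i < map.size(); i++) {
            if (!map.containsKey(String.valueOf(i))) {
                return false;
            }
        }
        return true;
    }
}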

Related

Spring boot yaml property binding: collection types

I find Spring Boot's (or Spring in general) handling of YAML collections to be a bit peculiar. According to the YAML spec, collections should be written in .yaml files as:
myCollection: ['foo', 'bar']
or
myCollection:
- foo
- bar
But neither the @Value("${myCollection}") annotation nor Environment.getProperty("myCollection", String[].class) (also tried List.class) can read collection properties (both return null). The only method I know of that works is the @ConfigurationProperties annotation described in the Spring Boot docs.
The problem with the @ConfigurationProperties annotation is that (a) it is too verbose if all I want is a single property and (b) it relies on bean injection to get an instance of the @ConfigurationProperties class. Under some circumstances, bean injection is not available and all we have is a reference to the Environment (e.g. through the ApplicationContext).
In my particular case, I want to read some properties during the ApplicationEnvironmentPreparedEvent. Since it happens before the context is built, the listener has to be registered manually, and therefore there is no bean injection. Via the event argument, I can get a reference to the Environment, so I can read other properties but cannot read collections.
A couple of "solutions" I noted (quoted because I don't find them very satisfactory):
Specify collections in the .yaml file as myCollection: foo, bar. But this is not ideal because the format isn't really YAML anymore.
Read individual elements using an index, for example Environment.getProperty("myCollection[0]", String.class). This requires some not-so-elegant utility methods to read all the elements and put them into a List, as sketched below.
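Such a utility over the Environment reference would look roughly like this (a sketch; the helper class name is made up):

import java.util.ArrayList;
import java.util.List;

import org.springframework.core.env.Environment;

public final class EnvironmentLists {

    private EnvironmentLists() {
    }

    // Reads name[0], name[1], ... until the first missing index.
    public static List<String> getStringList(Environment environment, String name) {
        List<String> values = new ArrayList<>();
        for (int i = 0; ; i++) {
            String value = environment.getProperty(name + "[" + i + "]");
            if (value == null) {
                return values;
            }
            values.add(value);
        }
    }
}

Usage: List<String> myCollection = EnvironmentLists.getStringList(environment, "myCollection");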
So, my question is: what is a good way to read collection-type properties if I cannot use @ConfigurationProperties? I am also curious why the comma-separated format works but YAML-style collections do not.
EDIT: corrected some typos
Quite frankly, Spring Boot's application.properties and application.yaml/application.yml are meant for loading configuration properties.
The @ConfigurationProperties annotation is designed as an abstraction that hides the implementation of configuration properties and supports both .properties and .yaml/.yml.
For yaml/yml, however, Spring uses the org.yaml.snakeyaml.Yaml library underneath to parse the file and load it into a Properties object inside org.springframework.boot.env.YamlPropertySourceLoader, and a collection is mapped as a Set, not an array or List. So try the following:
Environment.getProperty("myCollection", Set.class)

Assign ArrayList from the data in properties file

This is my properties file:
REDCA_IF_00001=com.sds.redca.biz.svc.RedCAIF00001SVC
REDCA_IF_00002=com.sds.redca.biz.svc.RedCAIF00002SVC
REDCA_IF_00003=com.sds.redca.biz.svc.RedCAIF00003SVC
REDCA_IF_00004=com.sds.redca.biz.svc.RedCAIF00004SVC
and I want to load these values into a HashMap in my Spring context file.
How can I achieve this?
Does it have to be a HashMap, or would any kind of Map be fine?
Because you can define that as a java.util.Properties instance (Spring has great support for properties loading), which already implements Map (it actually extends from Hashtable).
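For example, something along these lines (a Java-config sketch rather than XML; the file name and bean name are assumptions, and the XML equivalent would be Spring's util:properties element):

import java.io.IOException;
import java.util.Properties;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.core.io.support.PropertiesLoaderUtils;

@Configuration
public class RedcaConfig {

    // Loads the properties file into a java.util.Properties bean,
    // which can already be used as a Map (it extends Hashtable).
    @Bean
    public Properties redcaServices() throws IOException {
        return PropertiesLoaderUtils.loadProperties(new ClassPathResource("redca.properties"));
    }
}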

How can I use a CustomConverter with Dozer to convert multiple entities into one entity with a list field?

I have a list of entities which in turn have a field of another (Embeddable) type.
All these entities shall be converted into a single bean which holds a list of these embeddable types.
Prior to using Dozer I have written a conversion method. I have put this into the dozerBeanMapping.xml:
<custom-converters>
  <converter type="com.foo.bar.helper.ChargingPoiEntityToPoiConverter">
    <class-a>com.foo.bar.services.charging.repository.ChargingPoiEntity</class-a>
    <class-b>com.foo.bar.beans.ChargingPoi</class-b>
  </converter>
</custom-converters>
I instantiate Dozer this way:
final Mapper mapper = DozerBeanMapperSingletonWrapper.getInstance();
Which map method do I have to invoke?
Using
mapper.map(cpEntities, Cp.class);
my custom converter is not invoked.
Trying to invoke
mapper.map(cpEntities.get(0), Cp.class);
works well, but I have to convert a List<ChargingPoiEntity> instead of a single ChargingPoiEntity.
How can I achieve this?
mapper.map(cpEntities, Cp.class); is not matching your custom converter because the generic type information in List<ChargingPoiEntity> is lost. Dozer sees the class of cpEntities as java.util.ArrayList, which does not match com.foo.bar.services.charging.repository.ChargingPoiEntity. My understanding is that this is a limitation of Java generics, not an issue in Dozer.
One workaround is to define a custom converter between a ChargingPoiEntity array and a ChargingPoi:
<custom-converters>
  <converter type="com.foo.bar.helper.ChargingPoiEntityToPoiConverter">
    <class-a>[Lcom.foo.bar.services.charging.repository.ChargingPoiEntity;</class-a>
    <class-b>com.foo.bar.beans.ChargingPoi</class-b>
  </converter>
</custom-converters>
When mapping, you can convert the cpEntities list to an array:
ChargingPoiEntity[] entityArray = cpEntities.toArray(
    new ChargingPoiEntity[cpEntities.size()]);
ChargingPoi convertedList = mapper.map(entityArray, ChargingPoi.class);
Note that in this case, the custom converter will not be invoked when you do
mapper.map(cpEntities.get(0), ChargingPoi.class);
This problem should only apply when attempting to map generic collections directly via mapper.map(...); entities containing generic collections as fields should map fine.
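For completeness, the converter registered above then has to handle the array form; roughly like this (a sketch only; the accessor names getEmbeddedPoint and setPoints are invented, since the real field names are not shown in the question):

import java.util.ArrayList;
import java.util.List;

import org.dozer.CustomConverter;
import org.dozer.MappingException;

import com.foo.bar.beans.ChargingPoi;
import com.foo.bar.services.charging.repository.ChargingPoiEntity;

public class ChargingPoiEntityToPoiConverter implements CustomConverter {

    @Override
    public Object convert(Object destination, Object source,
                          Class<?> destinationClass, Class<?> sourceClass) {
        if (source == null) {
            return null;
        }
        if (source instanceof ChargingPoiEntity[]) {
            // <class-a> is the array type, so Dozer passes the whole array in here.
            ChargingPoi poi = new ChargingPoi();
            List<Object> embeddables = new ArrayList<>();
            for (ChargingPoiEntity entity : (ChargingPoiEntity[]) source) {
                embeddables.add(entity.getEmbeddedPoint()); // invented accessor for the embeddable field
            }
            poi.setPoints(embeddables); // invented setter for the list field
            return poi;
        }
        throw new MappingException("Unsupported source type: " + source.getClass());
    }
}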

Spring's MappingMongoConverter documentation

Can anybody explain how the MappingMongoConverter (Spring's default implementation of the MongoConverter interface) works for the cases where the mapping between POJO and Document isn't so trivial? Example cases: a POJO has an additional field that can't be found in the Document, the Document has a structure that doesn't fit the POJO, ...
The official Spring documentation seems to lack this information.
Example code:
while (cursor.hasNext()) {
    DBObject obj = cursor.next();
    Foo foo = mongoTemplate.getConverter().read(Foo.class, obj);
    returnList.add(foo);
}
The documentation is lacking, so I had to dive into the source. I'll share my findings. The tricky part is the BSON-to-POJO conversion:
The first thing it does is look for a @PersistenceConstructor annotation on a constructor. If no preferred constructor is set, the no-arg constructor is used. The mapping for the no-arg constructor is simple enough. For the mapping of the preferred constructor, all parameters have to be present in the BSON. If a parameter cannot be found, a MappingException is thrown. This means the BSON document can contain extra fields that don't map to any constructor parameter; those fields are simply ignored.
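As an illustration of that constructor selection (a hypothetical document class; the field names are made up):

import org.springframework.data.annotation.PersistenceConstructor;

public class Foo {

    private final String name;
    private final int count;

    // Marked as the preferred constructor: per the description above, every parameter
    // has to be resolvable from the BSON document, otherwise a MappingException is
    // thrown, while extra document fields that match no parameter are simply ignored.
    @PersistenceConstructor
    public Foo(String name, int count) {
        this.name = name;
        this.count = count;
    }

    public String getName() {
        return name;
    }

    public int getCount() {
        return count;
    }
}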

How to get class name of String object?

I set a bean's property to a String object; then when I try to get the class name of the property, the error below is thrown:
Expected hash. plist[0].javaType evaluated instead to freemarker.template.SimpleScalar on line 7, column 26 in ibatis/macro.ftl.
The template code is as below:
<#assign clsName=plist[0].javaType.class.name>
When the property javaType is set to a Java bean, the class name can be retrieved properly. Why is that? I need the property to accept any type, Java bean or not.
The root of the issue here is that FreeMarker doesn't work with Java values/objects directly. The template language has its own simple type system, and stuff coming from outside is mapped to it through a technique called object wrapping. (Values that don't come from outside don't even have a wrapped Java object inside.) That you were still able to get the class of some object is purely accidental... What happens is that the object-wrapping machinery decides that the object should be mapped to the "hash" FreeMarker type, and the hash items will correspond to the JavaBean properties of the object. The object has a getClass() method, which is (mistakenly) seen as the getter of the "class" property.
So there's no universal way of getting the class, among other reasons because sometimes there's no class to get. You could write a TemplateMethodModelEx that makes a good-enough effort to do so.
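A sketch of what such a TemplateMethodModelEx might look like (the class name and the fallback value are made up):

import java.util.List;

import freemarker.ext.util.WrapperTemplateModel;
import freemarker.template.AdapterTemplateModel;
import freemarker.template.TemplateMethodModelEx;
import freemarker.template.TemplateModelException;

public class ClassNameMethod implements TemplateMethodModelEx {

    @Override
    public Object exec(List arguments) throws TemplateModelException {
        if (arguments.size() != 1) {
            throw new TemplateModelException("Expected exactly one argument");
        }
        Object arg = arguments.get(0);
        // Try to recover the original Java object behind the template model, if any.
        if (arg instanceof AdapterTemplateModel) {
            Object adapted = ((AdapterTemplateModel) arg).getAdaptedObject(Object.class);
            return adapted != null ? adapted.getClass().getName() : "null";
        }
        if (arg instanceof WrapperTemplateModel) {
            Object wrapped = ((WrapperTemplateModel) arg).getWrappedObject();
            return wrapped != null ? wrapped.getClass().getName() : "null";
        }
        // Value created inside the template: there is no Java class to report.
        return "unknown";
    }
}

It would be put into the data model (for example root.put("classNameOf", new ClassNameMethod())) and called from the template as ${classNameOf(plist[0].javaType)}; both names here are made up.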
