Custom comparator on specific property - javers

Is it possible to configure a custom comparator on a named property of a class? I can configure a custom comparator on, for example, all BigDecimal instances, but I would like to use my comparator on one specific property only.
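For reference, the type-level registration mentioned above looks roughly like this (a sketch, assuming the JaversBuilder API; CustomBigDecimalComparator ships with Javers):

```java
import java.math.BigDecimal;
import org.javers.core.Javers;
import org.javers.core.JaversBuilder;
import org.javers.core.diff.custom.CustomBigDecimalComparator;

public class JaversSetup {
    public static Javers build() {
        // Type-level registration: this comparator is applied to EVERY
        // BigDecimal property, not to one named property - which is
        // exactly the limitation described in the question.
        return JaversBuilder.javers()
                .registerCustomComparator(new CustomBigDecimalComparator(2), BigDecimal.class)
                .build();
    }
}
```

One possible workaround, as with other type-driven frameworks, is to wrap the property in a small value class of its own and register the comparator for that wrapper type, so only that one property is affected.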

Spring Kafka - Override ConsumerFactory's Deserializer Class provided as an object

Is it possible to override the ConsumerFactory's configured deserializers if provided as an object instead of a class name in the properties? Maybe in a ContainerCustomizer?
No, it is not possible. The only way you can provide deserializer objects is via the factory constructors or setters before creating a consumer. You cannot override them at all, not even with properties, because they are passed directly to the KafkaConsumer and used instead of anything specified in the properties.
See ConsumerConfig.appendDeserializerToConfig().
You would need to use a different consumer factory if you want to override the defaults with objects.
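A sketch of that approach, assuming Spring Kafka's DefaultKafkaConsumerFactory and a hypothetical MyValue payload type: build a second factory whose constructor receives the deserializer objects you want.

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;

public class OtherFactoryConfig {

    // A second factory with different deserializer objects; deserializers
    // passed to the constructor always win over any class names in the
    // property map (see ConsumerConfig.appendDeserializerToConfig()).
    public ConsumerFactory<String, MyValue> otherConsumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "other-group");
        return new DefaultKafkaConsumerFactory<>(props,
                new StringDeserializer(),
                new JsonDeserializer<>(MyValue.class));
    }

    public static class MyValue { }  // hypothetical payload type
}
```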

Can MongoTemplate provide automatic translation?

I have a simple persistent POJO like:
public class Persistent {
    private String unsafe;
}
I use Spring Data MongoTemplate to persist and fetch the above object. I also need to encrypt the Persistent.unsafe field and store a complex representation of it in the backend every time I save a Persistent object.
Can I annotate Persistent, or are there hooks where I can perform this translation without having to do it manually in the POJO code? It has to happen automatically during mongoTemplate.insert.
Spring Data currently only supports type-based conversions. There is an issue for supporting property-based conversion, which you might want to track.
Therefore, annotating won't work. What you could do is use a separate class for the property, which just wraps the String, and register a custom converter for that type. See http://docs.spring.io/spring-data/data-mongo/docs/1.10.4.RELEASE/reference/html/#mongo.custom-converters for details on how to do that.
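A sketch of that workaround, using hypothetical names (an EncryptedString wrapper and placeholder "encryption"); real code would plug in actual crypto and register both converters as custom conversions on the MappingMongoConverter:

```java
import org.springframework.core.convert.converter.Converter;

// Wrapper type: Spring Data dispatches converters by TYPE, so the
// property to encrypt gets its own type instead of plain String.
public class EncryptedString {
    private final String plain;
    public EncryptedString(String plain) { this.plain = plain; }
    public String getPlain() { return plain; }
}

class EncryptedStringWriteConverter implements Converter<EncryptedString, String> {
    @Override
    public String convert(EncryptedString source) {
        return "ENC(" + source.getPlain() + ")"; // placeholder for real encryption
    }
}

class EncryptedStringReadConverter implements Converter<String, EncryptedString> {
    @Override
    public EncryptedString convert(String source) {
        // placeholder for real decryption
        String plain = source.startsWith("ENC(")
                ? source.substring(4, source.length() - 1)
                : source;
        return new EncryptedString(plain);
    }
}
```

The POJO field then becomes `private EncryptedString unsafe;`, and the converters are registered as described in the linked reference documentation.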

Spring REST input validation

I'm writing a REST service using Spring Boot and JPA. I need to be able to validate some of the input fields and I want to ensure I'm using a proper pattern for doing so.
Let's assume I have the following model and I also have no control over the model:
{
  "company": "ACME",
  "record_id": "ACME-123",
  "pin": "12345",
  "company_name": "",
  "record_type": 0,
  "acl": ["View", "Modify"],
  "language": "E"
}
The things I need to do are:
1. Ensure the value is not empty. This seems simple enough using the @NotEmpty annotation, and I can pass a message.
2. Ensure the value is part of a valid list of values. The example here is the language property in the model above: I want the value to be E, F, or S. This seems possible using a custom annotation (e.g. @ValidValue({"E","F","S"})), but is there a better, more "Springy" way to do this?
3. Ensure the values in a list are part of a valid list of values. The example here is the acl property. Again, this seems possible with a custom annotation like @ValidListValues({"View","Modify","Delete","Hide"}), but same question as above.
4. Set a default value. From what I read, custom validator annotations can only validate, not modify. I would like something like @DefaultValue(value = 5) applied when the value is null. Is this possible? More on this below.
5. Set a default value to the return of a static method. For example, if the pin field in the model above isn't set, I want to set it to something like Util.getRandomDigitsAsString(5).
6. Use values from another property. I would like to validate that one property contains the string from another property. Using the example model, I want to ensure that record_id starts with company.
I have this set up in what I believe is a standard way, with controller -> service -> DTO -> DAO -> Model. Another option I was considering was creating a validateCreate() method in the service that would go through all of the items above and throw an exception if needed.
Thanks.
1. Yes, @NotEmpty is the right way.
2. You should define a Language enum; the language field of your POJO should be of type Language.
3. Same as 2: define an Acl enum.
4. Define that in your Java code. Initialize the field to 5 by default; if the JSON contains a value, Jackson will set the field to that value, otherwise it stays 5. Alternatively, initialize the field to null and add a method getValueOrDefault(int defaultValue) that returns the given default when the value is null.
5. Same as 4.
6. Define a custom validator that applies to the class itself rather than to a single property, and check in it that the two related values are consistent.
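The two default-value variants from point 4 can be sketched as a plain POJO (field names are illustrative; Jackson overwrites the initialized default only when the JSON supplies a value):

```java
public class RecordDto {

    // Variant 1: field initialized to a default; a value present in the
    // JSON simply overwrites it during deserialization.
    private int recordType = 5;

    // Variant 2: nullable field plus an accessor with a fallback.
    private Integer pinLength;

    public int getRecordType() { return recordType; }
    public void setRecordType(int recordType) { this.recordType = recordType; }

    public int getPinLengthOrDefault(int defaultValue) {
        return pinLength == null ? defaultValue : pinLength;
    }
    public void setPinLength(Integer pinLength) { this.pinLength = pinLength; }
}
```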
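The class-level constraint from point 6 can be sketched with the Bean Validation API; the annotation name and the RecordDto bean (with getCompany()/getRecordId() accessors) are hypothetical:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import javax.validation.Constraint;
import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;
import javax.validation.Payload;

// Class-level constraint: validates a relationship between two
// properties of the annotated bean.
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Constraint(validatedBy = RecordIdMatchesCompanyValidator.class)
@interface RecordIdMatchesCompany {
    String message() default "record_id must start with company";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};
}

class RecordIdMatchesCompanyValidator
        implements ConstraintValidator<RecordIdMatchesCompany, RecordDto> {

    @Override
    public void initialize(RecordIdMatchesCompany constraint) { }

    @Override
    public boolean isValid(RecordDto dto, ConstraintValidatorContext ctx) {
        if (dto == null || dto.getRecordId() == null || dto.getCompany() == null) {
            return true; // null/empty checks belong to @NotEmpty
        }
        return dto.getRecordId().startsWith(dto.getCompany());
    }
}
```

@RecordIdMatchesCompany then goes on the DTO class itself rather than on a field.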

How can I use a CustomConverter with Dozer to convert multiple entities into one entity with a list field?

I have a list of entities which in turn have a field of another (Embeddable) type.
All these entities shall be converted into a single bean which holds a list of these embeddable types.
Prior to using Dozer I have written a conversion method. I have put this into the dozerBeanMapping.xml:
<custom-converters>
<converter type="com.foo.bar.helper.ChargingPoiEntityToPoiConverter" >
<class-a>com.foo.bar.services.charging.repository.ChargingPoiEntity</class-a>
<class-b>com.foo.bar.beans.ChargingPoi</class-b>
</converter>
</custom-converters>
I instantiate Dozer this way:
final Mapper mapper = DozerBeanMapperSingletonWrapper.getInstance();
Which map method do I have to invoke?
Using
mapper.map(cpEntities, ChargingPoi.class);
my custom converter is not invoked. Invoking
mapper.map(cpEntities.get(0), ChargingPoi.class);
works, but then I have to convert a List<ChargingPoiEntity> instead of a single ChargingPoiEntity. How can I achieve this?
mapper.map(cpEntities, ChargingPoi.class); does not match your custom converter because the generic type information in List<ChargingPoiEntity> is lost at runtime. Dozer sees the class of cpEntities as java.util.ArrayList, which does not match com.foo.bar.services.charging.repository.ChargingPoiEntity. My understanding is that this is a limitation of Java generics (type erasure), not an issue in Dozer.
One workaround is to define a custom converter between a ChargingPoiEntity array and a ChargingPoi:
<custom-converters>
<converter type="com.foo.bar.helper.ChargingPoiEntityToPoiConverter" >
<class-a>[Lcom.foo.bar.services.charging.repository.ChargingPoiEntity;</class-a>
<class-b>com.foo.bar.beans.ChargingPoi</class-b>
</converter>
</custom-converters>
When mapping, you can convert the cpEntities list to an array:
ChargingPoiEntity[] entityArray = cpEntities.toArray(
new ChargingPoiEntity[cpEntities.size()]);
ChargingPoi convertedList = mapper.map(entityArray, ChargingPoi.class);
Note that in this case, the custom converter will not be invoked when you do
mapper.map(cpEntities.get(0), ChargingPoi.class);
This problem should only apply when attempting to map generic collections directly via mapper.map(...); entities containing generic collections as fields should map fine.
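The converter itself could be sketched like this, assuming Dozer's DozerConverter base class; the embedded type, its accessor, and setItems() are illustrative stand-ins for the real field names:

```java
import java.util.ArrayList;
import java.util.List;
import org.dozer.DozerConverter;

public class ChargingPoiEntityToPoiConverter
        extends DozerConverter<ChargingPoiEntity[], ChargingPoi> {

    public ChargingPoiEntityToPoiConverter() {
        super(ChargingPoiEntity[].class, ChargingPoi.class);
    }

    @Override
    public ChargingPoi convertTo(ChargingPoiEntity[] source, ChargingPoi destination) {
        ChargingPoi poi = destination != null ? destination : new ChargingPoi();
        List<EmbeddedType> items = new ArrayList<>();
        for (ChargingPoiEntity entity : source) {
            items.add(entity.getEmbedded()); // illustrative field access
        }
        poi.setItems(items);
        return poi;
    }

    @Override
    public ChargingPoiEntity[] convertFrom(ChargingPoi source, ChargingPoiEntity[] destination) {
        // only entity -> bean is needed here
        throw new UnsupportedOperationException("one-way mapping");
    }
}
```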

Implementing a custom Hadoop key type and Value type

I want to emit the key and value as custom data types. Should I implement two classes, a KeyWritable implementing WritableComparable for the key and a ValueWritable implementing Writable for the value? Or is a single WritableComparable enough for emitting a custom key and value?
If you want to use the same class for your key and value, you only need to write one custom class that implements the WritableComparable interface. A class that implements WritableComparable can be used for both the key and the value: the super-interfaces of WritableComparable are Writable and Comparable, so your new custom class will be both Writable and Comparable.
There is no need to write a separate KeyWritable if your key is not a custom object but only a text or string value.
In Hadoop, every data type used as a key must implement the Writable and Comparable interfaces, or more conveniently the WritableComparable interface, and every data type used as a value must implement the Writable interface.
If your custom key and value are of the same type, you can write one custom data type that implements WritableComparable. If your custom key type differs from your custom value type, you will have to write two separate custom data types, where the key class implements WritableComparable and the value class implements Writable. Note that developers usually tend to use the WritableComparable interface when writing custom data types, since such types can be used interchangeably as keys and values. HTH
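A minimal sketch of such a custom key type, assuming the classic org.apache.hadoop.io API (the fields are illustrative):

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.WritableComparable;

// Usable both as a key and as a value, since WritableComparable
// extends Writable and Comparable.
public class CompositeKeyWritable implements WritableComparable<CompositeKeyWritable> {

    private String name = "";
    private long timestamp;

    public CompositeKeyWritable() { }  // no-arg constructor required by Hadoop

    public CompositeKeyWritable(String name, long timestamp) {
        this.name = name;
        this.timestamp = timestamp;
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeUTF(name);
        out.writeLong(timestamp);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        name = in.readUTF();
        timestamp = in.readLong();
    }

    @Override
    public int compareTo(CompositeKeyWritable other) {
        int cmp = name.compareTo(other.name);
        return cmp != 0 ? cmp : Long.compare(timestamp, other.timestamp);
    }
}
```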
