ObjectMapper generic deserialization - Spring

I have a problem with ObjectMapper deserialization.
I use Hibernate + Spring Boot and need to deserialize a JSON array into an entity field.
This is my Hibernate AttributeConverter:
abstract class JsonConverter<T> : AttributeConverter<Collection<T>, String> {
    abstract val typeReference: TypeReference<out Collection<T>>

    override fun convertToDatabaseColumn(attribute: Collection<T>?): String? {
        if (attribute.isNullOrEmpty())
            return null
        return ObjectMapper.writeValueAsString(attribute)
    }
}
Implementations for collections:
abstract class ListConverter<T> : JsonConverter<T>() {
    override val typeReference = object : TypeReference<List<T>>() {}

    override fun convertToEntityAttribute(dbData: String?): Collection<T> {
        if (dbData == null)
            return emptyList()
        return ObjectMapper.readValue(dbData, typeReference)
    }
}

abstract class SetConverter<T> : JsonConverter<T>() {
    override val typeReference = object : TypeReference<Set<T>>() {}

    override fun convertToEntityAttribute(dbData: String?): Collection<T> {
        if (dbData == null)
            return emptySet()
        return ObjectMapper.readValue(dbData, typeReference)
    }
}
Hibernate entity field:
@Column(name = "product_ids")
@Convert(converter = LongSetConverter::class)
var productIds: Set<Long>,
AttributeConverter implementation for this entity field:
class LongSetConverter : SetConverter<Long>()
The problem is that the Long generic is not working: ObjectMapper always returns a Set of Int.
Long (or any other type) is simply ignored.
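For context, this is plain Java type erasure, and it can be reproduced with JDK reflection alone. `TypeRef` below is a hypothetical stand-in for Jackson's TypeReference: when the anonymous subclass is declared inside the generic base class, the captured type argument is the erased type variable T, so Jackson has no concrete target and falls back to Int for JSON numbers; declaring it in the concrete subclass preserves Long.

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.List;

// Hypothetical stand-in for Jackson's TypeReference: it captures the
// generic supertype of an anonymous subclass via reflection.
abstract class TypeRef<T> {
    final Type type =
            ((ParameterizedType) getClass().getGenericSuperclass()).getActualTypeArguments()[0];
}

class BaseConverter<T> {
    // Declared inside the generic base: T is only a type variable here,
    // so the captured type is List<T>, not List<Long>.
    final TypeRef<List<T>> erased = new TypeRef<List<T>>() {};
}

class LongConverter extends BaseConverter<Long> {
    // Declared with a concrete argument: Long survives, because it is
    // written into the class file's generic signature.
    final TypeRef<List<Long>> concrete = new TypeRef<List<Long>>() {};
}

public class ErasureDemo {
    public static void main(String[] args) {
        LongConverter c = new LongConverter();
        System.out.println(c.erased.type);   // the type variable T inside List
        System.out.println(c.concrete.type); // java.util.List<java.lang.Long>
    }
}
```

This suggests the fix: override `typeReference` in each concrete converter (e.g. `object : TypeReference<Set<Long>>() {}` inside LongSetConverter) instead of inheriting the erased one from the generic base.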

Related

Spring, how do I store java.lang.Class type in MongoDB

I'm trying to store java.lang.Class in MongoDB using a ReactiveCrudRepository, but I get the following error.
@Document
data class Letter(
    ...,
    val messageType: Class<*>
)
Can't find a codec for class java.lang.Class.
I tried implementing my custom conversions, but it converts other properties that have type String to java.lang.Class too.
@Bean
fun customConversions(): MongoCustomConversions {
    val converters = ArrayList<Converter<*, *>>()
    converters.add(object : Converter<String, Class<*>> {
        override fun convert(source: String): Class<*> {
            return Class.forName(source)
        }
    })
    converters.add(object : Converter<Class<*>, String> {
        override fun convert(source: Class<*>): String {
            return source.name
        }
    })
    return MongoCustomConversions(converters)
}
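For what it's worth, the round trip behind that converter pair is plain JDK behavior: Class.forName on a fully-qualified name restores the very Class object that produced the name, so storing the name as a string is lossless (a minimal sketch, no Spring involved):

```java
public class ClassRoundTrip {
    public static void main(String[] args) throws Exception {
        // Class -> String: what the writing converter stores
        String name = java.time.LocalDate.class.getName();

        // String -> Class: what the reading converter restores
        Class<?> restored = Class.forName(name);

        // Class objects are canonical per class loader, so identity holds
        System.out.println(restored == java.time.LocalDate.class); // true
    }
}
```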
You could try using Property Converters. See the example below:
class ReversingValueConverter implements PropertyValueConverter<String, String, ValueConversionContext> {

    @Override
    public String read(String value, ValueConversionContext context) {
        return reverse(value);
    }

    @Override
    public String write(String value, ValueConversionContext context) {
        return reverse(value);
    }
}

class Person {

    @ValueConverter(ReversingValueConverter.class)
    String ssn;
}
See Spring Data MongoDB Reference Documentation for more information.

Spring cache, force to set the return type

I'm trying to implement org.springframework.cache.Cache
The cached value is stored as JSON in a SQL database.
In the Cache interface, there are multiple get methods.
ValueWrapper get(Object key);
<T> T get(Object key, @Nullable Class<T> type);
<T> T get(Object key, Callable<T> valueLoader);
It's the first one that gets used (without type or any generic information).
The problem is that since I save the value as JSON, I'd like to have the return type of the cached method to help deserialize it.
How can I force Spring to use the method <T> T get(Object key, @Nullable Class<T> type); when using the @Cacheable annotation?
My cache implementation (Kotlin):
class SqlCache(
    private val name: String,
    private val expiration: Duration,
    private val cacheRepository: CacheRepository,
) : Cache {

    private val isoObjectMapper = ObjectMapper()
        .registerModule(KotlinModule())
        .registerModule(JavaTimeModule())
        .disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS)
        .disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES)

    override fun getName(): String {
        return name
    }

    override fun getNativeCache(): Any {
        return cacheRepository
    }

    // The method called by Spring!
    override fun get(key: Any): Cache.ValueWrapper? {
        val cache = cacheRepository.find(name = name, key = key.toString()) ?: return null
        val value = isoObjectMapper.readValue(cache.value, UserModel::class.java)
        return SimpleValueWrapper(value)
    }

    override fun <T : Any?> get(key: Any, type: Class<T>?): T? {
        val cache = cacheRepository.find(name = name, key = key.toString()) ?: return null
        return isoObjectMapper.readValue(cache.value, type)
    }

    override fun <T : Any?> get(key: Any, valueLoader: Callable<T>): T? {
        return null
    }

    override fun put(key: Any, value: Any?) {
        cacheRepository.put(
            name = name,
            key = key.toString(),
            value = value,
            expiration = expiration,
        )
    }

    override fun evict(key: Any) {
        cacheRepository.delete(
            name = name,
            key = key.toString(),
        )
    }

    override fun clear() {
    }
}
Example of cache usage:
interface UserClientAdapter {
    @Cacheable(value = ["user-cache"], key = "#id")
    fun getUser(id: UUID): UserModel
}
So, with this last method, the user is correctly stored as a JSON string in the database.
But when we try to get the cache back, it's the method ValueWrapper get(Object key); that is called, so I don't know the expected return type of the method.
You should refer to the RedisCache implementation class: extend AbstractValueAdaptingCache, implement the lookup method, and put your deserialization logic there.
protected Object lookup(Object key) {
    byte[] value = this.cacheWriter.get(this.name, this.createAndConvertCacheKey(key));
    return value == null ? null : this.deserializeCacheValue(value);
}
If you still can't return your results, debug the execute method of CacheAspectSupport.java, which handles the hit Cache.ValueWrapper.
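One common way around the untyped get(Object key) is to persist the value's class name next to its serialized form and resolve the type on read; this is essentially what GenericJackson2JsonRedisSerializer's @class property does. A minimal stdlib sketch of the idea (the string-based "serialization" below is a placeholder for Jackson):

```java
import java.util.HashMap;
import java.util.Map;

// Store the concrete class name alongside the payload, so the untyped
// read path can recover the target type without any generic information.
public class TypedStore {
    private record Entry(String className, String payload) {}

    private final Map<String, Entry> store = new HashMap<>();

    public void put(String key, Object value) {
        store.put(key, new Entry(value.getClass().getName(), value.toString()));
    }

    // No Class<T> parameter needed: the type travels with the entry.
    public Object get(String key) {
        Entry e = store.get(key);
        if (e == null) return null;
        switch (e.className()) {
            case "java.lang.Long":    return Long.parseLong(e.payload());
            case "java.lang.Integer": return Integer.parseInt(e.payload());
            default:                  return e.payload(); // fall back to the raw string
        }
    }

    public static void main(String[] args) {
        TypedStore cache = new TypedStore();
        cache.put("answer", 42L);
        System.out.println(cache.get("answer"));                 // 42
        System.out.println(cache.get("answer") instanceof Long); // true
    }
}
```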

How to initialize variables in parent abstract class of spring bean using Kotlin?

I have the following structure of Spring beans:
abstract class GenericRepository<T> {
    private val FIND_BY_ID_SQL = "SELECT * FROM ${this.getTableName()} WHERE id = ?"

    abstract fun getTableName(): String
    abstract fun jdbcTemplate(): JdbcTemplate
    abstract fun getMapper(): RowMapper<T>

    fun find(id: Long): T? {
        return jdbcTemplate().queryForObject(FIND_BY_ID_SQL, arrayOf(id), getMapper())
    }
}
User repository:
@Repository
class UserRepository(
    @Autowired
    private val jdbcTemplate: JdbcTemplate
) : GenericRepository<User>() {

    companion object {
        private const val INSERT_SQL = "INSERT INTO \"user\"(name, age) VALUES (?,?)"
    }

    private class LogMapper : RowMapper<User> {
        override fun mapRow(rs: ResultSet, rowNum: Int): User? {
            return User(
                id = rs.getLong("id"),
                name = rs.getString("name"),
                age = rs.getInt("age")
            )
        }
    }

    override fun getTableName(): String {
        return "user"
    }

    override fun jdbcTemplate(): JdbcTemplate {
        return jdbcTemplate
    }

    override fun getMapper(): RowMapper<User> {
        return LogMapper()
    }
}
The problem is that when Spring creates the proxy and the UserRepository bean, it doesn't initialize FIND_BY_ID_SQL, leaving it null.
The question: how, using an abstract class, can I make Spring initialize the FIND_BY_ID_SQL variable?
UPD
I used @Component instead of @Repository and the problem was solved. FIND_BY_ID_SQL is not null anymore.
You could work around the problem by making it lazy:
private val FIND_BY_ID_SQL by lazy { "SELECT * FROM ${this.getTableName()} WHERE id = ?" }
However, you should first be sure it's an actual problem (e.g. that when you call find you get an exception), because the proxy might simply delegate to a "real" UserRepository with a non-null FIND_BY_ID_SQL (and jdbcTemplate etc.), depending on Spring's internal details.
In addition, you need to be careful when your superclass properties are initialized from subclass state; I think your exact situation should work, but I'd prefer to write it as
abstract class GenericRepository<T>(val tableName: String) {
    private val FIND_BY_ID_SQL = "SELECT * FROM $tableName WHERE id = ?"

    abstract val jdbcTemplate: JdbcTemplate
    abstract val mapper: RowMapper<T>

    fun find(id: Long): T? {
        return jdbcTemplate.queryForObject(FIND_BY_ID_SQL, arrayOf(id), mapper)
    }
}

@Repository
class UserRepository(
    @Autowired
    override val jdbcTemplate: JdbcTemplate
) : GenericRepository<User>("user") { ... }
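The caution about superclass initialization order is easy to reproduce without Spring: a base-class field initializer that calls an overridable method runs before the subclass constructor body, so any subclass field that method reads is still null at that point. A minimal sketch (names are illustrative):

```java
abstract class Base {
    // This initializer runs during Base's construction, i.e. BEFORE
    // the subclass constructor has assigned its own fields.
    final String sql = "SELECT * FROM " + tableName() + " WHERE id = ?";

    abstract String tableName();
}

class Repo extends Base {
    private final String table;

    Repo(String table) {
        this.table = table; // assigned only after Base's constructor ran
    }

    @Override
    String tableName() {
        return table; // still null when called from Base's field initializer
    }
}

public class InitOrderDemo {
    public static void main(String[] args) {
        System.out.println(new Repo("user").sql); // SELECT * FROM null WHERE id = ?
    }
}
```

The original code avoids this only because getTableName() returns a literal rather than subclass state; passing the table name up through the superclass constructor, as above, removes the trap entirely.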

Why does Spring #Cacheable not pass the annotated method's result type to its deserializer?

This is sample code in Kotlin.
@Configuration
@Bean("cacheManager1hour")
fun cacheManager1hour(@Qualifier("cacheConfig") cacheConfiguration: RedisCacheConfiguration, redisConnectionFactory: RedisConnectionFactory): CacheManager {
    val config = cacheConfiguration.entryTtl(Duration.ofSeconds(60 * 60))
    return RedisCacheManager.builder(redisConnectionFactory)
        .cacheDefaults(config)
        .build()
}
@Bean("cacheConfig")
fun cacheConfig(objectMapper: ObjectMapper): RedisCacheConfiguration {
    return RedisCacheConfiguration.defaultCacheConfig()
        .computePrefixWith { cacheName -> "yaya:$cacheName:" }
        .serializeKeysWith(RedisSerializationContext.SerializationPair.fromSerializer(StringRedisSerializer()))
        .serializeValuesWith(RedisSerializationContext.SerializationPair.fromSerializer(GenericJackson2JsonRedisSerializer()))
}
@RestController
@Cacheable(value = "book", key = "#root.methodName", cacheManager = "cacheManager1hour")
fun getBook(): Book {
    return Book()
}
class Book {
    var asdasd: String? = "TEST"
    var expires_in = 123
}
The GenericJackson2JsonRedisSerializer cannot process the "Kotlin class", and we need to add @class as a property to the Redis cache entry.
Anyway, why do we need the @class? The Spring context is aware of the result's type, so why doesn't it get passed? We would have two benefits:
less memory
easier for the serializer, i.e. objectMapper.readValue(str, T)
Annotated Spring code for illustration:
// org.springframework.cache.interceptor.CacheAspectSupport
@Nullable
private Cache.ValueWrapper findInCaches(CacheOperationContext context, Object key) {
    for (Cache cache : context.getCaches()) {
        // --> maybe we could pass context.method.returnType to doGet
        Cache.ValueWrapper wrapper = doGet(cache, key);
        if (wrapper != null) {
            if (logger.isTraceEnabled()) {
                logger.trace("Cache entry for key '" + key + "' found in cache '" +
                        cache.getName() + "'");
            }
            return wrapper;
        }
    }
    return null;
}

// org.springframework.data.redis.cache.RedisCache
@Override
protected Object lookup(Object key) {
    // --> here the deserialization target type could be passed to Jackson
    byte[] value = cacheWriter.get(name, createAndConvertCacheKey(key));
    if (value == null) {
        return null;
    }
    return deserializeCacheValue(value);
}
Your return type could be:
some abstract class
some interface
In those cases the declared return type is almost useless for deserializing the object. Encoding the actual class always works.
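That point can be seen with a plain polymorphism sketch: the statically declared return type and the runtime class of the returned value can differ, so caching only the declared type would lose the information needed to rebuild the real object (names below are illustrative):

```java
public class ReturnTypeDemo {
    static class Book {
        final String title;
        Book(String title) { this.title = title; }
    }

    static class AudioBook extends Book {
        final int minutes;
        AudioBook(String title, int minutes) { super(title); this.minutes = minutes; }
    }

    // The declared return type is Book, but the runtime class may be any subtype.
    static Book getBook() {
        return new AudioBook("Yaya", 90);
    }

    public static void main(String[] args) {
        Book b = getBook();
        // Deserializing with only the declared type Book would drop `minutes`;
        // only the runtime class carries the object's full shape.
        System.out.println(b.getClass().getSimpleName()); // AudioBook
    }
}
```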

Spring -Mongodb storing/retrieving enums as int not string

My enums are stored as ints in MongoDB (from a C# app). Now in Java, when I try to retrieve them, an exception is thrown (it seems an enum can be converted from a string value only). Is there any way I can do this?
Also, when I save some collections into MongoDB (from Java), it converts enum values to strings (not their value/ordinal). Is there any override available?
This can be achieved by writing a MongoDB converter at the class level, but I don't want to write a converter for each class, as these enums are in many different classes.
So do we have something at the field level?
After a lot of digging in the spring-mongodb converter code, I finished and now it's working :) Here it is (if there is a simpler solution I'll be happy to see it as well; this is what I've done):
First, define:
public interface IntEnumConvertable {
    public int getValue();
}
and a simple enum that implements it:
public enum tester implements IntEnumConvertable {
    vali(0), secondvali(1), thirdvali(5);

    private final int val;

    private tester(int num) {
        val = num;
    }

    public int getValue() {
        return val;
    }
}
Ok, now you will need 2 converters: one simple, the other more complex. The simple one (this simple baby also handles the fallback and returns a string when the cast is not possible, which is great if you want plain enums stored as strings and int-backed enums stored as integers):
public class IntegerEnumConverters {

    @WritingConverter
    public static class EnumToIntegerConverter implements Converter<Enum<?>, Object> {
        @Override
        public Object convert(Enum<?> source) {
            if (source instanceof IntEnumConvertable) {
                return ((IntEnumConvertable) source).getValue();
            } else {
                return source.name();
            }
        }
    }
}
The more complex one is actually a converter factory:
public class IntegerToEnumConverterFactory implements ConverterFactory<Integer, Enum> {

    @Override
    public <T extends Enum> Converter<Integer, T> getConverter(Class<T> targetType) {
        Class<?> enumType = targetType;
        while (enumType != null && !enumType.isEnum()) {
            enumType = enumType.getSuperclass();
        }
        if (enumType == null) {
            throw new IllegalArgumentException(
                    "The target type " + targetType.getName() + " does not refer to an enum");
        }
        return new IntegerToEnum(enumType);
    }

    @ReadingConverter
    public static class IntegerToEnum<T extends Enum> implements Converter<Integer, Enum> {
        private final Class<T> enumType;

        public IntegerToEnum(Class<T> enumType) {
            this.enumType = enumType;
        }

        @Override
        public Enum convert(Integer source) {
            for (T t : enumType.getEnumConstants()) {
                if (t instanceof IntEnumConvertable
                        && ((IntEnumConvertable) t).getValue() == source.intValue()) {
                    return t;
                }
            }
            return null;
        }
    }
}
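The core of the ReadingConverter above is a linear scan over the enum's constants for the one whose numeric value matches. That reverse lookup can be sketched on its own with no Spring dependency (a self-contained illustration mirroring the names above):

```java
public class EnumLookupDemo {
    interface IntEnumConvertable {
        int getValue();
    }

    enum Tester implements IntEnumConvertable {
        VALI(0), SECONDVALI(1), THIRDVALI(5);

        private final int val;
        Tester(int num) { this.val = num; }
        public int getValue() { return val; }
    }

    // Scan the enum's constants for the one whose numeric value matches;
    // return null when nothing matches, like the converter above does.
    static <T extends Enum<T> & IntEnumConvertable> T fromInt(Class<T> enumType, int source) {
        for (T t : enumType.getEnumConstants()) {
            if (t.getValue() == source) {
                return t;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(fromInt(Tester.class, 5)); // THIRDVALI
        System.out.println(fromInt(Tester.class, 7)); // null
    }
}
```

The intersection bound `T extends Enum<T> & IntEnumConvertable` removes the instanceof check the raw-typed converter needs, at the cost of requiring the target type at compile time, which is exactly what the converter factory cannot have.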
And now for the hack part: I personally didn't find any programmatic way to register a converter factory with a MongoConverter, so I dug into the code, and with a little casting, here it is (put these two bean functions in your @Configuration class):
@Bean
public CustomConversions customConversions() {
    List<Converter<?, ?>> converters = new ArrayList<Converter<?, ?>>();
    converters.add(new IntegerEnumConverters.EnumToIntegerConverter());
    // this is a dummy registration; actually it's a work-around, because
    // spring-mongodb doesn't have the option to register a converter factory,
    // so we register the converter that our factory uses.
    converters.add(new IntegerToEnumConverterFactory.IntegerToEnum(null));
    return new CustomConversions(converters);
}

@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
    MongoMappingContext mappingContext = new MongoMappingContext();
    mappingContext.setApplicationContext(appContext);
    DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
    MappingMongoConverter mongoConverter = new MappingMongoConverter(dbRefResolver, mappingContext);
    mongoConverter.setCustomConversions(customConversions());
    ConversionService convService = mongoConverter.getConversionService();
    ((GenericConversionService) convService).addConverterFactory(new IntegerToEnumConverterFactory());
    mongoConverter.afterPropertiesSet();
    return mongoConverter;
}
You will need to implement your custom converters and register them with Spring.
http://static.springsource.org/spring-data/data-mongo/docs/current/reference/html/#mongo.custom-converters
Isn't it easier to use plain constants rather than an enum?
int SOMETHING = 33;
int OTHER_THING = 55;
or
public class Role {
    public static final String ROLE_USER = "ROLE_USER",
            ROLE_LOOSER = "ROLE_LOOSER";
}
String yourRole = Role.ROLE_LOOSER;