I have a JSON array in my request object and POJO, and I need to save that JSON array field into a jsonb column in my PostgreSQL table.
"productId": "1",
"product":[
{
"name": "PENCIL",
"quantity":"5"
}]
I get the following exception when hitting the endpoint via Postman:
Type definition error: [simple type, class io.r2dbc.postgresql.codec.Json]; nested exception is com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of io.r2dbc.postgresql.codec.Json (no Creators, like default constructor, exist): abstract types either need to be mapped to concrete types, have custom deserializer, or contain additional type information\n at [Source: (io.netty.buffer.ByteBufInputStream); l
Create one converter for reading and one for writing, so that Spring Data converts between your type and the Postgres io.r2dbc.postgresql.codec.Json type.
@ReadingConverter
@RequiredArgsConstructor
public class JsonToYourTypeConverter implements Converter<Json, YourType> {
    private final ObjectMapper objectMapper;

    @Override
    public YourType convert(Json json) {
        try {
            // parse the raw jsonb payload into your type
            return objectMapper.readValue(json.asString(), YourType.class);
        } catch (JsonProcessingException e) {
            throw new UncheckedIOException(e);
        }
    }
}
@WritingConverter
@RequiredArgsConstructor
public class YourTypeToJsonConverter implements Converter<YourType, Json> {
    private final ObjectMapper objectMapper;

    @Override
    public Json convert(YourType data) {
        try {
            // serialize your type and wrap it in the Postgres Json value
            return Json.of(objectMapper.writeValueAsString(data));
        } catch (JsonProcessingException e) {
            throw new UncheckedIOException(e);
        }
    }
}
Then register your converters via an R2dbcCustomConversions bean.
@Bean
public R2dbcCustomConversions r2dbcCustomConversions(ConnectionFactory connectionFactory, ObjectMapper objectMapper) {
    var dialect = DialectResolver.getDialect(connectionFactory);
    var converters = List.of(
            new JsonToYourTypeConverter(objectMapper),
            new YourTypeToJsonConverter(objectMapper));
    return R2dbcCustomConversions.of(dialect, converters);
}
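For context, here is a minimal sketch of how the converted type might appear in an entity; the ProductOrder/ProductList names and the product_order table are illustrative assumptions, not from the question:
// Hypothetical entity: the "product" field round-trips through the converter
// pair above and is stored in a jsonb column.
@Table("product_order")
public class ProductOrder {
    @Id
    private Long productId;
    private ProductList product; // plays the role of YourType in the converters
}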
I am using Spring. I have an ObjectMapper configured for the entire project, and I use it to set up a Kafka deserializer. I need this custom Kafka deserializer to be used in a @KafkaListener. I'm configuring the @KafkaListener via autoconfiguration, not via a @Configuration class.
@Component
@RequiredArgsConstructor
public class CustomMessageDeserializer implements Deserializer<MyMessage> {
    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public MyMessage deserialize(String topic, byte[] data) {
        return objectMapper.readValue(data, MyMessage.class);
    }
}
If I do it like this:
@KafkaListener(
        topics = {"${topics.invite-user-topic}"},
        properties = {"value.deserializer=com.service.deserializer.CustomMessageDeserializer"}
)
public void receiveInviteUserMessages(MyMessage myMessage) {}
I get KafkaException: Could not find a public no-argument constructor. But with a public no-argument constructor in the CustomMessageDeserializer class, I get an NPE because objectMapper is null: Kafka creates and uses a new instance via reflection, not the Spring component.
@KafkaListener supports SpEL expressions, and I think this problem can be solved using SpEL. Do you have any idea how to inject the Spring bean CustomMessageDeserializer with SpEL?
There is no easy way to do this with SpEL.
Analysis
To get started, see the Javadoc for @KafkaListener#properties:
/**
*
* SpEL expressions must resolve to a String ...
*/
The value of value.deserializer is used to instantiate the specified deserializer class. Let's follow the call chain:
1. You specify this value in the @KafkaListener annotation, and you are presumably not creating a ConsumerFactory bean yourself, so Spring Boot creates this bean itself; see KafkaAutoConfiguration#kafkaConsumerFactory.
2. Next, the returned object new DefaultKafkaConsumerFactory(...) is created as a ConsumerFactory<?, ?>, using the constructor whose default deserializer suppliers are keyDeserializer/valueDeserializer = () -> null.
3. This factory is used to create a Kafka consumer (the entry point is the KafkaMessageListenerContainer$ListenerConsumer constructor, then KafkaMessageListenerContainer.this.consumerFactory.createConsumer...).
4. In the KafkaConsumer constructor, the valueDeserializer object is created, because it is null (for the default factory from point 2 above):
if (valueDeserializer == null) {
    this.valueDeserializer = config.getConfiguredInstance(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, Deserializer.class);
}
5. The implementation of config.getConfiguredInstance instantiates your deserializer class via its parameterless constructor, using reflection and the String class name "com.service.deserializer.CustomMessageDeserializer".
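In essence, the last step does the equivalent of this (a simplified sketch, not the actual Kafka source):
// reflective instantiation bypasses Spring entirely, so injected fields stay null
Class<?> clazz = Class.forName("com.service.deserializer.CustomMessageDeserializer");
Deserializer<?> deserializer = (Deserializer<?>) clazz.getDeclaredConstructor().newInstance();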
Solutions
To use value.deserializer with your customized ObjectMapper, you must create the ConsumerFactory bean yourself and set the deserializer via its setValueDeserializer(...) method. This is also mentioned in the second Important note of the JSON.Mapping_Types documentation.
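A minimal sketch of that approach, assuming Spring Boot's KafkaProperties supplies the base configuration:
@Bean
public ConsumerFactory<String, MyMessage> kafkaConsumerFactory(KafkaProperties properties, CustomMessageDeserializer deserializer) {
    DefaultKafkaConsumerFactory<String, MyMessage> factory =
            new DefaultKafkaConsumerFactory<>(properties.buildConsumerProperties());
    // hand over the Spring-managed instance, so no reflective instantiation happens
    factory.setValueDeserializer(deserializer);
    return factory;
}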
If you don't want to create a ConsumerFactory bean, and also don't have complicated logic in your deserializer (you only have return objectMapper.readValue(data, MyMessage.class);), then register a DefaultKafkaConsumerFactoryCustomizer:
@Bean
// inject your custom objectMapper
public DefaultKafkaConsumerFactoryCustomizer customizeJsonDeserializer(ObjectMapper objectMapper) {
    return consumerFactory ->
            consumerFactory.setValueDeserializerSupplier(() ->
                    new org.springframework.kafka.support.serializer.JsonDeserializer<>(objectMapper));
}
In this case, you don't need your own CustomMessageDeserializer class (remove it), and Spring will automatically deserialize the message into your MyMessage.
The @KafkaListener annotation should then no longer contain the property properties = {"value.deserializer=com.my.kafka_test.component.CustomMessageDeserializer"}. The DefaultKafkaConsumerFactoryCustomizer bean is automatically used to configure the default ConsumerFactory<?, ?> (see the implementation of the KafkaAutoConfiguration#kafkaConsumerFactory method).
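The listener then shrinks to this (a sketch, reusing the topic placeholder from the question):
@KafkaListener(topics = {"${topics.invite-user-topic}"})
public void receiveInviteUserMessages(MyMessage myMessage) {
    // myMessage has already been deserialized by the JsonDeserializer configured above
}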
Here is how it works for me:
@KafkaListener(topics = "${solr.kafka.topic}", containerFactory = "batchFactory")
public void listen(List<SolrInputDocument> docs, @Header(KafkaHeaders.BATCH_CONVERTED_HEADERS) List<Map<String, Object>> headers, Acknowledgment ack) throws IOException {...}
And then I have two beans defined in my configuration:
#Profile("!test")
#Bean
#Autowired
public ConsumerFactory<String, SolrInputDocument> consumerFactory(KafkaProperties properties) {
Map<String, Object> props = properties.buildConsumerProperties();
props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
DefaultKafkaConsumerFactory<String, SolrInputDocument> result = new DefaultKafkaConsumerFactory<>(props);
String validatedKeyDeserializerName = KafkaMessageType.valueOf(keyDeserializerName).toString();
ZiDeserializer<SolrInputDocument> deserializer = ZiDeserializerFactory.getInstance(validatedKeyDeserializerName);
result.setValueDeserializer(deserializer);
return result;
}
#Profile("!test")
#Bean
#Autowired
public ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> batchFactory(ConsumerFactory<String, SolrInputDocument> consumerFactory) {
ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> factory = new ConcurrentKafkaListenerContainerFactory<>();
factory.setConsumerFactory(consumerFactory);
factory.setBatchListener(true);
factory.setConcurrency(2);
ExponentialBackOffWithMaxRetries backoff = new ExponentialBackOffWithMaxRetries(10);
backoff.setMultiplier(3); // Default is 1.5 but this seems more reasonable
factory.setCommonErrorHandler(new DefaultErrorHandler(null, backoff));
// Needed for manual commits
factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
return factory;
}
Note that ZiDeserializer<SolrInputDocument> deserializer is my own interface, ZiDeserializerFactory.getInstance(validatedKeyDeserializerName) returns my custom implementation of ZiDeserializer, and ZiDeserializer extends org.apache.kafka.common.serialization.Deserializer. This works for me.
I am writing a Spring Boot command-line tool that interfaces with an API backend I already implemented. That backend is built with Spring Data REST and the HATEOAS package, so it produces HAL message types.
In my CLI tool, I want to POST an entity that contains a list of other entities (a one-to-many relation). For convenience, I wanted to use Resource types in the models to express relations, and a JSON serializer to transform each Resource into just its self href.
My serializer works fine for one-to-one relations, but never gets called to serialize arrays or any collection types.
This is what the API accepts when I POST an entity:
{
    "property1": "value1",
    "myrelation": "http://localhost:8080/relatedentities/1",
    "mycollection": [
        "http://localhost:8080/otherrelatedentities/2",
        "http://localhost:8080/otherrelatedentities/3"
    ]
}
On the CLI side, I created a model entity like this:
@Getter @Setter
public class MyEntity {
    private String property1;

    @JsonSerialize(using = HateoasResourceIdSerializer.class)
    private Resource<RelatedEntity> myrelation;

    @JsonSerialize(using = HateoasResourceIdSerializer.class)
    private List<Resource<OtherRelatedEntity>> mycollection;
}
I wrote this HateoasResourceIdSerializer to transform any Resource type into only its self href:
public class HateoasResourceIdSerializer extends StdSerializer<Resource<?>> {
    private static final long serialVersionUID = 1L;

    public HateoasResourceIdSerializer() {
        this(null);
    }

    public HateoasResourceIdSerializer(Class<Resource<?>> t) {
        super(t);
    }

    @Override
    public void serialize(Resource<?> value, JsonGenerator jgen, SerializerProvider provider)
            throws IOException, JsonProcessingException {
        jgen.writeString(value.getId().getHref());
    }
}
Looking at the payload sent to the API backend, I can see that the "myrelation" property is set to the URL of the target entity while the "mycollection" property is always null.
I tried writing a second serializer that would accept Collection<Resource<?>>, but it didn't get called either.
My expectation would be that the serializer above for Resource would be applied to arrays as well as any collection type.
EDIT:
I was asked to provide the code that registers the serializers, so here it is. I added the two mixins as suggested in one of the answers below (I hope I did it right) but did not see the expected behavior. I also assumed that, thanks to this registration, I could remove the @JsonSerialize(using = HateoasResource(s)IdSerializer.class) annotations from the properties. The current behavior is that those properties do not get rendered at all.
@SuppressWarnings("deprecation")
@SpringBootApplication
@EnableHypermediaSupport(type = EnableHypermediaSupport.HypermediaType.HAL)
public class Application extends WebMvcConfigurerAdapter implements ApplicationRunner {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }

    @Override
    public void run(ApplicationArguments args) throws IllegalAccessException, IllegalArgumentException, InvocationTargetException {
        // ...
    }

    @Autowired
    private HalHttpMessageConverter halHttpMessageConverter;

    @Override
    public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
        converters.add(halHttpMessageConverter);
        super.configureMessageConverters(converters);
    }
}
@Configuration
public class HalHttpMessageConverter extends AbstractJackson2HttpMessageConverter {
    public HalHttpMessageConverter() {
        super(new ObjectMapper(), new MediaType("application", "hal+json", DEFAULT_CHARSET));
        objectMapper.registerModule(new Jackson2HalModule());
        objectMapper
                .setHandlerInstantiator(new Jackson2HalModule.HalHandlerInstantiator(new DefaultRelProvider(), null, null));
        objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        objectMapper.addMixIn(Resource.class, ResourceMixIn.class);
        objectMapper.addMixIn(Resources.class, ResourcesMixIn.class);
    }

    @Override
    protected boolean supports(Class<?> clazz) {
        return ResourceSupport.class.isAssignableFrom(clazz);
    }
}
You need to properly register your custom serializer using the MixIn feature. Instead of annotating the property, annotate the class: that informs Jackson you want the serializer used in all scenarios, not only for MyEntity.
Create a MixIn interface:
@JsonSerialize(using = HateoasResourceIdSerializer.class)
interface ResourceMixIn {
}
And register it:
ObjectMapper mapper = JsonMapper.builder()
.addMixIn(Resource.class, ResourceMixIn.class).build();
See these other questions on how to configure the Jackson mapper in Spring:
How do i use Jackson Json parsing in a spring project without any annotations?
Different JSON configuration in a Spring application for REST and Ajax serialization
Spring Boot custom serializer for Collection class
Spring Boot Jackson date and timestamp Format
You are not including the code showing how you register the serializer for your type, and that could hold the clue. But custom serializers definitely should be called for array, Collection, and Map values as well as for simple property values.
Registering a separate serializer for Collection<Type> is not needed (and is actually a bit trickier to do: possible, but more work because of the nested type); it is not meant as a way to support a specific element type in a collection, but rather to support more special Collection implementations, if any.
So please include the code related to registration, as well as the version of Jackson used (and obviously, if it is not a recent one, consider upgrading first to see whether the problem persists).
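For reference, here is a minimal sketch of global registration through a module, an alternative to the MixIn approach above (the double cast only satisfies the generic signature):
@SuppressWarnings("unchecked")
ObjectMapper buildMapper() {
    SimpleModule module = new SimpleModule();
    // applies to every Resource value, including elements inside collections
    module.addSerializer((Class<Resource<?>>) (Class<?>) Resource.class,
            new HateoasResourceIdSerializer());
    return new ObjectMapper().registerModule(module);
}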
tl;dr: I want to add virtual fields while serializing a JPA entity into JSON using Jackson's @JsonAppend. The value of the virtual fields must be determined via a service managed by Spring. How do I inject my Spring-managed service into a Jackson class?
Technologies: Spring Boot 1.5.10, Spring Data Rest, JPA 2.1, Jackson 2.8.10
Details:
I have a Spring Data managed JPA entity:
@Entity
public class Stream {
    ...
}
I created a custom Jackson module with a MixIn to add a @JsonAppend virtual field as below:
@Bean
public Module customModule() {
    return new CustomModule();
}

@Component
class CustomModule extends SimpleModule {
    CustomModule() {
        setMixInAnnotation(Stream.class, StreamMixin.class);
    }

    @JsonAppend(
            props = {
                    @JsonAppend.Prop(name = "canEdit", value = ABACInspector.class)
            }
    )
    abstract class StreamMixin {}
}
The ABACInspector class extends Jackson's VirtualBeanPropertyWriter to determine the value of the virtual field canEdit. If this class does not use a Spring service (sets a hard-coded value, for example), it works fine and the field shows up in the REST API JSON response. But autowiring a Spring bean doesn't work and the object remains null.
@Component
class ABACInspector extends VirtualBeanPropertyWriter {
    @Autowired
    private PermissionEvaluator permissionEvaluator;

    public ABACInspector() {
    }

    public ABACInspector(BeanPropertyDefinition propDef, Annotations contextAnnotations, JavaType declaredType) {
        super(propDef, contextAnnotations, declaredType);
    }

    @Override
    protected Object value(Object bean, JsonGenerator gen, SerializerProvider prov) throws Exception {
        Authentication authentication = SecurityContextHolder.getContext().getAuthentication();
        boolean permission = permissionEvaluator.hasPermission(authentication, bean, Action.STREAM_VIEW);
        System.out.println("evaluated permission is " + permission);
        return permission;
    }

    @Override
    public VirtualBeanPropertyWriter withConfig(MapperConfig<?> config, AnnotatedClass declaringClass, BeanPropertyDefinition propDef, JavaType type) {
        return new ABACInspector(propDef, null, type);
    }
}
Below is the NPE error (because permissionEvaluator is never injected):
{"status":"INTERNAL_SERVER_ERROR","message":"Could not write JSON:
(was java.lang.NullPointerException); nested exception is com.fasterxml.jackson.databind.JsonMappingException:
(was java.lang.NullPointerException) (through reference chain: org.springframework.data.rest.webmvc.json.PersistentEntityJackson2Module$PersistentEntityResourceSerializer$1[\"content\"]->com.example.streammanagement.Stream[\"canView\"])"
I am aware of Spring Data REST's HalHandlerInstantiator that contains the AutowireCapableBeanFactory, but I am not sure how/if that can help here. Refer to DATAREST-840.
Jackson internally calls your component's withConfig method to build the VirtualBeanPropertyWriter.
So if you use breakpoints, you can see that first the component with the injected bean is created, then withConfig is called and a new VirtualBeanPropertyWriter object is created. That new object is the one Jackson uses, and of course it does not have the injected bean (since you called the constructor manually).
So you can change it this way:
@Component
class ABACInspector extends VirtualBeanPropertyWriter {
    private PermissionEvaluator permissionEvaluator;

    @Autowired
    public ABACInspector(PermissionEvaluator permissionEvaluator) {
        this.permissionEvaluator = permissionEvaluator;
    }

    public ABACInspector(BeanPropertyDefinition propDef, Annotations contextAnnotations, JavaType declaredType, PermissionEvaluator permissionEvaluator) {
        super(propDef, contextAnnotations, declaredType);
        this.permissionEvaluator = permissionEvaluator;
    }

    @Override
    protected Object value(Object bean, JsonGenerator gen, SerializerProvider prov) throws Exception {
        Authentication authentication = SecurityContextHolder.getContext().getAuthentication();
        boolean permission = permissionEvaluator.hasPermission(authentication, bean, Action.STREAM_VIEW);
        System.out.println("evaluated permission is " + permission);
        return permission;
    }

    @Override
    public VirtualBeanPropertyWriter withConfig(MapperConfig<?> config, AnnotatedClass declaringClass, BeanPropertyDefinition propDef, JavaType type) {
        return new ABACInspector(propDef, null, type, permissionEvaluator);
    }
}
I'm attempting to use the Jackson serialization feature of spring-data-redis. I am building an ObjectMapper and using GenericJackson2JsonRedisSerializer as the serializer for the redisTemplate:
@Configuration
public class SampleModule {
    @Bean
    public ObjectMapper objectMapper() {
        return Jackson2ObjectMapperBuilder.json()
                .serializationInclusion(JsonInclude.Include.NON_NULL) // don't include null values
                .featuresToDisable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS) // ISO date
                .build();
    }

    @Bean
    public RedisTemplate getRedisTemplate(ObjectMapper objectMapper, RedisConnectionFactory redisConnectionFactory) {
        RedisTemplate redisTemplate = new RedisTemplate();
        redisTemplate.setDefaultSerializer(new GenericJackson2JsonRedisSerializer(objectMapper));
        redisTemplate.setConnectionFactory(redisConnectionFactory);
        return redisTemplate;
    }
}
I have a SampleBean I am attempting to save:
@RedisHash("sampleBean")
public class SampleBean {
    @Id
    String id;
    String value;
    Date date;

    public SampleBean(String value, Date date) {
        this.value = value;
        this.date = date;
    }
}
And a repository for that bean:
public interface SampleBeanRepository extends CrudRepository<SampleBean, String> {
}
I am then trying to write the bean to Redis:
ConfigurableApplicationContext context = SpringApplication.run(SampleRedisApplication.class, args);
SampleBean helloSampleBean = new SampleBean("hello", new Date());
ObjectMapper objectMapper = context.getBean(ObjectMapper.class);
logger.info("Expecting date to be written as: " + objectMapper.writeValueAsString(helloSampleBean.date));
SampleBeanRepository repository = context.getBean(SampleBeanRepository.class);
repository.save(helloSampleBean);
context.close();
I expect the redisTemplate to use the serializer and write the Date inside the SampleBean in ISO date format; however, it is written as a long.
The relevant spring-data-redis reference: http://docs.spring.io/spring-data/data-redis/docs/current/reference/html/#redis:serializer
Full code sample: https://github.com/bandyguy/spring-redis-jackson-sample-broken
The serializer/mapper used by the template does not affect the one used by the repository, since the repository operates directly on the byte[] using Converter implementations that read/write data based on domain-type metadata.
Please refer to the Object-to-Hash Mapping section of the reference manual for guidance on how to write and register a custom Converter.
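A minimal sketch of such a converter pair for the Date field, assuming you want ISO-8601 strings in the hash (class names are illustrative):
@WritingConverter
public class DateToStringConverter implements Converter<Date, String> {
    @Override
    public String convert(Date source) {
        return source.toInstant().toString(); // ISO-8601, UTC
    }
}

@ReadingConverter
public class StringToDateConverter implements Converter<String, Date> {
    @Override
    public Date convert(String source) {
        return Date.from(Instant.parse(source));
    }
}

@Bean
public RedisCustomConversions redisCustomConversions() {
    return new RedisCustomConversions(List.of(new DateToStringConverter(), new StringToDateConverter()));
}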
Have you tried disabling the serialization feature SerializationFeature.WRITE_DATES_AS_TIMESTAMPS?
I am trying to customize the collection that an entity class is saved into and indexed in, using Spring Data MongoDB and Spring Batch. The class is declared as follows:
@Document
@CompoundIndex(name = "unique_source", def = "{'fid': 1, 'sid': 1}", unique = true, background = true)
public class VariantSource {
    ...
}
And the item writer:
public class VariantSourceMongoWriter extends MongoItemWriter<VariantSource> {
    public VariantSourceMongoWriter(MongoOperations mongoOperations, String collectionName) {
        setTemplate(mongoOperations);
        setCollection(collectionName);
    }
}
Saving works fine: the objects are written into the collection provided as an argument. The problem is that the indexes are created in the default collection, named after the class (variantSource).
After reading this and this, I created the following:
public class MongoCollections {
    public String getCollectionFilesName() {
        return "my_custom_collection_name"; // TODO Dynamic value
    }
}

@Configuration
public class MongoCollectionsConfiguration {
    @Bean
    public MongoCollections mongoCollections() {
        return new MongoCollections();
    }
}

@RunWith(SpringRunner.class)
@ContextConfiguration(classes = {MongoCollectionsConfiguration.class})
public class VariantSourceMongoWriterTest {
    @Autowired
    private MongoCollections mongoCollections;
}
I have checked that the instance is correctly autowired into the unit test, but I can't make it work with SpEL.
After changing the @Document annotation to look like this:
@Document(collection = "#{@mongoCollections.getCollectionFilesName()}")
the following exception is thrown:
org.springframework.expression.spel.SpelEvaluationException: EL1057E:(pos 1): No bean resolver registered in the context to resolve access to bean 'mongoCollections'
And if I use this:
#Document(collection = "#{mongoCollections.getCollectionFilesName()}")
the exception is this one:
org.springframework.expression.spel.SpelEvaluationException: EL1007E:(pos 0): Property or field 'mongoCollections' cannot be found on null
Finally, the following creates a collection with the name as specified, symbols included:
#Document(collection = "#mongoCollections.getCollectionFilesName()")
As pointed out by this answer, to fix the injection for
@Document(collection = "#{mongoCollections.getCollectionFilesName()}")
which fails with SpelEvaluationException: EL1007E:(pos 0): Property or field 'mongoCollections' cannot be found on null
(or for a direct bean method reference, @Document(collection = "#{getCollectionFilesName}")), try setting the ApplicationContext into the MongoMappingContext (which is used to instantiate the MongoConverter, and later the MongoTemplate):
@Bean
public MongoMappingContext mongoMappingContext(ApplicationContext applicationContext) {
    MongoMappingContext mappingContext = new MongoMappingContext();
    // registers a bean resolver so SpEL in @Document can reference beans
    mappingContext.setApplicationContext(applicationContext);
    return mappingContext;
}
Make sure that your mongoCollections bean is registered in the application context, and also correct the SpEL expression as below:
@Document(collection = "#{@mongoCollections.getCollectionFilesName()}")
I was able to get my @Document tag to access a bean by simply changing my MongoTemplate configuration file.
Previously, I had it set up like this:
@Configuration
public class MongoTemplateConfiguration {
    ...
    @Bean
    public MongoTemplate mongoTemplate() {
        ...
        return new MongoTemplate(...);
    }
}
Changing it to follow this (3.2. Java Configuration) format was all I needed in order to remove the "bean resolver" error:
@Configuration
public class MongoTemplateConfiguration extends AbstractMongoClientConfiguration {
    ...
    @Override
    @Bean
    public com.mongodb.client.MongoClient mongoClient() {
        MongoClientSettings settings = ...;
        return MongoClients.create(settings);
    }
}
}