Spring Boot / Kafka JSON Deserialization - Trusted Packages

I am just starting to use Kafka with Spring Boot & want to send & consume JSON objects.
I am getting the following error when I attempt to consume a message from the Kafka topic:
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition dev.orders-0 at offset 9903. If needed, please seek past the record to continue consumption.
Caused by: java.lang.IllegalArgumentException: The class 'co.orders.feedme.feed.domain.OrderItem' is not in the trusted packages: [java.util, java.lang]. If you believe this class is safe to deserialize, please provide its name. If the serialization is only done by a trusted source, you can also enable trust all (*).
at org.springframework.kafka.support.converter.DefaultJackson2JavaTypeMapper.getClassIdType(DefaultJackson2JavaTypeMapper.java:139) ~[spring-kafka-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.kafka.support.converter.DefaultJackson2JavaTypeMapper.toJavaType(DefaultJackson2JavaTypeMapper.java:113) ~[spring-kafka-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:218) ~[spring-kafka-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:923) ~[kafka-clients-1.0.1.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.access$2600(Fetcher.java:93) ~[kafka-clients-1.0.1.jar:na]
I have attempted to add my package to the list of trusted packages by defining the following property in application.properties:
spring.kafka.consumer.properties.spring.json.trusted.packages = co.orders.feedme.feed.domain
This doesn't appear to make any difference. What is the correct way to add my package to the list of trusted packages for Spring's Kafka JsonDeserializer?

Since you have the trusted-packages issue solved, for your next problem you could take advantage of the overloaded constructor
DefaultKafkaConsumerFactory(Map<String, Object> configs,
        Deserializer<K> keyDeserializer,
        Deserializer<V> valueDeserializer)
and the JsonDeserializer "wrapper" of Spring Kafka:
JsonDeserializer(Class<T> targetType, ObjectMapper objectMapper)
Combining the above, for Java I have:
new DefaultKafkaConsumerFactory<>(properties,
        new IntegerDeserializer(),
        new JsonDeserializer<>(Foo.class,
                new ObjectMapper()
                        .registerModules(new KotlinModule(), new JavaTimeModule())
                        .setSerializationInclusion(JsonInclude.Include.NON_NULL)
                        .setDateFormat(new ISO8601DateFormat())
                        .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)));
Essentially, you can tell the factory to use your own deserializers and, for the JSON one, provide your own ObjectMapper. There you can register the Kotlin module as well as customize date formats and other settings.

Ok, I have read the documentation in a bit more detail and have found an answer to my question. I am using Kotlin, so the creation of my consumer looks like this:
@Bean
fun consumerFactory(): ConsumerFactory<String, FeedItem> {
    val configProps = HashMap<String, Any>()
    configProps[ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG] = bootstrapServers
    configProps[ConsumerConfig.GROUP_ID_CONFIG] = "feedme"
    configProps[ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG] = StringDeserializer::class.java
    configProps[ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG] = JsonDeserializer::class.java
    configProps[JsonDeserializer.TRUSTED_PACKAGES] = "co.orders.feedme.feed.domain"
    return DefaultKafkaConsumerFactory(configProps)
}
Now I just need a way to override the creation of the Jackson ObjectMapper in the JsonDeserializer so that it can work with my Kotlin data classes that don't have a zero-argument constructor :)

Related

Question about RedisTemplate and RedisSerializers

I am learning Spring Data Redis, and need some help with the RedisTemplate:
Looking at the javadoc of RedisTemplate, I can see the following serializer setters:
public void setKeySerializer(RedisSerializer<?> serializer)
public void setHashKeySerializer(RedisSerializer<?> hashKeySerializer)
public void setValueSerializer(RedisSerializer<?> serializer)
public void setHashValueSerializer(RedisSerializer<?> hashValueSerializer)
I wonder when the framework uses which of the serializers? What's the difference between the ones with 'Hash' and the ones without?
In order to use Redis Spring Data repositories, where secondary index matching is supported, I should provide a RedisTemplate with default settings and name it redisTemplate, right?
What happens when I need both pub/sub with custom message body format (say json) and Spring Data repository in the same app? Should I define a default RedisTemplate (like point 2) for the repositories and create my own configuration class to build the message publisher instead of registering another RedisTemplate with Jackson2JsonRedisSerializer?
Thanks in advance for the help!

AEM - after upgrade to JDK11 I can no longer pass class parameter to the scheduled job

After upgrading to JDK 11 I'm no longer able to run some of my AEM 6.5 Sling jobs. It seems there is some problem with the visibility of the class that is used to pass parameters to the job.
Here is how the job is prepared and scheduled:
final Map<String, Object> props = new HashMap<String, Object>();
props.put("stringParam", "something");
props.put("classParam", new Dto());
Job job = jobManager.addJob("my/special/jobtopic", props);
The job is not started; it seems there is a problem during job startup, while the parameters are being set up.
The stringParam is fine, but the classParam usage throws the following exception:
28.01.2022 17:28:25.978 *WARN* [sling-oak-observation-17] org.apache.sling.event.impl.jobs.queues.QueueJobCache
Unable to read job from /var/eventing/jobs/assigned/.../my.package.myJob/2022/1/27/15/50/...
java.lang.Exception: Unable to deserialize property 'classParam'
at org.apache.sling.event.impl.support.ResourceHelper.cloneValueMap(ResourceHelper.java:218)
at org.apache.sling.event.impl.jobs.Utility.readJob(Utility.java:181)
...
Caused by: java.lang.ClassNotFoundException: my.package.Dto
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
at org.apache.sling.launchpad.base.shared.LauncherClassLoader.loadClass(LauncherClassLoader.java:160)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
at org.apache.felix.framework.BundleWiringImpl.doImplicitBootDelegation(BundleWiringImpl.java:1817)
I'm pretty sure that the Dto class is visible and exported from my OSGi bundle; it can be used and consumed from other bundles. But for some reason Sling's internal logic is unable to resolve it. How can I make my Dto class accessible to the internal Sling logic?
Any idea why this happens and how to solve it?
The java.lang.ClassNotFoundException exception is misleading in this case.
The true reason for this problem is the "Java Serialization Filter" that was added in JDK 9. It affects object deserialization rules.
I tried to do the parameter serialization/deserialization myself and pass the serialized object as a Base64 string:
String serializedString = job.getProperty("dto", String.class);
byte[] serializedBytes = Base64.getDecoder().decode(serializedString);
ByteArrayInputStream bais = new ByteArrayInputStream(serializedBytes);
ObjectInputStream ois = new ObjectInputStream(bais);
dtoParam = (Dto) ois.readObject();
The job was scheduled and ran; however, the result was java.io.InvalidClassException: filter status: REJECTED.
This helped to find the true cause:
The AEM implementation uses an internal deserialization filter, com.adobe.cq.deserfw.impl.FirewallSerialFilter, which can be configured in the OSGi Felix console. The component name is com.adobe.cq.deserfw.impl.DeserializationFirewallImpl.name.
There, add your class or package name.
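For reference, the serialization side of that Base64 workaround can be sketched in plain Java (a minimal sketch with a hypothetical Serializable Dto; the names are illustrative, not from the AEM API):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Base64;

public class SerializeDemo {

    // Hypothetical job-parameter class; it must implement Serializable.
    static class Dto implements Serializable {
        private static final long serialVersionUID = 1L;
        String name = "example";
    }

    // Serialize the object to a Base64 string, which can then be passed
    // as a plain String job property (strings are never filtered).
    static String toBase64(Serializable obj) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(baos)) {
            oos.writeObject(obj);
        }
        return Base64.getEncoder().encodeToString(baos.toByteArray());
    }

    // Decode and deserialize, mirroring the snippet from the answer above.
    static Object fromBase64(String s) throws IOException, ClassNotFoundException {
        byte[] bytes = Base64.getDecoder().decode(s);
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        String encoded = toBase64(new Dto());
        Dto roundTripped = (Dto) fromBase64(encoded);
        System.out.println(roundTripped.name); // prints "example"
    }
}
```

Note that in AEM the decode step still goes through ObjectInputStream, so the deserialization firewall must be configured to allow the class either way.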

How to configure multiple database-platforms in spring boot

I have a Spring Boot project with two data sources, one DB2 and one Postgres. I have configured both, but there is a problem:
The auto-detection of the database type does not work for DB2 (in any project) unless I specify the database dialect using spring.jpa.database-platform = org.hibernate.dialect.DB2390Dialect.
But how do I specify that for only one of the database connections? Or how do I specify the other one independently?
Additional info to give you more detail on my project structure: I separated the databases roughly according to this tutorial, although I do not use the ChainedTransactionManager: https://medium.com/preplaced/distributed-transaction-management-for-multiple-databases-with-springboot-jpa-and-hibernate-cde4e1b298e4
I use the same basic project structure and almost unchanged configuration files.
Ok, I found the answer myself and want to post it in case anyone else has the same question.
The answer lies in the config file for each database, i.e. the DB2Config.java file mentioned in the tutorial linked in the question.
While I'm at it, I'll incidentally also answer the question "how do I manipulate any of the spring.jpa properties for several databases independently".
In the example, the following method gets called:
@Bean
public LocalContainerEntityManagerFactoryBean db2EntityManagerFactory(
        @Qualifier("db2DataSource") DataSource db2DataSource,
        EntityManagerFactoryBuilder builder
) {
    return builder
            .dataSource(db2DataSource)
            .packages("com.preplaced.multidbconnectionconfig.model.db2")
            .persistenceUnit("db2")
            .build();
}
While configuring the datasource and the package our model lies in, we can also inject additional configuration.
After calling .packages(...), we can set a properties map that can contain everything we would normally set via spring.jpa in the application.properties file.
If we want to set the DB2390Dialect, the method could now look like this (with the possibility to easily add further configuration):
@Bean
public LocalContainerEntityManagerFactoryBean db2EntityManagerFactory(
        @Qualifier("db2DataSource") DataSource db2DataSource,
        EntityManagerFactoryBuilder builder
) {
    HashMap<String, String> propertiesMap = new HashMap<>();
    propertiesMap.put("hibernate.dialect", "org.hibernate.dialect.DB2390Dialect");
    return builder
            .dataSource(db2DataSource)
            .packages("com.preplaced.multidbconnectionconfig.model.db2")
            .properties(propertiesMap)
            .persistenceUnit("db2")
            .build();
}
Note that the key "hibernate.dialect" seems to work here (instead of "database-platform").
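For comparison, in a single-datasource application the same dialect override can live entirely in application.properties; a sketch (the spring.jpa.properties.* prefix passes keys straight through to Hibernate):

```properties
# Either let Spring set the dialect globally...
spring.jpa.database-platform=org.hibernate.dialect.DB2390Dialect
# ...or pass it straight through to Hibernate:
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.DB2390Dialect
```

With multiple datasources that global property is ambiguous, which is why the per-factory properties map above is needed.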

How to get Spring to actually use configured LocalTime serializer?

I'm trying to get Spring (Spring Boot 2.2.4) to serialize LocalTime instances as "hh:mm:ss" instead of arrays [h, m, s]. So far, I've tried:
Setting spring.jackson.serialization.WRITE_DATES_AS_TIMESTAMPS=false in application.properties
Overriding the ObjectMapper config:
@Bean
@Primary // needed? Who knows
open fun mapper(): ObjectMapper {
    val mapper = ObjectMapper()
    val javaTimeModule = JavaTimeModule()
    // attempt 1
    javaTimeModule.addSerializer(LocalTime::class.java, myCustomSerializer)
    mapper.registerModule(javaTimeModule)
    // attempt 2
    mapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS)
    return mapper
}
For attempt 1, I can see the module being registered, but if I override the serialize or serializeWithType methods, those are never called.
Attempt 2 simply has no effect.
How do I get Spring to actually call the custom ObjectMapper? Even a hint on what I could try to see why it's not being called could help.
When you define an ObjectMapper yourself, or when you add @EnableWebMvc to your configuration, Spring Boot stops configuring its own ObjectMapper. In turn this also renders the spring.jackson (and other spring.mvc, spring.multipart, etc.) properties useless, as it is now assumed that you configure things manually.
Basically you shouldn't have either of those, and then adding spring.jackson.serialization.write-dates-as-timestamps=false should be all that needs to be done.
If that doesn't work, check for @EnableWebMvc and/or a pre-configured ObjectMapper.
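If you do want to customize Jackson without knocking out Boot's auto-configuration, a commonly used alternative (a sketch, assuming Spring Boot 2.x; the bean name is illustrative) is to contribute a Jackson2ObjectMapperBuilderCustomizer instead of a full ObjectMapper bean:

```java
import org.springframework.boot.autoconfigure.jackson.Jackson2ObjectMapperBuilderCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.fasterxml.jackson.databind.SerializationFeature;

@Configuration
public class JacksonConfig {

    // Boot applies this customizer to its own auto-configured ObjectMapper,
    // so the spring.jackson.* properties keep working alongside it.
    @Bean
    public Jackson2ObjectMapperBuilderCustomizer localTimeCustomizer() {
        return builder -> builder
                .featuresToDisable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
    }
}
```

This way custom serializers can also be registered via the builder without replacing the mapper Spring MVC actually uses.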

Jackson deserializing custom classes in an OSGi environment

I have some trouble using Jackson 2.1 in an OSGi environment when deserializing a class this way:
ObjectMapper mapper = new ObjectMapper();
User user = mapper.readValue(new File("user.json"), User.class);

class User {
    public Class clazz = org.example.MyClass.class;
}
Because Jackson is in a different bundle than my custom classes that I want to deserialize, I often get a java.lang.ClassNotFoundException - usually on MyClass1 or MyClass2.
I traced it back to the class com.fasterxml.jackson.databind.util.ClassUtil, which uses Class.forName(..) to retrieve a class for deserializing. Because of the different class loaders in OSGi, it only sees the classes of the JRE and of Jackson, but not my custom classes.
Is there a simple way to make Jackson find all the required custom classes (I have dozens of them), e.g. by adding a class loader?
As the client of Jackson, you have visibility of the classes that you want to deserialize into. The trick is to pass these classes to Jackson, rather than force Jackson to use dynamic reflection to find them.
The Jackson documentation indicates that the method ObjectMapper.readValue can take a Class object as a parameter. If you use this method, then Jackson should not need to call Class.forName(). The docs give the following example:
ObjectMapper mapper = new ObjectMapper();
User user = mapper.readValue(new File("user.json"), User.class);
Here, User is the domain class, which is visible to your client but not to Jackson. This invocation should work fine in OSGi... if it does not, then I would suggest Jackson may have a bug.
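If some fields still require a reflective lookup (like the Class-valued field above), Jackson 2.4 and later also let you hand the mapper a class loader that can see your bundle's classes. A sketch, assuming your bundle's class loader is available (BundleAwareMapper is an illustrative name, not a Jackson API):

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.type.TypeFactory;

public class BundleAwareMapper {

    // Build an ObjectMapper whose type resolution goes through the given
    // class loader instead of Jackson's own, so bundle-private classes
    // such as org.example.MyClass can be found.
    public static ObjectMapper create(ClassLoader bundleClassLoader) {
        ObjectMapper mapper = new ObjectMapper();
        mapper.setTypeFactory(
                TypeFactory.defaultInstance().withClassLoader(bundleClassLoader));
        return mapper;
    }
}
```

In an OSGi component you would typically pass getClass().getClassLoader() of a class from your own bundle.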
