Spring Boot Cloud Stream with different input and output types - apache-kafka-streams

In Spring Boot Kafka Streams I have the following KStream:
Function<KStream<String,InputType>, KStream<String, OutputType>> process() {
}
So here the input is an InputType object and the output is an OutputType object. For this I want to write a custom Serde. My understanding is that a Serde's serializer and deserializer should handle the same data type. So how can I write a Serde that accepts one object and emits another? Or should I consider a custom serializer/deserializer in this case?

You need to create two custom Serde implementations, something like
InputTypeSerde implements Serde<InputType> and OutputTypeSerde implements Serde<OutputType>. Then create two separate beans for them in your application. The binder should pick them up and assign them to the proper input and output bindings.
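For example, a minimal sketch using Spring Kafka's JsonSerde, assuming InputType and OutputType are Jackson-serializable (the binder matches Serde beans to bindings by their generic type):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.support.serializer.JsonSerde;

@Configuration
public class SerdeConfig {

    // One Serde bean per payload type; the binder assigns each to the
    // binding whose declared type matches.
    @Bean
    public JsonSerde<InputType> inputTypeSerde() {
        return new JsonSerde<>(InputType.class);
    }

    @Bean
    public JsonSerde<OutputType> outputTypeSerde() {
        return new JsonSerde<>(OutputType.class);
    }
}

If you need fully custom wire formats instead of JSON, implement Serde<InputType> and Serde<OutputType> directly and expose them as beans the same way.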

Related

Facing issues while using Spring Cassandra - UDT type, not able to map UDT type to UdtValue

Could not inline literal of type AddressUDT. This happens because the driver doesn't know how to map it to a CQL type. Try passing a TypeCodec or CodecRegistry to literal(); nested exception is java.lang.IllegalArgumentException: Could not inline literal of type AddressUDT. This happens because the driver doesn't know how to map it to a CQL type. Try passing a TypeCodec or CodecRegistry to literal().
You need to define a custom converter so Cassandra knows how to map your UDT to a CQL type. Create a class that extends AbstractCassandraConfiguration and register your custom conversions there. Have a look at the Spring Data documentation on custom converters for details and examples. Cheers!
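A minimal sketch of the configuration shape, assuming a hypothetical AddressUdtWriteConverter for the AddressUDT type from the error message (the keyspace name and the converter's target representation are placeholders):

import java.util.List;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.cassandra.config.AbstractCassandraConfiguration;
import org.springframework.data.cassandra.core.convert.CassandraCustomConversions;
import org.springframework.data.convert.WritingConverter;

public class CassandraConfig extends AbstractCassandraConfiguration {

    @Override
    protected String getKeyspaceName() {
        return "my_keyspace"; // hypothetical keyspace name
    }

    @Override
    public CassandraCustomConversions customConversions() {
        // Register the converter so the mapping layer knows about AddressUDT.
        return new CassandraCustomConversions(List.of(new AddressUdtWriteConverter()));
    }

    // Hypothetical converter: how AddressUDT is turned into a CQL-friendly
    // value is up to you (a String here purely for illustration).
    @WritingConverter
    static class AddressUdtWriteConverter implements Converter<AddressUDT, String> {
        @Override
        public String convert(AddressUDT source) {
            return source.toString();
        }
    }
}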

How to read specific data from a Kafka message with Spring-Kafka @KafkaListener?

My task is to read events from multiple different topics (the payload class for all topics is "Event"). This class contains a field "data" (a Map) which carries topic-specific data that can be deserialized to a specific class (e.g. to "DeviceCreateEvent" or similar). I can create consumers for each topic with @KafkaListener on methods with parameter type "Event", but in that case I first have to call event.getData() and deserialize it into the specific class, so I get code duplication in all consumer methods. Is there any way for the annotated consumer method to receive an object already deserialized to the specific class?
It's not clear what you are asking.
If you have a different @KafkaListener for each topic/event type, and use JSON, the framework will automatically tell the message converter the type the data should be converted to; see the documentation.
Although the Serializer and Deserializer API is quite simple and flexible from the low-level Kafka Consumer and Producer perspective, you might need more flexibility at the Spring Messaging level, when using either @KafkaListener or Spring Integration. To let you easily convert to and from org.springframework.messaging.Message, Spring for Apache Kafka provides a MessageConverter abstraction with the MessagingMessageConverter implementation and its JsonMessageConverter (and subclasses) customization. You can inject the MessageConverter into a KafkaTemplate instance directly and by using the AbstractKafkaListenerContainerFactory bean definition for the @KafkaListener.containerFactory() property. The following example shows how to do so: ...
On the consumer side, you can configure a JsonMessageConverter; it can handle ConsumerRecord values of type byte[], Bytes and String so should be used in conjunction with a ByteArrayDeserializer, BytesDeserializer or StringDeserializer. (byte[] and Bytes are more efficient because they avoid an unnecessary byte[] to String conversion). You can also configure the specific subclass of JsonMessageConverter corresponding to the deserializer, if you so wish.
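A minimal sketch of the consumer side, assuming JSON payloads and a ByteArrayDeserializer on the consumer (the topic name, group id, and the DeviceCreateEvent listener are illustrative):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.support.converter.ByteArrayJsonMessageConverter;
import org.springframework.stereotype.Component;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, byte[]> kafkaListenerContainerFactory(
            ConsumerFactory<String, byte[]> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, byte[]> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        // Converts the raw byte[] record value into the listener's parameter type.
        factory.setMessageConverter(new ByteArrayJsonMessageConverter());
        return factory;
    }
}

@Component
class DeviceEventListener {

    // The framework infers DeviceCreateEvent from the method signature, so the
    // payload arrives already deserialized; no manual getData() mapping needed.
    @KafkaListener(topics = "device-create-events", groupId = "device-group")
    public void onDeviceCreate(DeviceCreateEvent event) {
        // handle the topic-specific event
    }
}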

Using a custom ObjectMapper for Spring XD JSON to Java conversion

Is there an easy way to convert a JSON payload to a Java object using a custom ObjectMapper (Jackson), or do I have to provide a custom type converter? I know that I could use a processor, but it would be nice to use the input and output types of the stream definition.
In the second case: am I even able to provide a custom type converter for application/json to Java?
The documentation states: "The customMessageConverters are added after the standard converters in the order defined. So it is generally easier to add converters for new media types than to replace existing converters."
I bet there is an existing "application/json" converter, but at first glance I could not find further information on whether it is even possible to replace existing converters.
Thanks!
Peter
If you look at streams.xml, you can see the relevant configuration. The configured lists are used to construct a CompositeMessageConverter, which visits every MessageConverter in list order until it finds one that can do the conversion and returns a non-null result. A CompositeMessageConverter instance is created for each module instance that is configured for conversion (i.e., defines an inputType or outputType value) by filtering the list of candidate message converters, which all inherit AbstractFromMessageConverter. The list is pared down to those that respond true to public boolean supportsTargetMimeType(MimeType mimeType) (where mimeType is the value of the inputType/outputType). The CompositeMessageConverter is injected into the corresponding MessageChannel and converts the payload.
There are a couple of things you can do. You can override the xd.messageConverters bean definition. For example, you can replace JsonToPojoMessageConverter and PojoToJsonMessageConverter with your own subclasses. You can also insert your own implementations in the list before the above converters and have your implementation match only specific domain objects for which you need a custom JSON mapper.
Another possibility is to define your own MIME type and provide converters for that MIME type as customMessageConverters. In any case, follow these guidelines for extending Spring XD.
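A minimal sketch of the first approach, overriding the xd.messageConverters bean definition in XML (the custom converter class names here are hypothetical placeholders for your own subclasses):

<!-- Redefine the xd.messageConverters list; order matters, since the
     CompositeMessageConverter uses the first converter that succeeds. -->
<util:list id="xd.messageConverters">
    <bean class="com.example.CustomJsonToPojoMessageConverter"/>
    <bean class="com.example.CustomPojoToJsonMessageConverter"/>
    <!-- ... the remaining standard converters ... -->
</util:list>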

Assign ArrayList from the data in properties file

This is my properties file:
REDCA_IF_00001=com.sds.redca.biz.svc.RedCAIF00001SVC
REDCA_IF_00002=com.sds.redca.biz.svc.RedCAIF00002SVC
REDCA_IF_00003=com.sds.redca.biz.svc.RedCAIF00003SVC
REDCA_IF_00004=com.sds.redca.biz.svc.RedCAIF00004SVC
and I want to load these values into a HashMap in my Spring context file.
How can I achieve this?
Does it have to be a HashMap, or would any kind of Map be fine?
You can define that as a java.util.Properties instance (Spring has great support for properties loading), which already implements Map (it actually extends Hashtable).
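For example, with the util namespace declared in the Spring context file (assuming the file above is on the classpath as redca.properties, a hypothetical name):

<util:properties id="redcaServices" location="classpath:redca.properties"/>

The redcaServices bean can then be injected wherever a Map is expected; if you specifically need a HashMap, copy the Properties into one with new HashMap<>(properties) in a factory method.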

Use JSON deserializer for Batch job execution context

I'm trying to get a list of job executions which have been stored in Spring batch related tables in the database using:
List<JobExecution> jobExecutions = jobExplorer.getJobExecutions(jobInstance);
The above method call seems to invoke the ExecutionContextRowMapper.mapRow method in the JdbcExecutionContextDao class.
The ExecutionContextRowMapper uses the com.thoughtworks.xstream.XStream.fromXML method to deserialize the JSON string of the JobExecutionContext stored in the DB.
It looks like an incorrect (or default XML) deserializer is used for unmarshalling the JSONified JobExecutionContext.
Is there any configuration to use a JSON deserializer in this scenario?
The serializer/deserializer for the ExecutionContext is configurable in 2.2.x. We use the ExecutionContextSerializer interface (providing two implementations, one using Java serialization and one using the XStream implementation you mention). To configure your own serializer, you'll need to implement org.springframework.batch.core.repository.ExecutionContextSerializer and inject it into the JobRepositoryFactoryBean (so that the contexts are serialized/deserialized correctly) and the JobExplorerFactoryBean (to deserialize the previously saved contexts).
It is important to note that changing the serialization method will prevent Spring Batch from deserializing previously saved ExecutionContexts.
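A minimal sketch of that wiring, assuming a newer Spring Batch version that ships the Jackson-based Jackson2ExecutionContextStringSerializer (on 2.2.x you would supply your own ExecutionContextSerializer implementation instead); data source and transaction manager setup are elided:

import javax.sql.DataSource;

import org.springframework.batch.core.explore.support.JobExplorerFactoryBean;
import org.springframework.batch.core.repository.ExecutionContextSerializer;
import org.springframework.batch.core.repository.dao.Jackson2ExecutionContextStringSerializer;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class BatchSerializerConfig {

    @Bean
    public ExecutionContextSerializer executionContextSerializer() {
        return new Jackson2ExecutionContextStringSerializer();
    }

    @Bean
    public JobRepositoryFactoryBean jobRepository(DataSource dataSource,
            PlatformTransactionManager transactionManager,
            ExecutionContextSerializer serializer) {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        factory.setSerializer(serializer); // write contexts as JSON
        return factory;
    }

    @Bean
    public JobExplorerFactoryBean jobExplorer(DataSource dataSource,
            ExecutionContextSerializer serializer) {
        JobExplorerFactoryBean factory = new JobExplorerFactoryBean();
        factory.setDataSource(dataSource);
        factory.setSerializer(serializer); // read saved contexts in the same format
        return factory;
    }
}

The same serializer instance goes into both factory beans so the write and read paths agree.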
