How to configure Spring Cloud Stream (Kafka) to use protobuf as the serialization format

I am using Spring Cloud Stream (Kafka) to exchange messages between producer and consumer microservices.
It exchanges data with native Java serialization. As per the Spring Cloud documentation, it supports JSON and Avro serialization.
Has anyone tried protobuf serialization (a message converter) in Spring Cloud Stream?
---------------- Added later
I wrote this MessageConverter:
public class ProtobufMessageConverter<T extends AbstractMessage> extends AbstractMessageConverter
{
    private final Parser<T> parser;

    public ProtobufMessageConverter(Parser<T> parser)
    {
        super(new MimeType("application", "protobuf"));
        this.parser = parser;
    }

    @Override
    protected boolean supports(Class<?> clazz)
    {
        // Note: hard-coded to the Equipment message type despite the generic parser.
        if (clazz != null)
        {
            return EquipmentProto.Equipment.class.isAssignableFrom(clazz);
        }
        return true;
    }

    @Override
    public Object convertFromInternal(Message<?> message, Class<?> targetClass, Object conversionHint)
    {
        if (!(message.getPayload() instanceof byte[]))
        {
            return null;
        }
        try
        {
            // return EquipmentProto.Equipment.parseFrom((byte[]) message.getPayload());
            return parser.parseFrom((byte[]) message.getPayload());
        }
        catch (Exception e)
        {
            this.logger.error(e.getMessage(), e);
        }
        return null;
    }

    @Override
    protected Object convertToInternal(Object payload, MessageHeaders headers, Object conversionHint)
    {
        return ((AbstractMessage) payload).toByteArray();
    }
}

It's really not a question of trying but rather just doing it, since converters are a natural extension mechanism (inherited from Spring Integration) in Spring Cloud Stream that exists specifically to address these concerns. So yes, you can add your own custom converter.
Also, keep in mind that with Kafka there is also the concept of native serde, so you need to make sure the two do not create a conflict.
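For example, registration might look like this (a minimal sketch assuming Spring Cloud Stream 2.x, where a MessageConverter bean annotated with @StreamMessageConverter is picked up by the binder; the Equipment parser is taken from the question's proto class):
@Configuration
public class ProtobufConverterConfiguration {

    @Bean
    @StreamMessageConverter
    public MessageConverter protobufMessageConverter() {
        // Wire in the generated parser for the Equipment message from the question.
        return new ProtobufMessageConverter<>(EquipmentProto.Equipment.parser());
    }
}
The binding would then declare the matching content type, e.g. spring.cloud.stream.bindings.output.contentType=application/protobuf (the binding name "output" is illustrative).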

Related

Receiving non-Unicode REST POST call on Spring Boot 1.5.10 with Jackson2

Spring Boot with Jackson2 assumes any JSON request is Unicode and fails to tolerate non-ASCII characters if the encoding is not Unicode.
I saw that it might behave differently using Gson instead of Jackson2, but I want to stick to Jackson2.
Jackson2 supports any Java-supported encoding, and Spring Boot can handle any of those encodings too, but when working together they assume Unicode.
Spring Boot will assume all requests are UTF-8 unless you deactivate that behaviour:
server.servlet.encoding.force-request=false
But then the method org.springframework.http.converter.json.AbstractJackson2HttpMessageConverter.read(Type, Class<?>, HttpInputMessage) doesn't have access to the request; it could read the headers from the HttpInputMessage, but it doesn't on Spring Boot 1.5.10.
It passes the InputStream to Jackson2 without an encoding specification, and Jackson2 assumes it's Unicode.
The solution would be to create an InputStreamReader with the encoding found in the headers.
It seems that's the actual behaviour in the current version of Spring Boot, but I wonder if I can somehow override the old one in Spring Boot 1.5.10.
I can extend the class MappingJackson2HttpMessageConverter, but I don't know how to make Spring Boot use the new converter instead of the default one for Jackson2.
I could mess with the classpath to override the whole AbstractJackson2HttpMessageConverter with a custom version, but I wouldn't like that, as it might break things if I build a fat JAR or a WAR, and maybe in other ways too.
I found a way to customize MappingJackson2HttpMessageConverter by registering a new bean, so I created this configuration class to register an instance of a subclass with the charset behaviour fixed, creating and using a Reader when the charset is not Unicode.
/**
 * Configuration class to override the default
 * MappingJackson2HttpMessageConverter of Spring Boot 1.5 that fails to use the
 * request charset to parse JSON POST bodies.
 */
@Configuration
public class Jackson2CharsetSupportConfig {

    private final class MappingJackson2HttpMessageConverterExtension extends MappingJackson2HttpMessageConverter {

        private MappingJackson2HttpMessageConverterExtension(ObjectMapper objectMapper) {
            super(objectMapper);
        }

        @Override
        protected Object readInternal(Class<?> clazz, HttpInputMessage inputMessage)
                throws IOException, HttpMessageNotReadableException {
            JavaType javaType = getJavaType(clazz, null);
            return readJavaType(javaType, inputMessage);
        }

        @Override
        public Object read(Type type, Class<?> contextClass, HttpInputMessage inputMessage)
                throws IOException, HttpMessageNotReadableException {
            JavaType javaType = getJavaType(type, contextClass);
            return readJavaType(javaType, inputMessage);
        }

        private Object readJavaType(JavaType javaType, HttpInputMessage inputMessage) {
            try {
                MediaType contentType = inputMessage.getHeaders().getContentType();
                Charset charset = getCharset(contentType);
                boolean unicode = charset.name().toUpperCase().startsWith("UTF-");
                if (inputMessage instanceof MappingJacksonInputMessage) {
                    Class<?> deserializationView = ((MappingJacksonInputMessage) inputMessage).getDeserializationView();
                    if (deserializationView != null) {
                        if (unicode) {
                            return this.objectMapper.readerWithView(deserializationView).forType(javaType)
                                    .readValue(inputMessage.getBody());
                        } else {
                            return this.objectMapper.readerWithView(deserializationView).forType(javaType)
                                    .readValue(asReader(inputMessage.getBody(), charset));
                        }
                    }
                }
                if (unicode) {
                    return this.objectMapper.readValue(inputMessage.getBody(), javaType);
                } else {
                    return this.objectMapper.readValue(asReader(inputMessage.getBody(), charset), javaType);
                }
            }
            catch (JsonProcessingException ex) {
                throw new HttpMessageNotReadableException("JSON parse error: " + ex.getOriginalMessage(), ex);
            }
            catch (IOException ex) {
                throw new HttpMessageNotReadableException("I/O error while reading input message", ex);
            }
        }

        protected Charset getCharset(MediaType contentType) {
            if (contentType != null && contentType.getCharset() != null) {
                return contentType.getCharset();
            }
            else {
                return StandardCharsets.UTF_8;
            }
        }

        protected Reader asReader(InputStream is, Charset charset) {
            return new InputStreamReader(is, charset);
        }
    }

    @Autowired
    private GenericWebApplicationContext webApplicationContext;

    @Bean
    public MappingJackson2HttpMessageConverter mappingJackson2HttpMessageConverter() {
        return new MappingJackson2HttpMessageConverterExtension(
                Jackson2ObjectMapperBuilder.json().applicationContext(webApplicationContext).build());
    }
}
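To exercise the fix, one could post JSON in a non-Unicode charset (a hypothetical check using Spring's RestTemplate; the URL and payload are made up):
HttpHeaders headers = new HttpHeaders();
headers.setContentType(MediaType.parseMediaType("application/json;charset=ISO-8859-1"));
// Encode a payload containing a non-ASCII character in ISO-8859-1.
byte[] body = "{\"name\":\"Año\"}".getBytes(StandardCharsets.ISO_8859_1);
new RestTemplate().postForEntity("http://localhost:8080/endpoint",
        new HttpEntity<>(body, headers), String.class);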

Creating an OpenTelemetry Context using the trace-id and span-id of a remote parent

I have a microservice which supports OpenTracing and injects trace-id and span-id into the headers. Another microservice supports OpenTelemetry. How can I create a parent span using the trace-id and span-id in the second microservice?
Thanks,
You can use the W3C Trace Context specification to achieve this. We need to send traceparent (e.g. 00-8652a752089f33e2659dff28d683a18f-7359b90f4355cfd9-01) from the producer via HTTP headers (or you can create it using the trace-id and span-id in the consumer). Then we can extract the remote context and create the span with traceparent.
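As an aside, if only the raw trace-id and span-id are available (rather than a pre-built traceparent header), a remote parent context can also be assembled directly with the OpenTelemetry API (a minimal sketch; traceId and spanId are illustrative hex values pulled from your own headers):
// Rebuild the remote parent from raw ids and make it the parent Context.
SpanContext remoteContext = SpanContext.createFromRemoteParent(
        traceId,                    // 32-hex-char trace-id from the incoming header
        spanId,                     // 16-hex-char span-id from the incoming header
        TraceFlags.getSampled(),
        TraceState.getDefault());
Context parent = Context.current().with(Span.wrap(remoteContext));
// Spans built while 'parent' is current become children of the remote span.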
This is the consumer controller. The TextMapGetter is used to map the traceparent data to the Context. ExtractModel is just a custom class.
@GetMapping(value = "/second")
public String secondTest(@RequestHeader(value = "traceparent") String traceparent) {
    try {
        Tracer tracer = openTelemetry.getTracer("cloud.events.second");
        TextMapGetter<ExtractModel> getter = new TextMapGetter<>() {
            @Override
            public String get(ExtractModel carrier, String key) {
                if (carrier.getHeaders().containsKey(key)) {
                    return carrier.getHeaders().get(key);
                }
                return null;
            }

            @Override
            public Iterable<String> keys(ExtractModel carrier) {
                return carrier.getHeaders().keySet();
            }
        };
        ExtractModel model = new ExtractModel();
        model.addHeader("traceparent", traceparent);
        Context extractedContext = openTelemetry.getPropagators().getTextMapPropagator()
                .extract(Context.current(), model, getter);
        try (Scope scope = extractedContext.makeCurrent()) {
            // Automatically use the extracted SpanContext as parent.
            Span serverSpan = tracer.spanBuilder("CloudEvents Server")
                    .setSpanKind(SpanKind.SERVER)
                    .startSpan();
            try {
                Thread.sleep(150);
            } finally {
                serverSpan.end();
            }
        }
    } catch (InterruptedException e) {
        throw new RuntimeException(e);
    }
    return "Server Received!";
}
Then, when configuring the OpenTelemetrySdk, you need to set the W3CTraceContextPropagator in the ContextPropagators.
// Use the W3C propagator (to extract the span from HTTP headers) since we use the W3C specifications
TextMapPropagator textMapPropagator = W3CTraceContextPropagator.getInstance();
OpenTelemetrySdk openTelemetrySdk = OpenTelemetrySdk.builder()
        .setTracerProvider(tracerProvider)
        .setPropagators(ContextPropagators.create(textMapPropagator))
        .buildAndRegisterGlobal();
Here is my custom ExtractModel class:
public class ExtractModel {

    private Map<String, String> headers;

    public void addHeader(String key, String value) {
        if (this.headers == null) {
            headers = new HashMap<>();
        }
        headers.put(key, value);
    }

    public Map<String, String> getHeaders() {
        return headers;
    }

    public void setHeaders(Map<String, String> headers) {
        this.headers = headers;
    }
}
You can find more details in the official documentation for manual instrumentation.
Generally, you have to propagate the span-id and trace-id if they are available in the headers. For any request your microservice receives, check whether the headers contain a span-id and trace-id. If they do, extract them and use them in your service.
If they are not present, create new ones, use them in your service, and also add them to requests that go out of your microservice, as sketched below.
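For the outgoing side, a minimal sketch (the header map and its use are illustrative; any HTTP client can carry the resulting entries):
// Inject the current context into a plain header map before calling downstream.
TextMapSetter<Map<String, String>> setter = (carrier, key, value) -> carrier.put(key, value);
Map<String, String> outboundHeaders = new HashMap<>();
openTelemetry.getPropagators().getTextMapPropagator()
        .inject(Context.current(), outboundHeaders, setter);
// outboundHeaders now holds the traceparent entry; copy it onto the outgoing request.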

Spring WS - DataHandler with swaRef still null

I used the Spring Boot web services starter to develop a SOAP-with-attachments service.
For an unknown reason, attachments aren't unmarshalled. The JAXB Unmarshaller is used, but its AttachmentUnmarshaller property is null... which is probably the reason the DataHandler unmarshalling isn't done?
In a JEE environment the AttachmentUnmarshaller is handled by JAX-WS; how do you configure it in a standalone process like Spring Boot on Tomcat?
Java version : 8_0_191
Spring boot version : 2.1
I faced a similar issue, but with marshalling.
Jaxb2Marshaller has its own implementations of AttachmentMarshaller and AttachmentUnmarshaller. But for these to work, the mtomEnabled property must be set to true. If it isn't, defaults are used, which are not instantiated.
Try setting setMtomEnabled(true) on your Jaxb2Marshaller.
This will probably solve your issue.
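For example (a minimal sketch; "my.package" is a placeholder for your generated JAXB classes):
@Bean
public Jaxb2Marshaller marshaller() {
    Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
    marshaller.setPackagesToScan("my.package"); // placeholder
    // With MTOM enabled, Jaxb2Marshaller installs its own
    // Attachment(Un)Marshaller, so swaRef DataHandlers get populated.
    marshaller.setMtomEnabled(true);
    return marshaller;
}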
For people who encounter the same issue with marshalling, it's a bit more complicated. The Jaxb2 AttachmentMarshaller is not correctly implemented per the WS-I Attachment Profile 1.0 - http://www.ws-i.org/Profiles/AttachmentsProfile-1.0.html#Example_Attachment_Description_Using_swaRef
You will then have to override the marshalling behaviour of Jaxb2Marshaller.
Notice: this solution assumes that MTOM is always disabled.
@Configuration
class SOAPConfiguration {

    @Bean
    public Jaxb2Marshaller jaxb2Marshaller() {
        Jaxb2Marshaller marshaller = new Jaxb2Marshaller() {
            @Override
            public void marshal(Object graph, Result result, @Nullable MimeContainer mimeContainer) throws XmlMappingException {
                try {
                    javax.xml.bind.Marshaller marshaller = createMarshaller();
                    if (mimeContainer != null) {
                        marshaller.setAttachmentMarshaller(
                                new SwaRefAttachmentMarshaller(mimeContainer)
                        );
                        marshaller.marshal(graph, result);
                    } else {
                        super.marshal(graph, result, null);
                    }
                } catch (JAXBException ex) {
                    throw convertJaxbException(ex);
                }
            }
        };
        marshaller.setPackagesToScan("my.package");
        marshaller.setMtomEnabled(false);
        return marshaller;
    }

    private class SwaRefAttachmentMarshaller extends AttachmentMarshaller {

        private final MimeContainer mimeContainer;

        private SwaRefAttachmentMarshaller(MimeContainer mimeContainer) {
            this.mimeContainer = mimeContainer;
        }

        @Override
        public String addMtomAttachment(DataHandler data, String elementNamespace, String elementLocalName) {
            return null;
        }

        @Override
        public String addMtomAttachment(byte[] data, int offset, int length, String mimeType, String elementNamespace, String elementLocalName) {
            return null;
        }

        @Override
        public String addSwaRefAttachment(DataHandler data) {
            String attachmentId = UUID.randomUUID().toString();
            mimeContainer.addAttachment("<" + attachmentId + ">", data);
            return "cid:" + attachmentId;
        }
    }
}

Spring Integration server with Java DSL

I am looking for an example of a Spring Integration 4.3.14 TCP server that responds to a message using the Java DSL, not XML.
The 4.3.14 requirement is set by corporate policy, which also avoids XML.
The end requirement is to receive a formatted text payload from a PLC and respond in kind. The PLC code is legacy and not at all well defined, and similar payloads can have different formats.
The easy way to deal with the input payload is to treat it as a string and handle it in Java code.
I have a basic receive working but can't work out how to send the reply. I have read a lot of examples and such, and now my mind is just confused, so a simple working example would be ideal.
Many thanks
Here you go...
@SpringBootApplication
public class So50412811Application {

    public static void main(String[] args) {
        SpringApplication.run(So50412811Application.class, args).close();
    }

    @Bean
    public TcpNetServerConnectionFactory cf() {
        return new TcpNetServerConnectionFactory(1234);
    }

    @Bean
    public TcpInboundGateway gateway() {
        TcpInboundGateway gw = new TcpInboundGateway();
        gw.setConnectionFactory(cf());
        return gw;
    }

    @Bean
    public IntegrationFlow flow() {
        // The gateway automatically sends the flow's final output back to the
        // client as the TCP reply.
        return IntegrationFlows.from(gateway())
                .transform(Transformers.objectToString())
                .<String, String>transform(String::toUpperCase)
                .get();
    }

    // client
    @Bean
    public ApplicationRunner runner() {
        return args -> {
            Socket socket = SocketFactory.getDefault().createSocket("localhost", 1234);
            socket.getOutputStream().write("foo\r\n".getBytes()); // default CRLF deserializer
            InputStream is = socket.getInputStream();
            int in = 0;
            while (in != 0x0a) {
                in = is.read();
                System.out.print((char) in);
            }
            socket.close();
        };
    }
}

Read Application Object from GemFire using Spring Data GemFire. Data stored using SpringXD's gemfire-json-server

I'm using the gemfire-json-server module in Spring XD to populate a GemFire grid with JSON representations of "Order" objects. I understand the gemfire-json-server module saves data in PDX form in GemFire. I'd like to read the contents of the GemFire grid into an "Order" object in my application. I get a ClassCastException that reads:
java.lang.ClassCastException: com.gemstone.gemfire.pdx.internal.PdxInstanceImpl cannot be cast to org.apache.geode.demo.cc.model.Order
I'm using the Spring Data GemFire libraries to read the contents of the cluster. The code snippet to read the contents of the grid follows:
public interface OrderRepository extends GemfireRepository<Order, String> {
    Order findByTransactionId(String transactionId);
}
How can I use Spring Data GemFire to convert data read from the GemFire cluster into an Order object?
Note: the data was initially stored in GemFire using Spring XD's gemfire-json-server module.
Still waiting to hear back from the GemFire PDX engineering team, specifically on Region.get(key), but, interestingly enough, if you annotate your application domain object with...
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")
public class Order ... {
    ...
}
This works!
Under the hood, I knew the GemFire JSONFormatter class (see here) used Jackson's API to un/marshal (de/serialize) JSON data to and from PDX.
However, orderRepository.findOne(ID) and ordersRegion.get(key) still do not function as I would expect. See the updated test class below for more details.
Will report back again when I have more information.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = GemFireConfiguration.class)
@SuppressWarnings("unused")
public class JsonToPdxToObjectDataAccessIntegrationTest {

    protected static final AtomicLong ID_SEQUENCE = new AtomicLong(0L);

    private Order amazon;
    private Order bestBuy;
    private Order target;
    private Order walmart;

    @Autowired
    private OrderRepository orderRepository;

    @Resource(name = "Orders")
    private com.gemstone.gemfire.cache.Region<Long, Object> orders;

    protected Order createOrder(String name) {
        return createOrder(ID_SEQUENCE.incrementAndGet(), name);
    }

    protected Order createOrder(Long id, String name) {
        return new Order(id, name);
    }

    protected <T> T fromPdx(Object pdxInstance, Class<T> toType) {
        try {
            if (pdxInstance == null) {
                return null;
            }
            else if (toType.isInstance(pdxInstance)) {
                return toType.cast(pdxInstance);
            }
            else if (pdxInstance instanceof PdxInstance) {
                return new ObjectMapper().readValue(JSONFormatter.toJSON(((PdxInstance) pdxInstance)), toType);
            }
            else {
                throw new IllegalArgumentException(String.format("Expected object of type PdxInstance; but was (%1$s)",
                        pdxInstance.getClass().getName()));
            }
        }
        catch (IOException e) {
            throw new RuntimeException(String.format("Failed to convert PDX to object of type (%1$s)", toType), e);
        }
    }

    protected void log(Object value) {
        System.out.printf("Object of Type (%1$s) has Value (%2$s)", ObjectUtils.nullSafeClassName(value), value);
    }

    protected Order put(Order order) {
        Object existingOrder = orders.putIfAbsent(order.getTransactionId(), toPdx(order));
        return (existingOrder != null ? fromPdx(existingOrder, Order.class) : order);
    }

    protected PdxInstance toPdx(Object obj) {
        try {
            return JSONFormatter.fromJSON(new ObjectMapper().writeValueAsString(obj));
        }
        catch (JsonProcessingException e) {
            throw new RuntimeException(String.format("Failed to convert object (%1$s) to JSON", obj), e);
        }
    }

    @Before
    public void setup() {
        amazon = put(createOrder("Amazon Order"));
        bestBuy = put(createOrder("BestBuy Order"));
        target = put(createOrder("Target Order"));
        walmart = put(createOrder("Wal-Mart Order"));
    }

    @Test
    public void regionGet() {
        assertThat((Order) orders.get(amazon.getTransactionId()), is(equalTo(amazon)));
    }

    @Test
    public void repositoryFindOneMethod() {
        log(orderRepository.findOne(target.getTransactionId()));
        assertThat(orderRepository.findOne(target.getTransactionId()), is(equalTo(target)));
    }

    @Test
    public void repositoryQueryMethod() {
        assertThat(orderRepository.findByTransactionId(amazon.getTransactionId()), is(equalTo(amazon)));
        assertThat(orderRepository.findByTransactionId(bestBuy.getTransactionId()), is(equalTo(bestBuy)));
        assertThat(orderRepository.findByTransactionId(target.getTransactionId()), is(equalTo(target)));
        assertThat(orderRepository.findByTransactionId(walmart.getTransactionId()), is(equalTo(walmart)));
    }

    @Region("Orders")
    @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")
    public static class Order implements PdxSerializable {

        protected static final OrderPdxSerializer pdxSerializer = new OrderPdxSerializer();

        @Id
        private Long transactionId;

        private String name;

        public Order() {
        }

        public Order(Long transactionId) {
            this.transactionId = transactionId;
        }

        public Order(Long transactionId, String name) {
            this.transactionId = transactionId;
            this.name = name;
        }

        public String getName() {
            return name;
        }

        public void setName(final String name) {
            this.name = name;
        }

        public Long getTransactionId() {
            return transactionId;
        }

        public void setTransactionId(final Long transactionId) {
            this.transactionId = transactionId;
        }

        @Override
        public void fromData(PdxReader reader) {
            Order order = (Order) pdxSerializer.fromData(Order.class, reader);
            if (order != null) {
                this.transactionId = order.getTransactionId();
                this.name = order.getName();
            }
        }

        @Override
        public void toData(PdxWriter writer) {
            pdxSerializer.toData(this, writer);
        }

        @Override
        public boolean equals(Object obj) {
            if (obj == this) {
                return true;
            }
            if (!(obj instanceof Order)) {
                return false;
            }
            Order that = (Order) obj;
            return ObjectUtils.nullSafeEquals(this.getTransactionId(), that.getTransactionId());
        }

        @Override
        public int hashCode() {
            int hashValue = 17;
            hashValue = 37 * hashValue + ObjectUtils.nullSafeHashCode(getTransactionId());
            return hashValue;
        }

        @Override
        public String toString() {
            return String.format("{ @type = %1$s, id = %2$d, name = %3$s }",
                    getClass().getName(), getTransactionId(), getName());
        }
    }

    public static class OrderPdxSerializer implements PdxSerializer {

        @Override
        public Object fromData(Class<?> type, PdxReader in) {
            if (Order.class.equals(type)) {
                return new Order(in.readLong("transactionId"), in.readString("name"));
            }
            return null;
        }

        @Override
        public boolean toData(Object obj, PdxWriter out) {
            if (obj instanceof Order) {
                Order order = (Order) obj;
                out.writeLong("transactionId", order.getTransactionId());
                out.writeString("name", order.getName());
                return true;
            }
            return false;
        }
    }

    public interface OrderRepository extends GemfireRepository<Order, Long> {
        Order findByTransactionId(Long transactionId);
    }

    @Configuration
    protected static class GemFireConfiguration {

        @Bean
        public Properties gemfireProperties() {
            Properties gemfireProperties = new Properties();
            gemfireProperties.setProperty("name", JsonToPdxToObjectDataAccessIntegrationTest.class.getSimpleName());
            gemfireProperties.setProperty("mcast-port", "0");
            gemfireProperties.setProperty("log-level", "warning");
            return gemfireProperties;
        }

        @Bean
        public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
            CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
            cacheFactoryBean.setProperties(gemfireProperties);
            //cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
            cacheFactoryBean.setPdxSerializer(new OrderPdxSerializer());
            cacheFactoryBean.setPdxReadSerialized(false);
            return cacheFactoryBean;
        }

        @Bean(name = "Orders")
        public PartitionedRegionFactoryBean ordersRegion(Cache gemfireCache) {
            PartitionedRegionFactoryBean regionFactoryBean = new PartitionedRegionFactoryBean();
            regionFactoryBean.setCache(gemfireCache);
            regionFactoryBean.setName("Orders");
            regionFactoryBean.setPersistent(false);
            return regionFactoryBean;
        }

        @Bean
        public GemfireRepositoryFactoryBean orderRepository() {
            GemfireRepositoryFactoryBean<OrderRepository, Order, Long> repositoryFactoryBean =
                    new GemfireRepositoryFactoryBean<>();
            repositoryFactoryBean.setRepositoryInterface(OrderRepository.class);
            return repositoryFactoryBean;
        }
    }
}
So, as you are aware, GemFire (and by extension, Apache Geode) stores JSON in PDX format (as a PdxInstance). This is so GemFire can interoperate with many different language-based clients (native C++/C#, web-oriented languages (JavaScript, Python, Ruby, etc.) using the Developer REST API, in addition to Java) and also so that OQL can be used to query the JSON data.
After a bit of experimentation, I am surprised GemFire is not behaving as I would expect. I created an example, self-contained test class (i.e. no Spring XD, of course) that simulates your use case... essentially storing JSON data in GemFire as PDX and then attempting to read the data back out as the Order application domain object type using the Repository abstraction, logically enough.
Given the use of the Repository abstraction and implementation from Spring Data GemFire, the infrastructure will attempt to access the application domain object based on the Repository generic type parameter (in this case "Order" from the "OrderRepository" definition).
However, the data is stored in PDX, so now what?
No matter, Spring Data GemFire provides the MappingPdxSerializer class to convert PDX instances back to application domain objects using the same "mapping meta-data" that the Repository infrastructure uses. Cool, so I plug that in...
@Bean
public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
    CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
    cacheFactoryBean.setProperties(gemfireProperties);
    cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
    cacheFactoryBean.setPdxReadSerialized(false);
    return cacheFactoryBean;
}
You will also notice I set the PDX 'read-serialized' property (cacheFactoryBean.setPdxReadSerialized(false);) to false in order to ensure data access operations return the domain object and not the PDX instance.
However, this had no effect on the query method. In fact, it had no effect on the following operations either...
orderRepository.findOne(amazonOrder.getTransactionId());
ordersRegion.get(amazonOrder.getTransactionId());
Both calls returned a PdxInstance. Note, the implementation of OrderRepository.findOne(..) is based on SimpleGemfireRepository.findOne(key), which uses GemfireTemplate.get(key), which just performs Region.get(key), and so is effectively the same as ordersRegion.get(amazonOrder.getTransactionId()). The outcome should not be a PdxInstance, especially with Region.get() and read-serialized set to false.
With the OQL query (SELECT * FROM /Orders WHERE transactionId = $1) generated from findByTransactionId(String id), the Repository infrastructure has a bit less control over what the GemFire query engine will return based on what the caller (OrderRepository) expects (based on the generic type parameter), so running OQL statements could potentially behave differently than direct Region access using get.
Next, I went on to try modifying the Order type to implement PdxSerializable, to handle the conversion during data access operations (direct Region access with get, OQL, or otherwise). This had no effect.
So, I tried to implement a custom PdxSerializer for Order objects. This had no effect either.
The only thing I can conclude at this point is that something is getting lost in translation between Order -> JSON -> PDX and then from PDX -> Order. Seemingly, GemFire needs the additional type meta-data required by PDX (something like @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")) in the JSON data that the JSONFormatter recognizes, though I am not certain it does.
Note, in my test class I used Jackson's ObjectMapper to serialize the Order to JSON and then GemFire's JSONFormatter to serialize the JSON to PDX, which I suspect Spring XD is doing similarly under the hood. In fact, Spring XD uses Spring Data GemFire and is most likely using the JSON Region Auto Proxy support. That is exactly what SDG's JSONRegionAdvice object does (see here).
Anyway, I have an inquiry out to the rest of the GemFire engineering team. There are also things that could be done in Spring Data GemFire to ensure the PDX data is converted, such as making use of the MappingPdxSerializer directly to convert the data automatically on behalf of the caller if the data is indeed of type PdxInstance. Similar to how JSON Region Auto Proxying works, you could write an AOP interceptor for the Orders Region to automatically convert PDX into an Order, as sketched below.
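A hypothetical sketch of such an interceptor (illustrative only, assuming the Region is exposed as a proxied Spring bean; this is not SDG's actual JSONRegionAdvice implementation):
// Wrap Region.get(..) and convert any PdxInstance result back into an Order,
// reusing the same JSON round-trip as the fromPdx(..) helper above.
@Aspect
@Component
public class PdxToOrderAspect {

    private final ObjectMapper objectMapper = new ObjectMapper();

    @Around("execution(* com.gemstone.gemfire.cache.Region.get(..))")
    public Object convertPdx(ProceedingJoinPoint joinPoint) throws Throwable {
        Object result = joinPoint.proceed();
        if (result instanceof PdxInstance) {
            return objectMapper.readValue(JSONFormatter.toJSON((PdxInstance) result), Order.class);
        }
        return result;
    }
}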
Though, I don't think any of this should be necessary, as GemFire should be doing the right thing in this case. Sorry I don't have a better answer right now. Let's see what I find out.
Cheers and stay tuned!
See subsequent post for test code.
