Is it possible to Serialize to XML using the same format of the XSD? - asp.net-mvc-3

I have generated a Class from an XSD using XSD2Code.
I now need to deserialize a conformant XML file for this XSD into an object of this class.
I have tried a number of XML serializers, but they seem to use their own XML format, so I am unable to externally edit an XSD-conformant XML file and then deserialize it into an object.
Is it possible to deserialize into an object while maintaining the original format, i.e. so that one can generate an XML file which is conformant to the XSD, and not the serializer's specific XML format?
Many thanks in advance.
Ed

Sorted. I have found XSD2Code does this.

Related

jackson - root element read Tree vs pojo

Hi, I want to parse JSON that I retrieve by hitting a legacy system and build a response JSON. We are using Spring Boot with a Jackson dependency. The problem I have is that almost 75% of the fields from the legacy system can be mapped directly or on the basis of simple rules (0: false, 1: true). But there are some complex rules as well; for example, based on certain conditions and the data present in some fields, we may need to map them to a nested object. To cater to this requirement, which approach should we consider?
- The POJO approach: fetch the data from the legacy target, use BeanUtils.copyProperties to populate the response bean (75% of the properties), and then apply the business transformations on this POJO based on the business logic. (Would we need two POJOs here: a. one to copy into via BeanUtils.copyProperties, and b. the final response DTO?)
- Do not use a POJO: directly parse the JSON, apply the transformations, and then create a new POJO or response DTO. (But this may not be a generic solution and would need to be done on a case-by-case basis.)
The main considerations are that the approach should be fast and generic enough to be applied like a framework. Thanks, aakash
The considerations should be as follows:
- Are the POJOs reusable?
- Is the JSON multilevel and very large?
If the answer is yes for both, then it is better to choose POJOs for a cleaner implementation. Otherwise, go with JsonObject parsing.
Hope this will help.
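For illustration, here is a minimal sketch of the two options using Jackson's ObjectMapper; the LegacyRecord and ResponseDto types, their fields, and the mapping rules are hypothetical stand-ins, not the actual legacy or response models:

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class MappingSketch {
        static final ObjectMapper MAPPER = new ObjectMapper();

        // Option 1: bind the legacy payload to a POJO, then transform it.
        static ResponseDto viaPojo(String legacyJson) throws Exception {
            LegacyRecord legacy = MAPPER.readValue(legacyJson, LegacyRecord.class);
            ResponseDto dto = new ResponseDto();
            dto.active = legacy.activeFlag == 1; // simple rule: 0 -> false, 1 -> true
            dto.name = legacy.name;
            return dto;
        }

        // Option 2: work on the JSON tree directly and build the DTO by hand.
        static ResponseDto viaTree(String legacyJson) throws Exception {
            JsonNode root = MAPPER.readTree(legacyJson);
            ResponseDto dto = new ResponseDto();
            dto.active = root.path("activeFlag").asInt() == 1;
            dto.name = root.path("name").asText();
            return dto;
        }

        // Hypothetical legacy and response types for the sketch.
        static class LegacyRecord { public int activeFlag; public String name; }
        static class ResponseDto { public boolean active; public String name; }
    }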

Can i access request parameter in jackson BeanSerializerModifier?

I am using Jersey to implement a REST API and Jackson to provide JSON support. I am trying to remove certain properties before serialization by overriding the BeanSerializerModifier.changeProperties method.
But removing properties will be based on a query parameter. Is there any way to access the query parameter in my implementation?
Use of BeanSerializerModifier itself would get complicated, as the method is only called once, when constructing the necessary JsonSerializer for the first time. As for passing query parameters, you could pass them using contextual attributes and an ObjectWriter (constructed from the ObjectMapper), but that means taking over quite a bit of the serialization automation from Jersey.
There is one mechanism that could be helpful in modifying serialization aspects without taking over the whole process: registering an ObjectWriterModifier, using ObjectWriterInjector. These are part of the Jackson JAX-RS provider, added in Jackson 2.3. Without knowing more details I don't know how easy this would be; part of the issue is that query parameters are more of an input-side thing, so there is no direct access to them from the output-processing side.
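As a rough sketch of that mechanism (assuming the Jackson JAX-RS JSON provider is on the classpath; the resource class, the exclude query parameter, and the itemFilter filter id are hypothetical), a resource method can read the query parameter itself and register a per-request ObjectWriterModifier:

    import java.io.IOException;
    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.QueryParam;
    import javax.ws.rs.core.MultivaluedMap;
    import com.fasterxml.jackson.core.JsonGenerator;
    import com.fasterxml.jackson.databind.ObjectWriter;
    import com.fasterxml.jackson.databind.ser.impl.SimpleBeanPropertyFilter;
    import com.fasterxml.jackson.databind.ser.impl.SimpleFilterProvider;
    import com.fasterxml.jackson.jaxrs.cfg.EndpointConfigBase;
    import com.fasterxml.jackson.jaxrs.cfg.ObjectWriterInjector;
    import com.fasterxml.jackson.jaxrs.cfg.ObjectWriterModifier;

    @Path("/items")
    public class ItemResource {

        @GET
        public Item get(@QueryParam("exclude") final String exclude) {
            // Register a per-request modifier; the JAX-RS provider applies it
            // when it serializes the response entity.
            ObjectWriterInjector.set(new ObjectWriterModifier() {
                @Override
                public ObjectWriter modify(EndpointConfigBase<?> endpoint,
                                           MultivaluedMap<String, Object> responseHeaders,
                                           Object valueToWrite,
                                           ObjectWriter w,
                                           JsonGenerator g) throws IOException {
                    // Drop the property named by ?exclude=... (hypothetical rule).
                    SimpleBeanPropertyFilter filter = exclude == null
                            ? SimpleBeanPropertyFilter.serializeAll()
                            : SimpleBeanPropertyFilter.serializeAllExcept(exclude);
                    return w.with(new SimpleFilterProvider().addFilter("itemFilter", filter));
                }
            });
            return new Item();
        }

        // The entity must opt in to the filter by id for this to take effect.
        @com.fasterxml.jackson.annotation.JsonFilter("itemFilter")
        public static class Item { public String name = "a"; public String secret = "b"; }
    }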

Using a custom ObjectMapper for Spring XD Json to Java Conversion

Is there an easy way to convert a JSON payload to a Java object using a custom ObjectMapper (Jackson), or do I have to provide a custom type converter? I know that I could use a processor, but somehow it would be nice to use the input and output types of the stream definition.
In the second case: Am I even able to provide a custom type converter for application/json to Java?
The documentation states: "The customMessageConverters are added after the standard converters in the order defined. So it is generally easier to add converters for new media types than to replace existing converters."
I bet that there is an existing "application/json" converter - but at first glance I could not find further information on whether it is even possible to replace existing converters.
Thanks!
Peter
If you look at streams.xml, you can see the relevant configuration. The configured lists are used to construct a CompositeMessageConverter, which visits every MessageConverter in list order until it finds one that can do the conversion and returns a non-null result. A CompositeConverter instance is created for each module instance that is configured for conversion (i.e., defines an inputType or outputType value) by filtering the list of candidate message converters, which all inherit from AbstractFromMessageConverter. The list is pared down to those that respond true to public boolean supportsTargetMimeType(MimeType mimeType) (where mimeType is the value of the input/outputType). The CompositeMessageConverter is injected into the corresponding MessageChannel and converts the payload.
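As a simplified illustration of that visiting behaviour (these are stand-in types for the sketch, not Spring XD's actual CompositeMessageConverter or MessageConverter classes):

    import java.util.List;

    // Stand-in interface for illustration only; the real converters have richer contracts.
    interface SimpleConverter {
        Object convert(Object payload, String targetMimeType); // null means "cannot convert"
    }

    class SimpleCompositeConverter implements SimpleConverter {
        private final List<SimpleConverter> delegates;

        SimpleCompositeConverter(List<SimpleConverter> delegates) {
            this.delegates = delegates;
        }

        @Override
        public Object convert(Object payload, String targetMimeType) {
            // Visit every converter in list order until one returns a non-null result.
            for (SimpleConverter delegate : delegates) {
                Object result = delegate.convert(payload, targetMimeType);
                if (result != null) {
                    return result;
                }
            }
            return null;
        }
    }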
There are a couple of things you can do. You can override the xd.messageConverters bean definition. For example, you can replace JsonToPojoMessageConverter and PojoToJsonMessageConverter with your own subclasses. You can also insert your own implementations in the list before the above converters and have your implementation match only specific domain objects for which you need a custom JSON mapper.
Another possibility is to define your own mime type and provide converters for that mime type as customMessageConverters. In any case, follow these guidelines for extending Spring XD.

Hadoop - How to switch from implementing the writable interface to use an Avro object?

I’m using Hadoop to convert JSONs into CSV files to access them with Hive.
At the moment the mapper fills its own data structure, parsing the JSONs with JSON-Smart. The reducer then reads that object back out and writes it to a file, separated by commas.
To make this faster I have already implemented the Writable interface in the data structure...
Now I want to use Avro for the data structure object to have more flexibility and performance. How could I change my classes to make them exchange an Avro object instead of a Writable?
Hadoop offers a pluggable serialization mechanism via the SerializationFactory.
By default, Hadoop uses the WritableSerialization class to handle the deserialization of classes which implement the Writable interface, but you can register custom serializers that implement the Serialization interface by setting the Hadoop configuration property io.serializations (a CSV list of classes that implement the Serialization interface).
Avro has an implementation of the Serialization interface in the AvroSerialization class - so this would be the class you configure in the io.serializations property.
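A rough sketch of that configuration step follows; the fully qualified AvroSerialization class name depends on the Avro version and on whether you use the mapred or mapreduce API, so treat it as an assumption to verify:

    import org.apache.hadoop.conf.Configuration;

    public class RegisterAvroSerialization {
        public static void main(String[] args) {
            Configuration conf = new Configuration();

            // io.serializations is a comma-separated list of Serialization
            // implementations; keep WritableSerialization and add Avro's.
            conf.setStrings("io.serializations",
                    "org.apache.hadoop.io.serializer.WritableSerialization",
                    "org.apache.avro.hadoop.io.AvroSerialization"); // assumed FQN

            System.out.println(conf.get("io.serializations"));
        }
    }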
Avro actually has a whole bunch of helper classes which help you write Map / Reduce jobs to use Avro as input / output - there are some examples in the source (Git copy).
I can't seem to find any good documentation for Avro & Map Reduce at the moment, but I'm sure there are some other good examples out there.

Hadoop: How to save Map object in configuration

Any idea how I can set a Map object in org.apache.hadoop.conf.Configuration?
Serialize your map into JSON and then put it as a string in your configuration.
There is no way to put a whole object into it, because the whole configuration will be written as an XML file.
GSON is quite good at it: http://code.google.com/p/google-gson/
Here is the tutorial about how to serialize collections: http://sites.google.com/site/gson/gson-user-guide#TOC-Collections-Examples
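A minimal sketch of that round trip with Gson and Hadoop's Configuration (the my.map.json key and the map contents are arbitrary examples):

    import java.lang.reflect.Type;
    import java.util.HashMap;
    import java.util.Map;

    import com.google.gson.Gson;
    import com.google.gson.reflect.TypeToken;
    import org.apache.hadoop.conf.Configuration;

    public class MapInConfiguration {
        public static void main(String[] args) {
            Gson gson = new Gson();
            Configuration conf = new Configuration();

            // Serialize the map to JSON and store it under an arbitrary key.
            Map<String, Integer> counts = new HashMap<String, Integer>();
            counts.put("foo", 1);
            counts.put("bar", 2);
            conf.set("my.map.json", gson.toJson(counts));

            // Later (e.g. in a mapper's setup method), read it back.
            Type mapType = new TypeToken<Map<String, Integer>>() {}.getType();
            Map<String, Integer> restored = gson.fromJson(conf.get("my.map.json"), mapType);
            System.out.println(restored);
        }
    }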
