Can the Jackson CsvMapper parse CSV with spaces in the header names into an object? - jackson-dataformat-csv

I'm trying to import a CSV file with spaces in the header row directly into POJOs using CsvMapper.schemaFor(Class<>). Naturally, the respective members cannot have spaces in their identifiers. Is there a way to tell the mapper which columns to map to which members?

Just annotate the fields in your POJO model, e.g.
@JsonProperty("Property with spaces")
private String propertyWithSpaces;
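A minimal sketch of the full round trip, assuming a single-column CSV whose header is "Property with spaces"; the Row class, sample data and getter are made up for illustration. schemaFor() picks up the @JsonProperty names as column names, and withHeader() tells the parser to read the first line as the header row:

import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvSchema;

public class CsvImport {

    public static class Row {
        // Maps the "Property with spaces" column onto a legal Java identifier.
        @JsonProperty("Property with spaces")
        private String propertyWithSpaces;

        public String getPropertyWithSpaces() { return propertyWithSpaces; }
    }

    public static void main(String[] args) throws Exception {
        CsvMapper mapper = new CsvMapper();
        // Column names come from the @JsonProperty annotations; the first line
        // of the input is treated as the header row.
        CsvSchema schema = mapper.schemaFor(Row.class).withHeader();

        String csv = "Property with spaces\nhello\n";
        MappingIterator<Row> it = mapper.readerFor(Row.class).with(schema).readValues(csv);
        while (it.hasNext()) {
            System.out.println(it.next().getPropertyWithSpaces()); // prints "hello"
        }
    }
}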

Related

How to convert a single field during elasticsearch deserialization using NEST

I have a field in Elasticsearch that is sometimes a string and sometimes a string array. The field in my .NET model is a string array. During deserialization I would like to always convert it to a string array to match my model. Would I use IElasticsearchSerializer, or is that for handling the entire source and not just a single field? Does anyone have a simple example I could try?

Mapreduce POJO mapping

I have a file in HDFS which is the output of a join of 3 tables related to sales data (sales header, item detail, tender detail). The file has columns from all three tables combined.
If there are 3 items and 1 tender, I will have 6 rows for a transaction, so there will be 6 lines in the file with the same transaction number.
I can read this in the mapper and create a DTO with all the fields. Now I want to construct the complex DTO structure out of this flattened DTO.
Is there any POJO mapping framework available for this, and will it support mapping from a plain DTO to a complex structure?
Structure
public class PlainDTO{
String tranId;
String processDate;
String itemNumber;
String itemName;
int tenderId;
.......
......
}
From that List, I need to convert to the structure below:
public class ComplexDTO{
private SlsHeader slsHeader;
private Collection<SlsItems> items;
private Collection<SlsTender> tenderDetails;
}
Conversion from the flat DTO to the complex DTO is plain Java; once you write it, it stays like that. From the complex DTO to JSON, you can use any JSON-Java library such as Jackson or Gson.
The challenge could be that once you have this one-to-many mapping (the complex DTO structure) in JSON, you need to see how Elasticsearch manages those relationships. I worked with Solr (similar to Elasticsearch); Solr has a child-documents concept.
Also, at a higher level, if your Elasticsearch client is Java based, you can go directly from the flat structure to the ES client, skipping JSON.
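A minimal sketch of that plain-Java conversion, assuming rows with the same tranId belong to one transaction; the getters on PlainDTO and the constructors/setters on ComplexDTO, SlsHeader, SlsItems and SlsTender are assumptions for illustration:

import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class DtoAssembler {

    public List<ComplexDTO> assemble(List<PlainDTO> rows) {
        // Group the flat rows by transaction id: each group is one transaction.
        Map<String, List<PlainDTO>> byTran = rows.stream()
                .collect(Collectors.groupingBy(PlainDTO::getTranId));

        List<ComplexDTO> result = new ArrayList<>();
        for (List<PlainDTO> group : byTran.values()) {
            PlainDTO first = group.get(0);
            ComplexDTO dto = new ComplexDTO();
            // Header fields repeat on every row, so take them from the first one.
            dto.setSlsHeader(new SlsHeader(first.getTranId(), first.getProcessDate()));

            // De-duplicate items and tenders, since the join repeats them per row.
            Map<String, SlsItems> items = new LinkedHashMap<>();
            Map<Integer, SlsTender> tenders = new LinkedHashMap<>();
            for (PlainDTO row : group) {
                items.putIfAbsent(row.getItemNumber(),
                        new SlsItems(row.getItemNumber(), row.getItemName()));
                tenders.putIfAbsent(row.getTenderId(), new SlsTender(row.getTenderId()));
            }
            dto.setItems(new ArrayList<>(items.values()));
            dto.setTenderDetails(new ArrayList<>(tenders.values()));
            result.add(dto);
        }
        return result;
    }
}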

Map JSON string property to ES object

I have a process that imports some data from external sources into Elasticsearch. I use C# and the NEST client.
Some of the classes have string properties that contain JSON. The same property may contain a different JSON schema depending on the source.
I want to index and analyze the JSON objects in these properties.
I tried object type mapping using [ElasticProperty(Type=FieldType.Object)] but it doesn't seem to help.
What is the right way to index and analyze these strings?
E.g. I import objects like the one below and then want to query all START events of customer 9876 that have status rejected. I then want to see how they are distributed over time (using Kibana).
var e = new Event() { id = 123, source = "test-log", input = "{type:'START',params:[{name:'customerid',value:'9876'},{name:'region',value:'EU'}]}", result = "{status:'rejected'}" };

Does RestKit have a default mapping if I omit some fields?

My JSON and my object both contain some fields with the same names, and my mapping looks like this:
[mapping addAttributeMappingsFromArray:@[ @"postId", @"fieldname1", @"fieldname2", @"fieldname3", @"fieldname4", @"fieldname5"]];
If the JSON is returned without fieldname4, that's fine; the mapping just ignores it.
But conversely, if I forget a field in the mapping, for example
[mapping addAttributeMappingsFromArray:@[ @"postId"]];
then the object gets nothing except postId, even if the JSON contains every field.
Is there some way I can tell the mapping to do a "default mapping" when the JSON and the object contain the same field names? Then I would not need to list all the field names in the mapping even though they are the same.
No. You explicitly list the keys which should be processed.
You could create a dynamic mapping which introspects on the response and the destination object and creates a mapping including all keys. RestKit doesn't do that because it's slow...

Binding multiple select where option values may contain commas in Spring 3

We are having an issue with binding a multiple select element when the option values contain commas. We have tried binding to both a String and to a List<String>, but have issues with both.
When a multiple select element is posted, the value of each selected option is passed in a separate request parameter, all with the same name. For example, if the select element name is "code", the parameters might look like this:
code=ABC
code=A,B
code=XYZ
When binding to a String, Spring will automatically join these values into a comma-separated string. That is obviously an issue if one or more of the values contains a comma.
When binding to a List<String>, things work fine when multiple options are selected. In that case, Spring creates a List with an entry for each selected option. But if only one option is selected, Spring assumes the value is a comma-separated list and will split it into multiple entries.
Is there a way to tell Spring to use a different character than a comma when binding to a String? Is there a way to tell Spring not to split a single value when binding to a List<String>? Or is there another way to deal with this?
I believe this thread is related to your issue: How to prevent parameter binding from interpreting commas in Spring 3.0.5?. This Spring issue may also be helpful: https://jira.springsource.org/browse/SPR-7963
The solution provided at https://stackoverflow.com/a/5239841/1259928, which details how to create a conversion service that uses a different string separator and wire it into the Spring config, should do the trick.
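A minimal sketch of that idea, assuming the goal is simply to stop Spring from splitting a single submitted value on commas when binding to List<String>; the converter name is made up, and it still has to be registered with the ConversionService used for data binding (e.g. through a ConversionServiceFactoryBean referenced by mvc:annotation-driven conversion-service="..."):

import java.util.Collections;
import java.util.List;
import org.springframework.core.convert.converter.Converter;

// Hypothetical converter: wraps the raw request value in a one-element list
// instead of letting the default String -> Collection conversion split on ','.
public class NoSplitStringToListConverter implements Converter<String, List<String>> {
    @Override
    public List<String> convert(String source) {
        return Collections.singletonList(source);
    }
}

With multiple selected options Spring already receives separate parameter values and builds the list from them, so the converter only affects the single-value case described above.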
