How to correctly parse Joda DateTime with SuperCSV?

I'm attempting to parse a date from CSV like this:
2016-03-01
To a Joda DateTime with SuperCSV Dozer, like this:
private static final String[] FIELD_MAPPING = new String[] {"date"};
final CellProcessor[] processors = new CellProcessor[] {
new ParseDateTime(DateTimeFormat.forPattern("YYYY-MM-DD"))
};
CsvDozerBeanReader beanReader = new CsvDozerBeanReader(
new FileReader("/path/to.csv"), CsvPreference.STANDARD_PREFERENCE);
beanReader.configureBeanMapping(MyDateHoldingBean.class, FIELD_MAPPING);
MyDateHoldingBean bean = beanReader.read(MyDateHoldingBean.class, processors);
The DateTime returned is the current date & time, not a representation of the date read from CSV.
Am I doing it wrong?

You're missing a step: you need to configure the Dozer mapping. Note also that the format pattern should be yyyy-MM-dd (in Joda-Time, DD means day-of-year, not day-of-month). Currently the mapping must be configured with a DozerBeanMapper:
final CellProcessor[] processors = new CellProcessor[] {
new ParseDateTime(DateTimeFormat.forPattern("yyyy-MM-dd"))
};
DozerBeanMapper mapper = new DozerBeanMapper();
mapper.addMapping(new FileInputStream("/path/to/dozer.xml"));
CsvDozerBeanReader beanReader = new CsvDozerBeanReader(new FileReader("/path/to.csv"),
CsvPreference.STANDARD_PREFERENCE, mapper);
MyDateHoldingBean bean = beanReader.read(MyDateHoldingBean.class, processors);
where dozer.xml looks like:
<?xml version="1.0" encoding="UTF-8"?>
<mappings xmlns="http://dozer.sourceforge.net"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://dozer.sourceforge.net
http://dozer.sourceforge.net/schema/beanmapping.xsd">
<mapping>
<class-a>org.supercsv.io.dozer.CsvDozerBeanData</class-a>
<class-b>your.package.MyDateHoldingBean</class-b>
<field copy-by-reference="true">
<a>columns[0]</a>
<b>date</b>
</field>
</mapping>
</mappings>
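For reference, a minimal sketch of what MyDateHoldingBean could look like (hypothetical: the only requirement is a date property of type DateTime with a setter Dozer can call):
import org.joda.time.DateTime;

public class MyDateHoldingBean {
    private DateTime date;

    public DateTime getDate() {
        return date;
    }

    public void setDate(DateTime date) {
        this.date = date;
    }
}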

Related

How to add a file in Solr?

I use Apache Solr so that I can work with files. I can add regular text fields via Spring, but I don't know how to add TXT/PDF files.
@SolrDocument(solrCoreName = "accounting")
public class Accounting {
    @Id
    @Field
    private String id;
    @Field
    private File txtFile;
    @Field
    private String docType;
    @Field
    private String docTitle;

    public Accounting() {
    }

    public Accounting(String id, String docType, String docTitle) {
        this.id = id;
        this.docTitle = docTitle;
        this.docType = docType;
    }
}
The problem is with the txtFile field.
<field name="docTitle" type="strings"/>
<field name="docType" type="strings"/>
These are the fields I manually added to schema.xml. I can't figure out how to add a field here that will be responsible for the file (for example a TXT file). How do I do that? Thank you very much. And is declaring the field private File txtFile; in the entity the correct way to represent the file?
Solr will not store the actual file anywhere. Depending on your config it can store the binary content, though. Via the extract request handler, Apache Solr relies on Apache Tika to extract the content from the document.
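If you want Solr itself to do the extraction, a sketch of posting a file to the extract handler via SolrJ could look like the following (hypothetical core name and literal field; assumes the /update/extract handler is enabled in solrconfig.xml):
import java.io.File;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.request.AbstractUpdateRequest;
import org.apache.solr.client.solrj.request.ContentStreamUpdateRequest;

public class ExtractDemo {
    public static void main(String[] args) throws Exception {
        SolrClient solr = new HttpSolrClient.Builder("http://localhost:8983/solr/accounting").build();
        // Send the file to the extract handler; Tika runs on the Solr side
        ContentStreamUpdateRequest req = new ContentStreamUpdateRequest("/update/extract");
        req.addFile(new File("document.txt"), "text/plain");
        req.setParam("literal.id", "doc-1"); // set the unique key explicitly
        req.setAction(AbstractUpdateRequest.ACTION.COMMIT, true, true);
        solr.request(req);
    }
}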
You can try something like the code below. It does not use anything from Spring Boot. Here the content is read from the PDF document and the data is then indexed into Solr along with an id and the filename. I have used the Tika APIs to extract the content of the PDF.
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;
import org.apache.tika.exception.TikaException;
import org.apache.tika.metadata.Metadata;
import org.apache.tika.parser.ParseContext;
import org.apache.tika.parser.pdf.PDFParser;
import org.apache.tika.sax.BodyContentHandler;
import org.xml.sax.SAXException;

public static void main(final String[] args) throws IOException, TikaException, SAXException {
    String urlString = "http://localhost:8983/solr/TestCore1";
    SolrClient solr = new HttpSolrClient.Builder(urlString).build();
    BodyContentHandler handler = new BodyContentHandler();
    Metadata metadata = new Metadata();
    File file = new File("C://Users//abhijitb//Desktop//TestDocument.pdf");
    FileInputStream inputstream = new FileInputStream(file);
    ParseContext pcontext = new ParseContext();
    // parsing the document using the PDF parser
    PDFParser pdfparser = new PDFParser();
    pdfparser.parse(inputstream, handler, metadata, pcontext);
    // getting the content of the document
    //System.out.println("Contents of the PDF :" + handler.toString());
    try {
        String fileName = file.getName();
        SolrInputDocument document = new SolrInputDocument();
        document.addField("id", "123456");
        document.addField("title", fileName);
        document.addField("text", handler.toString());
        solr.add(document);
        solr.commit();
    } catch (SolrServerException | IOException e) {
        e.printStackTrace();
    }
}
Once you index the data, it can be verified on the Solr admin page by querying for it.
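As a sketch, the same verification can also be done from code with SolrJ, reusing the solr client and field names from the snippet above:
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.response.QueryResponse;

// Query the document back by the title field we indexed above
SolrQuery query = new SolrQuery("title:\"TestDocument.pdf\"");
QueryResponse response = solr.query(query);
response.getResults().forEach(doc ->
        System.out.println(doc.getFieldValue("id") + " -> " + doc.getFieldValue("title")));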

Ehcache jsr107:defaults not applying to programmatically created caches

Based on my findings in my previous SO question, I'm trying to set up JCaches with a mix of declarative and imperative configuration, so that the max size of caches is limited declaratively.
I keep a list of the caches and the duration (TTL) for their entries in my application.yaml, which I get with a property loader. I then create my caches with the code below:
@Bean
public List<javax.cache.Cache<Object, Object>> getCaches() {
    javax.cache.CacheManager cacheManager = this.getCacheManager();
    List<Cache<Object, Object>> caches = new ArrayList<>();
    Map<String, String> cacheconfigs = //I populate this with a list of cache names and durations;
    for (Map.Entry<String, String> entry : cacheconfigs.entrySet()) {
        String key = entry.getKey();
        String durationMinutes = entry.getValue();
        caches.add(new GenericDefaultCacheConfigurator.GenericDefaultCacheConfig(key,
                new Duration(TimeUnit.MINUTES, Long.valueOf(durationMinutes))).getCache(cacheManager));
    }
    return caches;
}
@Bean
public CacheManager getCacheManager() {
return Caching.getCachingProvider().getCacheManager();
}
private class GenericDefaultCacheConfig {
    private final String CACHE_ID;
    private final Duration DURATION;
    private final Factory<? extends ExpiryPolicy> EXPIRY_POLICY;

    // The two-arg constructor delegates with a CreatedExpiryPolicy; this is an
    // assumption made to complete the truncated snippet - adjust if your
    // original code used a different expiry policy factory
    public GenericDefaultCacheConfig(String cacheName, Duration duration) {
        this(cacheName, duration, CreatedExpiryPolicy.factoryOf(duration));
    }

    public GenericDefaultCacheConfig(String id, Duration duration, Factory<? extends ExpiryPolicy> expiryPolicyFactory) {
        CACHE_ID = id;
        DURATION = duration;
        EXPIRY_POLICY = expiryPolicyFactory;
    }

    private MutableConfiguration<Object, Object> getCacheConfiguration() {
        return new MutableConfiguration<Object, Object>()
                .setTypes(Object.class, Object.class)
                .setStoreByValue(true)
                .setExpiryPolicyFactory(EXPIRY_POLICY);
    }

    public Cache<Object, Object> getCache(CacheManager cacheManager) {
        Cache<Object, Object> cache = cacheManager.getCache(CACHE_ID, Object.class, Object.class);
        if (cache == null) {
            cache = cacheManager.createCache(CACHE_ID, getCacheConfiguration());
        }
        return cache;
    }
}
I try limiting the cache size with the following ehcache.xml:
<config
xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'
xmlns='http://www.ehcache.org/v3'
xmlns:jsr107='http://www.ehcache.org/v3/jsr107'
xsi:schemaLocation="
http://www.ehcache.org/v3 http://www.ehcache.org/schema/ehcache-core-3.0.xsd
http://www.ehcache.org/v3/jsr107 http://www.ehcache.org/schema/ehcache-107-ext-3.0.xsd">
<service>
<jsr107:defaults default-template="heap-cache" enable-management="true" enable-statistics="true">
</jsr107:defaults>
</service>
<cache-template name="heap-cache">
<resources>
<heap unit="entries">20</heap>
</resources>
</cache-template>
</config>
I set the following declaration in my application.yaml:
spring:
cache:
jcache:
config: classpath:ehcache.xml
However, my caches don't honor the imposed limit. I validate with the following test:
@Test
public void testGetCacheMaxSize() {
    Cache<Object, Object> cache = getCache(MY_CACHE); //I get a cache of type Eh107Cache[myCache]
    CacheRuntimeConfiguration<Object, Object> ehcacheConfig = (CacheRuntimeConfiguration<Object, Object>) cache
            .getConfiguration(Eh107Configuration.class).unwrap(CacheRuntimeConfiguration.class);
    long size = ehcacheConfig.getResourcePools().getPoolForResource(ResourceType.Core.HEAP).getSize(); //Returns 9223372036854775807 instead of the expected 20
    for (int i = 0; i < 30; i++) {
        commonDataService.getAllStates("ENTRY_" + i);
    }
    Map<Object, Object> cachedElements = cacheManagerService.getCachedElements(MY_CACHE);
    assertEquals(20, cachedElements.size()); //size() returns 30
}
Can somebody point out what I am doing wrong? Thanks in advance.
The issue comes from getting the cache manager as:
Caching.getCachingProvider().getCacheManager();
By setting the config file URI when initializing the cache manager, I got it to work:
cachingProvider = Caching.getCachingProvider();
configFileURI = resourceLoader.getResource(configFilePath).getURI();
cacheManager = cachingProvider.getCacheManager(configFileURI, cachingProvider.getDefaultClassLoader());
I was under the expectation that Spring Boot would automatically create the cache manager based on the configuration file given in the property spring.cache.jcache.config, but that was not the case, because I was getting the cache manager as described above instead of simply auto-wiring it and letting Spring create it.
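Put together, a minimal sketch of the corrected bean (hypothetical wiring: assumes an injectable Spring ResourceLoader and that the file lives at classpath:ehcache.xml, matching the spring.cache.jcache.config property):
@Bean
public javax.cache.CacheManager getCacheManager(ResourceLoader resourceLoader) throws IOException {
    javax.cache.spi.CachingProvider cachingProvider = Caching.getCachingProvider();
    // Initialize the manager from the same ehcache.xml the Spring property points at,
    // so the jsr107:defaults template is actually applied to created caches
    URI configFileURI = resourceLoader.getResource("classpath:ehcache.xml").getURI();
    return cachingProvider.getCacheManager(configFileURI, cachingProvider.getDefaultClassLoader());
}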

Deserializing DynamoDBResults with gson fails

I have a specific use case where I store the results from one DynamoDB table in serialized form in another DynamoDB table. Now when I use Gson to deserialize the retrieved data, I get this error:
java.lang.RuntimeException: Unable to invoke no-args constructor for class java.nio.ByteBuffer. Register an InstanceCreator with Gson for this type may fix this problem.
at com.google.gson.internal.ConstructorConstructor$12.construct(ConstructorConstructor.java:210)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:186)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:103)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:196)
at com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper.read(TypeAdapterRuntimeTypeWrapper.java:40)
at com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter.read(MapTypeAdapterFactory.java:187)
at com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter.read(MapTypeAdapterFactory.java:145)
at com.google.gson.Gson.fromJson(Gson.java:810)
at com.google.gson.Gson.fromJson(Gson.java:775)
My method looks like this:
public void store(MyCustomObject obj) {
String primaryKey = obj.getKey();
List<Map<String, AttributeValue>> results = AmazonDynamoDB.query(...).getItems();
Gson gson = new Gson();
List<String> records = results.stream()
        .map(mappedResult -> gson.toJson(mappedResult))
        .collect(Collectors.toList());
Map<String, AttributeValue> attributeMap = transformToAttributeMap(records);
PutItemRequest putItemRequest = new PutItemRequest().withItem(attributeMap);
AmazonDynamoDB.putItem(...);
}
The method to retrieve the records looks something like this:
public void retrieve(String id) {
QueryRequest...
Map<String, AttributeValue> records = DynamoDB.query(...).getItems();
List<String> serializedRecords = new ArrayList<>();
List<AttributeValue> values = records.get("key");
for (AttributeValue attributeValue : values) {
serializedRecords.add(attributeValue.getS());
}
Gson gson = new Gson();
Type recordType = new TypeToken<Map<String, AttributeValue>>() { }.getType();
List<Map<String, AttributeValue>> actualRecords = serializedRecords.stream()
        .map(record -> gson.fromJson(record, recordType))
        .collect(Collectors.toList());
}
What am I doing wrong?
The problem is that the AttributeValue class has a java.nio.ByteBuffer field named b. Gson tries to deserialize data into it, but ByteBuffer has no default constructor, so Gson cannot deserialize the b field.
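One option, suggested by the error message itself, is to register an InstanceCreator for ByteBuffer so Gson can construct instances. A minimal sketch (with a caveat: Gson still populates the buffer's internal fields reflectively, so binary attributes may not round-trip faithfully):
import java.nio.ByteBuffer;

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.InstanceCreator;

Gson gson = new GsonBuilder()
        // Tell Gson how to instantiate ByteBuffer, which has no no-args constructor
        .registerTypeAdapter(ByteBuffer.class,
                (InstanceCreator<ByteBuffer>) type -> ByteBuffer.allocate(0))
        .create();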
An alternative solution is the newer DynamoDB Document API of the AWS SDK. The following example should work:
AmazonDynamoDBClient client = new AmazonDynamoDBClient(
new ProfileCredentialsProvider());
Item item = new DynamoDB(client).getTable("user").getItem("Id", "user1");
String json = item.toJSON();
Item deserialized = Item.fromJSON(json);
You should modify the credentials provider according to your setup.
Not exactly the best workaround/answer, but I was able to do this:
Item item = new Item().withJSON("document", jsonStr);
Map<String,AttributeValue> attributes = InternalUtils.toAttributeValues(item);
return attributes.get("document").getM();

How to convert JSON to LinkedHashMap<String, List<String>> with Gson?

I'm new to Gson and I wonder how to convert JSON data to a LinkedHashMap<String, List<String>>.
My JSON data is shown below:
{ "data":
{
"data1": ["asdf", "qwer"],
"data2": ["xczv", "aweqrfds123", "sfdgq234"],
"data3": ["dsafasd", "xcvr123", "sdfa324123"]
}
}
The field names inside data are dynamic, so I want to convert the JSON under data to a LinkedHashMap<String, List<String>>.
How can I do that?
You can use a TypeToken to convert it into the expected type with Gson#fromJson(Reader, Type).
As per the JSON string, the full structure is LinkedHashMap<String, LinkedHashMap<String, ArrayList<String>>>.
Sample code:
BufferedReader reader = new BufferedReader(new FileReader(new File("json.txt")));
Type type = new TypeToken<LinkedHashMap<String,LinkedHashMap<String,ArrayList<String>>>>() {}.getType();
LinkedHashMap<String,LinkedHashMap<String,ArrayList<String>>> data = new Gson().fromJson(reader, type);
LinkedHashMap<String,ArrayList<String>> innerMap = data.get("data");
System.out.println(new GsonBuilder().setPrettyPrinting().create().toJson(innerMap));
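If you only want the inner map in exactly the type the question asks for, a small variation (assuming Gson 2.8.6+ for JsonParser.parseReader and the same json.txt file) is:
// Bind only the object under "data" to LinkedHashMap<String, List<String>>
Type innerType = new TypeToken<LinkedHashMap<String, List<String>>>() {}.getType();
JsonObject root = JsonParser.parseReader(new FileReader("json.txt")).getAsJsonObject();
LinkedHashMap<String, List<String>> data = new Gson().fromJson(root.get("data"), innerType);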
This is not how it works in the Gson world: you can't convert JSON to an arbitrary Java class unless you want to do all of that manually. The common approach works as described below:
Create a Java class, which matches your JSON format, e.g. you can use a Java class generator described here: http://jsongen.byingtondesign.com/
Use GsonBuilder to read your JSON from a file and map it onto the generated class
I've used that approach and the Java file that has been generated (after I've fixed a minor syntax error in your initial JSON) looks like this:
package com.json;
import java.util.List;
public class Data{
private List data1;
private List data2;
private List data3;
public List getData1(){
return this.data1;
}
public void setData1(List data1){
this.data1 = data1;
}
public List getData2(){
return this.data2;
}
public void setData2(List data2){
this.data2 = data2;
}
public List getData3(){
return this.data3;
}
public void setData3(List data3){
this.data3 = data3;
}
}
To start working with the newly created class you can use the template below:
Reader is = new InputStreamReader(new FileInputStream(new File("<path-to-json>")), "UTF-8");
Gson gson = new GsonBuilder().create();
Data d = gson.fromJson(is, Data.class);
// Start using your d instance here

EclipseLink MOXy unmarshal and getValueByXPath gives null

I use the code below to unmarshal XML and then query the unmarshalled object by XPath.
I am able to get the object after unmarshalling, but the XPath query returns null.
Do I need to specify any NameSpaceResolver?
Please let me know if you are looking for any further information.
My code:
JAXBContext jaxbContext = (JAXBContext) JAXBContextFactory.createContext(new Class[] {Transaction.class}, null);
Unmarshaller unmarshaller = jaxbContext.createUnmarshaller();
StreamSource streamSource= new StreamSource(new StringReader(transactionXML));
transaction = unmarshaller.unmarshal(streamSource, Transaction.class).getValue();
String displayValue = jaxbContext.getValueByXPath(transaction, xPath, null, String.class);
My XML:
<Transaction xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" >
<SendingCustomer firstName="test">
</SendingCustomer>
</Transaction>
Since there are no namespaces in your example you do not need to worry about leveraging a NamespaceResolver. You didn't provide the XPath that you were having trouble with, so I have just picked one in the example below.
JAVA MODEL
Transaction
import javax.xml.bind.annotation.*;
@XmlRootElement(name="Transaction")
public class Transaction {
    @XmlElement(name="SendingCustomer")
    private Customer sendingCustomer;
}
Customer
import javax.xml.bind.annotation.XmlAttribute;
public class Customer {
    @XmlAttribute
    private String firstName;
    @XmlAttribute
    private String lastNameDecrypted;
    @XmlAttribute(name="OnWUTrustList")
    private boolean onWUTrustList;
    @XmlAttribute(name="WUTrustListType")
    private String wuTrustListType;
}
DEMO CODE
input.xml
<Transaction xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<SendingCustomer firstName="test" lastNameDecrypted="SMITH"
OnWUTrustList="false" WUTrustListType="NONE">
</SendingCustomer>
</Transaction>
Demo
import javax.xml.bind.Unmarshaller;
import javax.xml.transform.stream.StreamSource;
import org.eclipse.persistence.jaxb.JAXBContext;
import org.eclipse.persistence.jaxb.JAXBContextFactory;
public class Demo {
public static void main(String[] args) throws Exception {
JAXBContext jaxbContext = (JAXBContext) JAXBContextFactory.createContext(new Class[] {Transaction.class}, null);
Unmarshaller unmarshaller = jaxbContext.createUnmarshaller();
StreamSource streamSource= new StreamSource("src/forum17687460/input.xml");
Transaction transaction = unmarshaller.unmarshal(streamSource, Transaction.class).getValue();
String displayValue = jaxbContext.getValueByXPath(transaction, "SendingCustomer/@firstName", null, String.class);
System.out.println(displayValue);
}
}
Output
test
