Elasticsearch Spring Boot ZonedDateTime converter not mapping correctly

I know a similar question has been asked before, like this one, but I think my problem is a bit different. I have a job board app that uses MongoDB as its primary database; when data enters Mongo it is automatically synced to my Elasticsearch cluster. All my date fields are ZonedDateTime because even MongoDB has no built-in converter for ZonedDateTime, so I use the converter below and store each date field as an object with a dateTime, a zoneId, and an offset. Below is how it looks in the database.
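A stored value has roughly this shape (the values here are illustrative, not copied from my data; the dateTime is written as a BSON date by the codec below):

"myDateField": {
    "dateTime": ISODate("2020-08-08T14:32:22.094Z"),
    "zone": "Africa/Nairobi",
    "offset": "+03:00"
}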
And the code responsible for that conversion is as follows:
My Writing converter
@WritingConverter
public class ZonedDateTimeToMongoDocumentConverter implements Converter<ZonedDateTime, Document> {
    static final String DATE_TIME = "dateTime";
    static final String ZONE = "zone";

    @Override
    public Document convert(@Nullable ZonedDateTime zonedDateTime) {
        if (zonedDateTime == null) return null;
        Document document = new Document();
        // Stored as epoch milliseconds, truncated to second precision.
        document.put(DATE_TIME, zonedDateTime.toInstant().getEpochSecond() * 1_000);
        document.put(ZONE, zonedDateTime.getZone().getId());
        document.put("offset", zonedDateTime.getOffset().toString());
        return document;
    }
}
And my reading converter
@ReadingConverter
public class MongoDocumentToZonedDateTimeConverter implements Converter<Document, ZonedDateTime> {
    static final String DATE_TIME = "dateTime";
    static final String ZONE = "zone";

    @Override
    public ZonedDateTime convert(@Nullable Document document) {
        if (document == null) return null;
        Date dateTime = document.getDate(DATE_TIME);
        String zoneId = document.getString(ZONE);
        ZoneId zone = ZoneId.of(zoneId);
        return ZonedDateTime.ofInstant(dateTime.toInstant(), zone);
    }
}
And my Codec
public class ZonedDateTimeCodec implements Codec<ZonedDateTime> {
    public static final String DATE_TIME = "dateTime";
    public static final String ZONE = "zone";

    @Override
    public void encode(final BsonWriter writer, final ZonedDateTime value, final EncoderContext encoderContext) {
        writer.writeStartDocument();
        // Second precision; the sub-second part of the instant is dropped.
        writer.writeDateTime(DATE_TIME, value.toInstant().getEpochSecond() * 1_000);
        writer.writeString(ZONE, value.getZone().getId());
        writer.writeEndDocument();
    }

    @Override
    public ZonedDateTime decode(final BsonReader reader, final DecoderContext decoderContext) {
        reader.readStartDocument();
        long epochMillis = reader.readDateTime(DATE_TIME); // readDateTime returns epoch milliseconds
        String zoneId = reader.readString(ZONE);
        reader.readEndDocument();
        return ZonedDateTime.ofInstant(Instant.ofEpochSecond(epochMillis / 1_000), ZoneId.of(zoneId));
    }

    @Override
    public Class<ZonedDateTime> getEncoderClass() {
        return ZonedDateTime.class;
    }
}
This code works great. The issue, though, is when the data enters Elasticsearch: because of the shape of my date fields, I had to store them as objects there as well, with the following mapping:
{
  "myDateField": {
    "type": "object",
    "properties": {
      "dateTime": {
        "type": "date"
      },
      "zone": {
        "type": "keyword"
      },
      "offSet": {
        "type": "keyword"
      }
    }
  }
}
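So a document indexed under this mapping is expected to carry the whole object, along these lines (illustrative values):

{
  "myDateField": {
    "dateTime": "2020-08-08T14:32:22.094Z",
    "zone": "Africa/Nairobi",
    "offSet": "+03:00"
  }
}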
PS: I'm new to Elasticsearch; this is in fact my first attempt at it.
Then I created my Spring ZonedDateTime Elasticsearch converters as follows.
Reading converter
@ReadingConverter
@RequiredArgsConstructor
public class StringToZonedDateTimeConverter implements Converter<String, ZonedDateTime> {
    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public ZonedDateTime convert(String source) {
        ZoneDateTime zoneDateTime = objectMapper.readValue(source, ZoneDateTime.class);
        return ZonedDateTime.ofInstant(zoneDateTime.getDateTime().toInstant(), ZoneId.of(zoneDateTime.getZone()));
    }
}
Writing converter
@WritingConverter
@RequiredArgsConstructor
public class ZonedDateTimeToStringConverter implements Converter<ZonedDateTime, String> {
    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public String convert(ZonedDateTime zonedDateTime) {
        return objectMapper.writeValueAsString(ZoneDateTime.builder()
                .dateTime(Date.from(zonedDateTime.toInstant()))
                .zone(zonedDateTime.getZone().getId())
                .offSet(zonedDateTime.getOffset().toString())
                .build());
    }
}
And my ZoneDateTime class looks like
@SuperBuilder(toBuilder = true)
@RequiredArgsConstructor
@Getter
public class ZoneDateTime {
    private final Date dateTime;
    private final String offSet;
    private final String zone;
}
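For reference, converters like these are typically registered through Spring Data Elasticsearch's elasticsearchCustomConversions() hook; a minimal sketch of that wiring (not my exact config; the class name and client setup are illustrative):

@Configuration
public class ElasticsearchConfig extends AbstractElasticsearchConfiguration {

    @Override
    public RestHighLevelClient elasticsearchClient() {
        // Illustrative local client; the real config points at the actual cluster.
        return RestClients.create(ClientConfiguration.localhost()).rest();
    }

    @Override
    public ElasticsearchCustomConversions elasticsearchCustomConversions() {
        ObjectMapper objectMapper = new ObjectMapper();
        return new ElasticsearchCustomConversions(Arrays.asList(
                new StringToZonedDateTimeConverter(objectMapper),
                new ZonedDateTimeToStringConverter(objectMapper)));
    }
}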
From my understanding this should work, but Spring gives me this error:
org.springframework.core.convert.ConversionFailedException: Failed to convert from type [java.lang.String] to type [java.time.ZonedDateTime] for value '2020-08-08T14:32:22.094Z'; nested exception is com.fasterxml.jackson.core.JsonParseException: Unexpected character ('-' (code 45)): Expected space separating root-level values
at [Source: (String)"2020-08-08T14:32:22.094Z"; line: 1, column: 6]
And I'm really not sure why Elasticsearch only sends back the dateTime field from the date-field object, leaving out the zone and offset; Elasticsearch ignores the fact that my date field is an object. How can I fix this?

Related

How to format and store a date in Elasticsearch

I am trying to store a date value in Elasticsearch. Below is my code.
pom.xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
Model class
@Document(indexName = "employee", createIndex = true, shards = 4)
public class Employee {
    @Nullable
    @Field(type = FieldType.Date, pattern = "yyyy-MM-dd", format = DateFormat.date)
    private LocalDate joinedDate;
}
Elasticsearch index properties
"mappings": {
  "employee": {
    "properties": {
      "joinedDate": {
        "format": "date",
        "type": "date"
      }
    }
  }
}
My Configuration file
@Configuration
@EnableElasticsearchRepositories("com.sample.dao")
public class ElasticSearchClientBuilder extends AbstractElasticsearchConfiguration {
    Logger logger = LoggerFactory.getLogger(ElasticSearchClientBuilder.class);

    @Override
    @Bean
    public RestHighLevelClient elasticsearchClient() {
        // Configuration for RestHighLevelClient
    }
}
The error I am getting for the above setup:
Caused by: org.elasticsearch.client.ResponseException: method [PUT], host [https://ausdlcceesdb01.us.dell.com:9200], URI [/employee/employee/a77055df-2a79-4d8d-8911-315003bfed28?timeout=1m], status line [HTTP/1.1 400 Bad Request]
Warnings: [299 Elasticsearch-7.6.2-ef48eb35cf30adf4db14086e8aabd07ef6fb113f "[types removal] Specifying types in document index requests is deprecated, use the typeless endpoints instead (/{index}/_doc/{id}, /{index}/_doc, or /{index}/_create/{id})."]
{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"failed to parse field [joinedDate] of type [date] in document with id 'a77055df-2a79-4d8d-8911-315003bfed28'. Preview of field's value: '{dayOfWeek=THURSDAY, month=JANUARY, year=2022, dayOfMonth=6, era=CE, dayOfYear=6, monthValue=1, chronology={calendarType=iso8601, id=ISO}, leapYear=false}'"}],
"type":"mapper_parsing_exception","reason":"failed to parse field [joinedDate] of type [date] in document with id 'a77055df-2a79-4d8d-8911-315003bfed28'. Preview of field's value: '{dayOfWeek=THURSDAY, month=JANUARY, year=2022, dayOfMonth=6, era=CE, dayOfYear=6, monthValue=1, chronology={calendarType=iso8601, id=ISO}, leapYear=false}'","caused_by":{"type":"illegal_state_exception","reason":"Can't get text on a START_OBJECT at 1:83"}},
"status":400}
Without the pattern in the model, my date is stored as the following fields in the index:
joinedDate.year
joinedDate.month
joinedDate.dayOfMonth
joinedDate.dayOfWeek
joinedDate.era
joinedDate.dayOfYear
joinedDate.monthValue
joinedDate.chronology
joinedDate.leapYear
How can I store yyyy-MM-dd in the index?
I think you will need something like this:
@JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSS")
@JsonSerialize(using = CustomLocalDateTimeSerializer.class)
@JsonDeserialize(using = CustomLocalDateTimeDeserializer.class)
private LocalDateTime createDate;
CustomLocalDateTimeSerializer class:
public class CustomLocalDateTimeSerializer extends StdSerializer<LocalDateTime> {

    public CustomLocalDateTimeSerializer() {
        this(null);
    }

    private CustomLocalDateTimeSerializer(Class<LocalDateTime> t) {
        super(t);
    }

    @Override
    public void serialize(LocalDateTime value, JsonGenerator gen, SerializerProvider provider) throws IOException {
        gen.writeString(value.format(DateTimeFormatter.ofPattern(Constants.DATE_FORMAT_SIMPLE)));
    }
}
CustomLocalDateTimeDeserializer class:
public class CustomLocalDateTimeDeserializer extends StdDeserializer<LocalDateTime> {

    public CustomLocalDateTimeDeserializer() {
        this(null);
    }

    private CustomLocalDateTimeDeserializer(Class<LocalDateTime> t) {
        super(t);
    }

    @Override
    public LocalDateTime deserialize(JsonParser jsonParser, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        String date = jsonParser.getText();
        try {
            return LocalDateTime.parse(date);
        } catch (Exception ex) {
            log.debug("Error while parsing date: {} ", date, ex);
            throw new RuntimeException("Cannot Parse Date");
        }
    }
}

How do I parse snake-case fields in a FeignClient response JSON?

I have configured a FeignClient in my Spring Boot webapp where I'm calling an external API that returns the following object.
public class Issue {
    private Assignee assignee;
    private Date createdAt;
    private Date updatedAt;
    private Date closedAt;
    private String description;
    private Date dueDate;

    public Assignee getAssignee() {
        return assignee;
    }

    public void setAssignee(Assignee assignee) {
        this.assignee = assignee;
    }

    public String getDescription() {
        return description;
    }

    public void setDescription(String description) {
        this.description = description;
    }

    public Date getDueDate() {
        return dueDate;
    }

    public void setDueDate(Date dueDate) {
        this.dueDate = dueDate;
    }

    public Date getUpdatedAt() {
        return updatedAt;
    }

    public void setUpdatedAt(Date updatedAt) {
        this.updatedAt = updatedAt;
    }

    public Date getClosedAt() {
        return closedAt;
    }

    public void setClosedAt(Date closedAt) {
        this.closedAt = closedAt;
    }

    @Override
    public String toString() {
        return (JacksonJson.toJsonString(this));
    }
}
The fields updatedAt, createdAt, and closedAt all arrive in snake case in the response JSON, and every multi-word field shows up as null. Is there any way of configuring the FeignClient's Jackson parser so that it can process snake-case names? Note that I cannot change the default Jackson parser for my Spring Boot webapp, because I myself render JSON in camel case; I just need to configure this parser on the FeignClient that I'm using to connect to an external REST API.
I have verified that the JSON response returned from the API call contains valid values in each of these JSON fields.
Here's how I solved it. I created a custom Jackson parser as a Spring bean.
@Configuration(proxyBeanMethods = false)
public class FeignClientDateFormatConfig {

    @Bean
    public Decoder feignDecoder() {
        HttpMessageConverter jacksonConverter = new MappingJackson2HttpMessageConverter(customObjectMapper());
        ObjectFactory<HttpMessageConverters> objectFactory = () -> new HttpMessageConverters(jacksonConverter);
        return new ResponseEntityDecoder(new SpringDecoder(objectFactory));
    }

    public ObjectMapper customObjectMapper() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.setPropertyNamingStrategy(PropertyNamingStrategy.SNAKE_CASE);
        objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        return objectMapper;
    }
}
This successfully parses all snake-case properties.
Please note that this has a severe limitation: if you have multiple FeignClients and only one of them returns snake-case JSON, then you're out of luck, because this overrides the default FeignClient config. The only workaround possible with this solution is to move your FeignClient calls into a separate microservice so other FeignClient calls are not affected.
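One thing worth checking before splitting services (my addition, a sketch under the assumption that Spring Cloud OpenFeign is in use): the configuration attribute of @FeignClient scopes a configuration class to that single client, provided the class is not also annotated with @Configuration (which is what makes it leak into the global context, as above). The client name and URL below are placeholders.

// Hypothetical client; name and url are illustrative.
@FeignClient(name = "external-issues", url = "${external.api.url}",
        configuration = FeignClientDateFormatConfig.class)
public interface IssueClient {

    @GetMapping("/issues")
    List<Issue> findIssues();
}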

Problem with nomad-sdk DateTime deserialization (v 0.11.3.0)

Found a problem with datetime deserialization (nomad-sdk version 0.11.3.0).
Server(agent) version: Nomad v1.0.1 (c9c68aa55a7275f22d2338f2df53e67ebfcb9238)
When I try to get an allocation list from the nomad agent (via API) I get the following error:
The complete stack trace can be found here: https://pastebin.pl/view/9bf82a78
Caused by: com.fasterxml.jackson.databind.exc.InvalidFormatException: Can not deserialize value of type java.util.Date from String "2020-12-17T11:58:59.346780177+01:00": not a valid representation (error: Failed to parse Date value '2020-12-17T11:58:59.346780177+01:00': Can not parse date "2020-12-17T11:58:59.346780177+0100": while it seems to fit format 'yyyy-MM-dd'T'HH:mm:ss.SSSZ', parsing fails (leniency? null))
Suggested/Tested workaround:
package com.hashicorp.nomad.apimodel;

import org.joda.time.format.DateTimeFormatter;
import org.joda.time.format.ISODateTimeFormat;

public class CustomDateDeserializer extends StdDeserializer<Date> {

    public CustomDateDeserializer() {
        super(Date.class);
    }

    @Override
    public Date deserialize(com.fasterxml.jackson.core.JsonParser p, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        final String date = p.getText();
        if (date.equals("0001-01-01T00:00:00Z")) {
            return new Date();
        }
        DateTimeFormatter isoDateTimeFormat = ISODateTimeFormat.dateTime();
        return isoDateTimeFormat.parseDateTime(date).toDate();
    }
}
public abstract class NomadJson {
    static {
        OBJECT_MAPPER.setConfig(
                OBJECT_MAPPER.getSerializationConfig()
                        .with(new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"))
        );
        SimpleModule simpleModule = new SimpleModule();
        simpleModule.addDeserializer(Date.class, new CustomDateDeserializer());
        OBJECT_MAPPER.registerModule(simpleModule);
    }
}
// Added "CustomDateDeserializer" to AllocDeploymentStatus.java
public final class AllocDeploymentStatus extends ApiObject {
    @JsonProperty("Timestamp")
    @JsonDeserialize(using = CustomDateDeserializer.class)
    public Date getTimestamp() {
        return timestamp;
    }
}
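As a quick standalone sanity check of the approach (my own sketch, not part of the original workaround): Joda's ISODateTimeFormat.dateTime() accepts a variable-length fraction of a second, which is what lets it parse Nomad's nanosecond-precision timestamps that SimpleDateFormat rejects.

// Sketch: confirm the Joda formatter handles the failing value.
DateTimeFormatter fmt = ISODateTimeFormat.dateTime();
Date parsed = fmt.parseDateTime("2020-12-17T11:58:59.346780177+01:00").toDate();
System.out.println(parsed); // precision beyond milliseconds is truncated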

CodecConfigurationException when saving ZonedDateTime to MongoDB with Spring Boot >= 2.0.1.RELEASE

I was able to reproduce my problem with a minimal modification of the official Spring Boot guide for Accessing Data with MongoDB, see https://github.com/thokrae/spring-data-mongo-zoneddatetime.
After adding a java.time.ZonedDateTime field to the Customer class, running the example code from the guide fails with a CodecConfigurationException:
Customer.java:
public String lastName;
public ZonedDateTime created;

public Customer() {
}
output:
...
Caused by: org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.time.ZonedDateTime.
at org.bson.codecs.configuration.CodecCache.getOrThrow(CodecCache.java:46) ~[bson-3.6.4.jar:na]
at org.bson.codecs.configuration.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:63) ~[bson-3.6.4.jar:na]
at org.bson.codecs.configuration.ChildCodecRegistry.get(ChildCodecRegistry.java:51) ~[bson-3.6.4.jar:na]
This can be solved by changing the Spring Boot version from 2.0.5.RELEASE to 2.0.1.RELEASE in the pom.xml:
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.0.1.RELEASE</version>
</parent>
Now the exception is gone and the Customer objects including the ZonedDateTime fields are written to MongoDB.
I filed a bug (DATAMONGO-2106) with the spring-data-mongodb project but would understand if changing this behaviour is not wanted or is not a high priority.
What is the best workaround? When duckduckgoing for the exception message I find several approaches like registering a custom codec, a custom converter or using Jackson JSR 310. I would prefer to not add custom code to my project to handle a class from the java.time package.
Persisting date time types with time zones was never supported by Spring Data MongoDB, as stated by Oliver Drotbohm himself in DATAMONGO-2106.
These are the known workarounds:
Use a date time type without a time zone, e.g. java.time.Instant. (It is generally advisable to only use UTC in the backend, but I had to extend an existing code base which was following a different approach.)
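For that first option, a minimal sketch of the entity from above using Instant (assuming the rest of the guide code is unchanged):

public class Customer {
    public String lastName;
    public Instant created; // persisted natively by Spring Data MongoDB as a BSON date (UTC)
}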
Write a custom converter and register it by extending AbstractMongoConfiguration. See the branch converter in my test repository for a running example.
@Component
@WritingConverter
public class ZonedDateTimeToDocumentConverter implements Converter<ZonedDateTime, Document> {
    static final String DATE_TIME = "dateTime";
    static final String ZONE = "zone";

    @Override
    public Document convert(@Nullable ZonedDateTime zonedDateTime) {
        if (zonedDateTime == null) return null;
        Document document = new Document();
        document.put(DATE_TIME, Date.from(zonedDateTime.toInstant()));
        document.put(ZONE, zonedDateTime.getZone().getId());
        document.put("offset", zonedDateTime.getOffset().toString());
        return document;
    }
}
@Component
@ReadingConverter
public class DocumentToZonedDateTimeConverter implements Converter<Document, ZonedDateTime> {
    static final String DATE_TIME = "dateTime";
    static final String ZONE = "zone";

    @Override
    public ZonedDateTime convert(@Nullable Document document) {
        if (document == null) return null;
        Date dateTime = document.getDate(DATE_TIME);
        String zoneId = document.getString(ZONE);
        ZoneId zone = ZoneId.of(zoneId);
        return ZonedDateTime.ofInstant(dateTime.toInstant(), zone);
    }
}
@Configuration
public class MongoConfiguration extends AbstractMongoConfiguration {

    @Value("${spring.data.mongodb.database}")
    private String database;

    @Value("${spring.data.mongodb.host}")
    private String host;

    @Value("${spring.data.mongodb.port}")
    private int port;

    @Override
    public MongoClient mongoClient() {
        return new MongoClient(host, port);
    }

    @Override
    protected String getDatabaseName() {
        return database;
    }

    @Bean
    public CustomConversions customConversions() {
        return new MongoCustomConversions(asList(
                new ZonedDateTimeToDocumentConverter(),
                new DocumentToZonedDateTimeConverter()
        ));
    }
}
Write a custom codec. At least in theory: my codec test branch is unable to unmarshal the data when using Spring Boot 2.0.5, while working fine with Spring Boot 2.0.1.
public class ZonedDateTimeCodec implements Codec<ZonedDateTime> {
    public static final String DATE_TIME = "dateTime";
    public static final String ZONE = "zone";

    @Override
    public void encode(final BsonWriter writer, final ZonedDateTime value, final EncoderContext encoderContext) {
        writer.writeStartDocument();
        // Second precision; the sub-second part of the instant is dropped.
        writer.writeDateTime(DATE_TIME, value.toInstant().getEpochSecond() * 1_000);
        writer.writeString(ZONE, value.getZone().getId());
        writer.writeEndDocument();
    }

    @Override
    public ZonedDateTime decode(final BsonReader reader, final DecoderContext decoderContext) {
        reader.readStartDocument();
        long epochMillis = reader.readDateTime(DATE_TIME); // readDateTime returns epoch milliseconds
        String zoneId = reader.readString(ZONE);
        reader.readEndDocument();
        return ZonedDateTime.ofInstant(Instant.ofEpochSecond(epochMillis / 1_000), ZoneId.of(zoneId));
    }

    @Override
    public Class<ZonedDateTime> getEncoderClass() {
        return ZonedDateTime.class;
    }
}
@Configuration
public class MongoConfiguration extends AbstractMongoConfiguration {

    @Value("${spring.data.mongodb.database}")
    private String database;

    @Value("${spring.data.mongodb.host}")
    private String host;

    @Value("${spring.data.mongodb.port}")
    private int port;

    @Override
    public MongoClient mongoClient() {
        return new MongoClient(host + ":" + port, createOptions());
    }

    private MongoClientOptions createOptions() {
        CodecProvider pojoCodecProvider = PojoCodecProvider.builder()
                .automatic(true)
                .build();
        CodecRegistry registry = CodecRegistries.fromRegistries(
                createCustomCodecRegistry(),
                MongoClient.getDefaultCodecRegistry(),
                CodecRegistries.fromProviders(pojoCodecProvider)
        );
        return MongoClientOptions.builder()
                .codecRegistry(registry)
                .build();
    }

    private CodecRegistry createCustomCodecRegistry() {
        return CodecRegistries.fromCodecs(
                new ZonedDateTimeCodec()
        );
    }

    @Override
    protected String getDatabaseName() {
        return database;
    }
}

Query MongoDB based on map key with Spring Repository

I need help to query nested documents. Using Spring Boot with MongoDB.
Structure:
public class Holiday {
    @Id
    private String id;
    private Integer year;
    private Map<String, List<HolidayElement>> holidays = new HashMap<>();
}

public class HolidayElement {
    private String name;
    @JsonFormat(pattern = "yyyy-MM-dd")
    private Date date;
    private String note;
}
After saving everything, the JSON looks like:
[
  {
    "id": "5a153331b3cb1f0001e1edeb",
    "year": 2017,
    "holidays": {
      "BB": [
        {
          "name": "Neujahrstag",
          "date": "2017-01-01",
          "note": ""
        },
        ...
      ],
      "HH": [
        { ... }
      ]
    }
  }
]
Now how can I get, for instance, the list of HolidayElement objects for the state "BB"?
Assuming you have a repository like HolidayRepository, you need to create a custom implementation since you want to use MongoTemplate. So your HolidayRepository will look like:
@Repository
public interface HolidayRepository extends MongoRepository<Holiday, String>, HolidayRepositoryCustom {
}
Then declare two new files, HolidayRepositoryCustom and HolidayRepositoryImpl, in the same package (very important) as HolidayRepository:
public interface HolidayRepositoryCustom {
    List<HolidayElement> findByMapId(final String mapId);
}
And the Impl class will look like this:
public class HolidayRepositoryImpl implements HolidayRepositoryCustom {

    private final MongoTemplate mongoTemplate;

    @Autowired
    public HolidayRepositoryImpl(final MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    @Override
    public List<HolidayElement> findByMapId(String mapId) {
        final QueryBuilder queryBuilder = QueryBuilder.start();
        queryBuilder.and("holidays." + mapId).exists(true);

        final DBObject projection = new BasicDBObject();
        projection.put("holidays." + mapId, 1);

        String collectionName = "Holiday"; // Change to your collection name
        try (final DBCursor dbCursor = mongoTemplate.getCollection(collectionName).find(queryBuilder.get(), projection)) {
            if (dbCursor.hasNext()) {
                DBObject next = dbCursor.next();
                Map<String, List<HolidayElement>> holidayElements =
                        (Map<String, List<HolidayElement>>) next.get("holidays");
                return holidayElements.get(mapId);
            }
        }
        return Lists.newArrayList();
    }
}
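Usage then reduces to calling the custom method through the repository (a sketch; the surrounding wiring is illustrative):

// Hypothetical caller: fetch the holidays stored under the "BB" key.
@Autowired
private HolidayRepository holidayRepository;

public void printHolidays() {
    List<HolidayElement> bbHolidays = holidayRepository.findByMapId("BB");
    bbHolidays.forEach(System.out::println);
}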
