How to format and store a date in Elasticsearch - Spring Boot

I am trying to store a date value in Elasticsearch. Below is my code.
pom.xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
Model class
@Document(indexName = "employee", createIndex = true, shards = 4)
public class Employee {

    @Nullable
    @Field(type = FieldType.Date, pattern = "yyyy-MM-dd", format = DateFormat.date)
    private LocalDate joinedDate;
}
Elasticsearch index mapping:
"mappings": {
  "employee": {
    "properties": {
      "joinedDate": {
        "format": "date",
        "type": "date"
      }
    }
  }
}
My Configuration file
@Configuration
@EnableElasticsearchRepositories("com.sample.dao")
public class ElasticSearchClientBuilder extends AbstractElasticsearchConfiguration {

    Logger logger = LoggerFactory.getLogger(ElasticSearchClientBuilder.class);

    @Override
    @Bean
    public RestHighLevelClient elasticsearchClient() {
        // Configuration for the RestHighLevelClient (omitted)
    }
}
Error I am getting for the above setting:
Caused by: org.elasticsearch.client.ResponseException: method [PUT], host [https://ausdlcceesdb01.us.dell.com:9200], URI [/employee/employee/a77055df-2a79-4d8d-8911-315003bfed28?timeout=1m], status line [HTTP/1.1 400 Bad Request]
Warnings: [299 Elasticsearch-7.6.2-ef48eb35cf30adf4db14086e8aabd07ef6fb113f "[types removal] Specifying types in document index requests is deprecated, use the typeless endpoints instead (/{index}/_doc/{id}, /{index}/_doc, or /{index}/_create/{id})."]
{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"failed to parse field [joinedDate] of type [date] in document with id 'a77055df-2a79-4d8d-8911-315003bfed28'. Preview of field's value: '{dayOfWeek=THURSDAY, month=JANUARY, year=2022, dayOfMonth=6, era=CE, dayOfYear=6, monthValue=1, chronology={calendarType=iso8601, id=ISO}, leapYear=false}'"}],
"type":"mapper_parsing_exception","reason":"failed to parse field [joinedDate] of type [date] in document with id 'a77055df-2a79-4d8d-8911-315003bfed28'. Preview of field's value: '{dayOfWeek=THURSDAY, month=JANUARY, year=2022, dayOfMonth=6, era=CE, dayOfYear=6, monthValue=1, chronology={calendarType=iso8601, id=ISO}, leapYear=false}'","caused_by":{"type":"illegal_state_exception","reason":"Can't get text on a START_OBJECT at 1:83"}},
"status":400}
Without the pattern in the model, my date is stored as the below fields in the index:
joinedDate.year
joinedDate.month
joinedDate.dayOfMonth
joinedDate.dayOfWeek
joinedDate.era
joinedDate.dayOfYear
joinedDate.monthValue
joinedDate.chronology
joinedDate.leapYear
Please help me understand how to store yyyy-MM-dd in the index.

I think you will need something like this:
@JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSS")
@JsonSerialize(using = CustomLocalDateTimeSerializer.class)
@JsonDeserialize(using = CustomLocalDateTimeDeserializer.class)
private LocalDateTime createDate;
CustomLocalDateTimeSerializer class:
public class CustomLocalDateTimeSerializer extends StdSerializer<LocalDateTime> {

    public CustomLocalDateTimeSerializer() {
        this(null);
    }

    private CustomLocalDateTimeSerializer(Class<LocalDateTime> t) {
        super(t);
    }

    @Override
    public void serialize(LocalDateTime value, JsonGenerator gen, SerializerProvider provider) throws IOException {
        gen.writeString(value.format(DateTimeFormatter.ofPattern(Constants.DATE_FORMAT_SIMPLE)));
    }
}
CustomLocalDateTimeDeserializer class:
public class CustomLocalDateTimeDeserializer extends StdDeserializer<LocalDateTime> {

    private static final Logger log = LoggerFactory.getLogger(CustomLocalDateTimeDeserializer.class);

    public CustomLocalDateTimeDeserializer() {
        this(null);
    }

    private CustomLocalDateTimeDeserializer(Class<LocalDateTime> t) {
        super(t);
    }

    @Override
    public LocalDateTime deserialize(JsonParser jsonParser, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        String date = jsonParser.getText();
        try {
            return LocalDateTime.parse(date);
        } catch (Exception ex) {
            log.debug("Error while parsing date: {} ", date, ex);
            throw new RuntimeException("Cannot Parse Date");
        }
    }
}
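For context, here is a minimal round-trip sketch of what these annotations do on the Jackson side. The MyDocument holder class is invented for the example, and it assumes Constants.DATE_FORMAT_SIMPLE resolves to the same "yyyy-MM-dd'T'HH:mm:ss.SSS" pattern used in the @JsonFormat annotation above:

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import com.fasterxml.jackson.databind.annotation.JsonSerialize;
import java.time.LocalDateTime;

public class DateSerializationDemo {

    // Hypothetical holder class, only used to demonstrate the round trip.
    public static class MyDocument {
        @JsonSerialize(using = CustomLocalDateTimeSerializer.class)
        @JsonDeserialize(using = CustomLocalDateTimeDeserializer.class)
        public LocalDateTime createDate;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        MyDocument doc = new MyDocument();
        doc.createDate = LocalDateTime.of(2022, 1, 6, 10, 15, 30);

        // With the custom (de)serializers in place Jackson writes a plain string,
        // e.g. {"createDate":"2022-01-06T10:15:30.000"}, instead of the expanded
        // {dayOfWeek=..., month=..., ...} object shown in the error message.
        String json = mapper.writeValueAsString(doc);
        System.out.println(json);

        MyDocument back = mapper.readValue(json, MyDocument.class);
        System.out.println(back.createDate);
    }
}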

Related

Problem with nomad-sdk DateTime deserialization (v 0.11.3.0)

Found a problem with datetime deserialization (nomad-sdk version 0.11.3.0).
Server(agent) version: Nomad v1.0.1 (c9c68aa55a7275f22d2338f2df53e67ebfcb9238)
When I try to get an allocation list from the nomad agent (via API) I get the following error:
The complete stack trace can be found here: https://pastebin.pl/view/9bf82a78
Caused by: com.fasterxml.jackson.databind.exc.InvalidFormatException: Can not deserialize value of type java.util.Date from String "2020-12-17T11:58:59.346780177+01:00": not a valid representation (error: Failed to parse Date value '2020-12-17T11:58:59.346780177+01:00': Can not parse date "2020-12-17T11:58:59.346780177+0100": while it seems to fit format 'yyyy-MM-dd'T'HH:mm:ss.SSSZ', parsing fails (leniency? null))
Suggested/Tested workaround:
package com.hashicorp.nomad.apimodel;

import org.joda.time.format.DateTimeFormatter;
import org.joda.time.format.ISODateTimeFormat;

public class CustomDateDeserializer extends StdDeserializer<Date> {

    public CustomDateDeserializer() {
        super(Date.class);
    }

    @Override
    public Date deserialize(com.fasterxml.jackson.core.JsonParser p, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        final String date = p.getText();
        if (date.equals("0001-01-01T00:00:00Z")) {
            return new Date();
        }
        DateTimeFormatter isoDateTimeFormat = ISODateTimeFormat.dateTime();
        return isoDateTimeFormat.parseDateTime(date).toDate();
    }
}
public abstract class NomadJson {
    static {
        OBJECT_MAPPER.setConfig(
                OBJECT_MAPPER.getSerializationConfig()
                        .with(new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"))
        );
        SimpleModule simpleModule = new SimpleModule();
        simpleModule.addDeserializer(Date.class, new CustomDateDeserializer());
        OBJECT_MAPPER.registerModule(simpleModule);
    }
}
//Added "CustomDateDeserializer" to AllocDeploymentStatus.java
public final class AllocDeploymentStatus extends ApiObject {

    @JsonProperty("Timestamp")
    @JsonDeserialize(using = CustomDateDeserializer.class)
    public Date getTimestamp() {
        return timestamp;
    }
}

JPA criteria builder equal method is not working as expected

In my case I'm trying to fetch some data by extracting a value from a JSON column in the DB. My code is as follows:
criteriaBuilder.equal(criteriaBuilder.function("JSON_EXTRACT", Boolean.class, root.get("result"), criteriaBuilder.literal("$.matched")), false);
The above code gives me an empty result set, although it works fine in the query console.
But,
criteriaBuilder.between(criteriaBuilder.function("JSON_EXTRACT", Double.class, root.get("result"), criteriaBuilder.literal("$.streaming_threshold")), 0.1, 0.9);
this between method is working fine. What could be the mistake here?
UPDATE
It is the boolean values that I cannot read, NOT integers. My JSON structure:
{
  "status": "SUCCESS",
  "request_id": "request_id",
  "time_taken": 8454,
  "matched": false,
  "streaming_threshold": 0.5
}
I was not able to get it working with the raw boolean field. I converted the Boolean property on the object to a String (and back) using @JsonSerialize and @JsonDeserialize, persisted that as JSON, and then followed the same approach you did, but querying as String.class against "false" instead of Boolean. My solution is below.
Entity
@Entity
@Table(name = "json_container")
public class JsonContainer {

    @Id
    @GeneratedValue
    @Type(type = "uuid-char")
    private UUID id;

    @Column(columnDefinition = "json", name = "json_data")
    private String jsonData;

    public UUID getId() {
        return id;
    }

    public String getJsonData() {
        return jsonData;
    }

    public void setJsonData(String jsonData) {
        this.jsonData = jsonData;
    }

    public static class SampleDetails {

        private String status;
        private String requestId;
        private Integer timeTaken;
        @JsonSerialize(using = StringBooleanJsonSerializer.class)
        @JsonDeserialize(using = StringBooleanJsonDeserializer.class)
        private Boolean matched;
        private Double streamingThreshold;

        public SampleDetails() {
        }

        public SampleDetails(String status, String requestId, Integer timeTaken, Boolean matched, Double streamingThreshold) {
            this.status = status;
            this.requestId = requestId;
            this.timeTaken = timeTaken;
            this.matched = matched;
            this.streamingThreshold = streamingThreshold;
        }

        public String getStatus() {
            return status;
        }

        public String getRequestId() {
            return requestId;
        }

        public Integer getTimeTaken() {
            return timeTaken;
        }

        public Double getStreamingThreshold() {
            return streamingThreshold;
        }

        public Boolean getMatched() {
            return matched;
        }

        static class StringBooleanJsonSerializer extends JsonSerializer<Boolean> {
            @Override
            public void serialize(Boolean value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
                gen.writeString(value != null && value ? value.toString() : "false");
            }
        }

        static class StringBooleanJsonDeserializer extends JsonDeserializer<Boolean> {
            @Override
            public Boolean deserialize(JsonParser p, DeserializationContext ctxt) throws IOException, JsonProcessingException {
                try {
                    return Boolean.parseBoolean(p.getText());
                } catch (RuntimeException e) {
                    return Boolean.FALSE;
                }
            }
        }
    }
}
Test Class
class MySQLJsonConverterTest {

    @Autowired
    private EntityManager entityManager;

    @Rollback(false)
    @Test
    void testCustomJsonConverter() throws JsonProcessingException {
        JsonContainer jsonContainer = new JsonContainer();
        jsonContainer.setJsonData(
                getAsJson(new JsonContainer.SampleDetails("success", "12344567", 8454, false, 0.1)));
        entityManager.persist(jsonContainer);
        Assertions.assertNotNull(jsonContainer.getId());

        jsonContainer = new JsonContainer();
        jsonContainer.setJsonData(
                getAsJson(new JsonContainer.SampleDetails("success", "8989", 121, true, 0.5)));
        entityManager.persist(jsonContainer);
        Assertions.assertNotNull(jsonContainer.getId());

        CriteriaBuilder criteriaBuilder = entityManager.getCriteriaBuilder();
        CriteriaQuery<JsonContainer> criteriaQuery = criteriaBuilder.createQuery(JsonContainer.class);
        Root<JsonContainer> from = criteriaQuery.from(JsonContainer.class);
        criteriaQuery.where(criteriaBuilder.equal(criteriaBuilder.function("JSON_EXTRACT", String.class, from.get("jsonData"),
                criteriaBuilder.literal("$.matched")), "false"));
        TypedQuery<JsonContainer> typedQuery = entityManager.createQuery(criteriaQuery);
        List<JsonContainer> resultList = typedQuery.getResultList();
        Assertions.assertEquals(1, resultList.size());
    }

    private String getAsJson(JsonContainer.SampleDetails sampleDetails) throws JsonProcessingException {
        //var created so debugging is ez
        String json = new ObjectMapper().writeValueAsString(sampleDetails);
        return json;
    }
}
By default JPA converts a boolean to 0/1, so if you also serialize the boolean as 0/1 into the JSON column, the equal query will work.
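As a rough sketch of that idea (the NumericBooleanSerializer class and the 0/1 convention here are my own illustration, not part of the original answer), the field could be written as a number instead of a string:

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;
import java.io.IOException;

// Hypothetical serializer: writes a Boolean as 0/1 in the JSON, matching
// the 0/1 representation JPA uses for boolean columns.
public class NumericBooleanSerializer extends JsonSerializer<Boolean> {
    @Override
    public void serialize(Boolean value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
        gen.writeNumber(value != null && value ? 1 : 0);
    }
}

With @JsonSerialize(using = NumericBooleanSerializer.class) on the matched field, the stored JSON contains "matched": 0, and the criteria query can then extract it as an Integer (criteriaBuilder.function("JSON_EXTRACT", Integer.class, ...)) and compare it against 0 or 1 instead of a String.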

Elastic search spring boot ZonedDateTime converter not mapping correctly

I know a similar question has been asked before, like this one, but I think my problem is a bit different. I have a job board app that uses MongoDB as its primary database; when data enters Mongo it is synced to my Elasticsearch cluster automatically. All my date objects are ZonedDateTime, and since MongoDB does not have a built-in converter for ZonedDateTime, I use this as my converter: I store the date field as an object with a dateTime, zoneId and offset. Below is how it looks in the database.
And the code responsible for that conversion is as follows:
My Writing converter
@WritingConverter
public class ZonedDateTimeToMongoDocumentConverter implements Converter<ZonedDateTime, Document> {

    static final String DATE_TIME = "dateTime";
    static final String ZONE = "zone";

    @Override
    public Document convert(@Nullable ZonedDateTime zonedDateTime) {
        if (zonedDateTime == null) return null;

        Document document = new Document();
        document.put(DATE_TIME, zonedDateTime.toInstant().getEpochSecond() * 1_000);
        document.put(ZONE, zonedDateTime.getZone().getId());
        document.put("offset", zonedDateTime.getOffset().toString());
        return document;
    }
}
And my reading converter
@ReadingConverter
public class MongoDocumentToZonedDateTimeConverter implements Converter<Document, ZonedDateTime> {

    @Override
    public ZonedDateTime convert(@Nullable Document document) {
        if (document == null) return null;

        Date dateTime = document.getDate(DATE_TIME);
        String zoneId = document.getString(ZONE);
        ZoneId zone = ZoneId.of(zoneId);
        return ZonedDateTime.ofInstant(dateTime.toInstant(), zone);
    }
}
And my Codec
public class ZonedDateTimeCodec implements Codec<ZonedDateTime> {

    public static final String DATE_TIME = "dateTime";
    public static final String ZONE = "zone";

    @Override
    public void encode(final BsonWriter writer, final ZonedDateTime value, final EncoderContext encoderContext) {
        writer.writeStartDocument();
        writer.writeDateTime(DATE_TIME, value.toInstant().getEpochSecond() * 1_000);
        writer.writeString(ZONE, value.getZone().getId());
        writer.writeEndDocument();
    }

    @Override
    public ZonedDateTime decode(final BsonReader reader, final DecoderContext decoderContext) {
        reader.readStartDocument();
        long epochSecond = reader.readDateTime(DATE_TIME);
        String zoneId = reader.readString(ZONE);
        reader.readEndDocument();
        return ZonedDateTime.ofInstant(Instant.ofEpochSecond(epochSecond / 1_000), ZoneId.of(zoneId));
    }

    @Override
    public Class<ZonedDateTime> getEncoderClass() {
        return ZonedDateTime.class;
    }
}
This code works great. The issue, though, is when the data enters Elasticsearch: because of the nature of my date fields, I had to opt for the date fields to be stored as objects there as well, with the following mapping:
{
  "myDateField": {
    "type": "object",
    "properties": {
      "dateTime": {
        "type": "date"
      },
      "zone": {
        "type": "keyword"
      },
      "offSet": {
        "type": "keyword"
      }
    }
  }
}
PS: I am new to Elasticsearch; this is in fact my first attempt at it.
Then I create my Spring ZonedDateTime Elasticsearch converters as follows.
Reading converter
@ReadingConverter
@RequiredArgsConstructor
public class StringToZonedDateTimeConverter implements Converter<String, ZonedDateTime> {

    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public ZonedDateTime convert(String source) {
        ZoneDateTime zoneDateTime = objectMapper.readValue(source, ZoneDateTime.class);
        return ZonedDateTime.ofInstant(zoneDateTime.getDateTime().toInstant(), ZoneId.of(zoneDateTime.getZone()));
    }
}
Writing converter
@WritingConverter
@RequiredArgsConstructor
public class ZonedDateTimeToStringConverter implements Converter<ZonedDateTime, String> {

    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public String convert(ZonedDateTime zonedDateTime) {
        return objectMapper.writeValueAsString(ZoneDateTime.builder()
                .dateTime(Date.from(zonedDateTime.toInstant()))
                .zone(zonedDateTime.getZone().getId())
                .offSet(zonedDateTime.getOffset().toString())
                .build());
    }
}
And my ZoneDateTime class looks like
@SuperBuilder(toBuilder = true)
@RequiredArgsConstructor
@Getter
public class ZoneDateTime {
    private final Date dateTime;
    private final String offSet;
    private final String zone;
}
From my understanding this should work, but Spring gives me this error:
org.springframework.core.convert.ConversionFailedException: Failed to convert from type [java.lang.String] to type [java.time.ZonedDateTime] for value '2020-08-08T14:32:22.094Z'; nested exception is com.fasterxml.jackson.core.JsonParseException: Unexpected character ('-' (code 45)): Expected space separating root-level values
at [Source: (String)"2020-08-08T14:32:22.094Z"; line: 1, column: 6]
And I'm really not sure why Elasticsearch only sends the dateTime field from the date field object, leaving out the zone and offset; Elasticsearch ignores the fact that my date field is an object. How can I fix this?

Custom Object as a #RequestParam

I have a paginated endpoint that looks like this /api/clients?range=0-25.
I'd like the getClients() method in my ClientController to directly receive an instance of a custom Range object rather than having to validate a "0-25" String, but I'm having trouble figuring this out.
@Getter
final class Range {

    @Min(0)
    private Integer offset = 0;

    @Min(1)
    private Integer limit = 25;
}
@ResponseBody
@GetMapping(params = { "range" })
public ResponseEntity<?> getAllClients(@RequestParam(value = "range", required = false) QueryRange queryRange, final HttpServletResponse response) {
    ...
}
I'm not sure how to instruct the Controller to correctly deserialize the "0-25" string into the Range...
You can use a Converter<S, T>, as shown below:
@Component
public class RangeConverter implements Converter<String, Range> {

    @Override
    public Range convert(String source) {
        String[] values = source.split("-");
        return new Range(Integer.valueOf(values[0]), Integer.valueOf(values[1]));
    }
}
You also could handle invalid values according to your needs. If you use the above converter as is, the attempt to convert an invalid value such as 1-x will result in a ConversionFailedException.
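For example (just a sketch; the validation check and error message here are my own choices, not part of the original answer), the converter could reject malformed input before splitting:

import org.springframework.core.convert.converter.Converter;
import org.springframework.stereotype.Component;

@Component
public class RangeConverter implements Converter<String, Range> {

    @Override
    public Range convert(String source) {
        // Reject anything that is not "<number>-<number>" up front, so callers get a
        // clear message instead of a NumberFormatException from Integer.valueOf.
        if (source == null || !source.matches("\\d+-\\d+")) {
            throw new IllegalArgumentException("range must look like '0-25' but was: " + source);
        }
        String[] values = source.split("-");
        return new Range(Integer.valueOf(values[0]), Integer.valueOf(values[1]));
    }
}

Spring still wraps whatever the converter throws in a ConversionFailedException, so you can translate that into a 400 response in an exception handler if you want a friendlier error message.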
You can also do the following, it seems:
public class QueryRangeEditor extends PropertyEditorSupport {

    private static final Pattern PATTERN = Pattern.compile("^([1-9]\\d*|0)-([1-9]\\d*|0)$");

    @Override
    public void setAsText(String text) throws IllegalArgumentException {
        final QueryRange range = new QueryRange();
        Matcher matcher = PATTERN.matcher(text);
        if (matcher.find()) {
            range.setOffset(Integer.valueOf(matcher.group(1)));
            range.setLimit(Integer.valueOf(matcher.group(2)));
        } else {
            throw new IllegalArgumentException("OI"); // todo - replace
        }
        setValue(range);
    }
}

@InitBinder
public void initBinder(WebDataBinder binder) {
    binder.registerCustomEditor(QueryRange.class, new QueryRangeEditor());
}
But @cassiomolin's looks cleaner...

Rest Json Jackson Mapper Custom Object Mapper

I am having an issue with the Jackson JSON mapper which I can't figure out how to solve.
I have a Spring MVC REST application, and the endpoint results are converted to JSON using Jackson.
Some of the result objects contain a type that I want to tamper with before it gets converted.
More specifically, a result object could look like this.
ResultObject
- getDoubleMap() : DoubleMap
- getDoubleEntries() : List<DoubleEntry>
- toMap() : Map<String, Double>
What I want to do is to not have Jackson convert the DoubleMap instance, but rather override it like this:
Object someJacksonMapInterceptor(Object object) {
    if (object instanceof DoubleMap) {
        return ((DoubleMap) object).toMap();
    }
    return object;
}
I have been searching Google for quite a while now and have not found a simple solution. Hope someone can advise.
Many thanks in advance.
In one application we are custom-deserializing dates; perhaps you can use the same approach for your custom deserialization.
public class VitalSign {

    public static final String DATE_FORMAT1 = "yyyy-MM-dd'T'HH:mm:ssZ";
    public static final String DATE_FORMAT2 = "yyyy-MM-dd'T'HH:mm:ss";
    //public static final String DATE_FORMAT3 = "yyyy-MM-dd'T'HH:mm:ssTDZ";
    public static final String DATE_FORMAT4 = "MMM dd, yyyy h:mm:ss aa";

    @NotNull
    @Column(name = "observed")
    @Temporal(TemporalType.TIMESTAMP)
    @DateTimeFormat(style = "M-")
    @JsonDeserialize(using = CustomJsonDateDeserializer.class)
    private Date timestamp;

    public static class CustomJsonDateDeserializer extends JsonDeserializer<Date> {

        public CustomJsonDateDeserializer() {
            super();
        }

        @Override
        public Date deserialize(JsonParser jsonparser, DeserializationContext deserializationcontext) throws IOException, JsonProcessingException {
            SimpleDateFormat[] formats = { new SimpleDateFormat(DATE_FORMAT1), new SimpleDateFormat(DATE_FORMAT2), new SimpleDateFormat(DATE_FORMAT4, Locale.US) };
            String date = jsonparser.getText();
            for (SimpleDateFormat format : formats) {
                try {
                    return format.parse(date);
                } catch (ParseException e) {
                    // ignore and try the next format
                }
            }
            throw new RuntimeException("Unparseable date " + date);
        }
    }
}
For serializing, you can just annotate your toMap() method with @JsonValue. For deserializing, if you have a static factory to create a DoubleMap from a Map<String, Double>, you can just annotate that with @JsonCreator.
private final ObjectMapper mapper = new ObjectMapper();

@Test
public void serialize_doublemap() throws Exception {
    DoubleMap map = new DoubleMap();
    map.put("red", 0.5);
    map.put("orange", 0.7);
    assertThat(mapper.writeValueAsString(map), equivalentTo("{ red: 0.5, orange: 0.7 }"));
}

@Test
public void deserialize_doublemap() throws Exception {
    assertThat(mapper.readValue("{ \"red\": 0.5, \"orange\": 0.7 }", DoubleMap.class).toMap(),
            equalTo(ImmutableMap.of("red", 0.5, "orange", 0.7)));
}

public static class DoubleMap {

    public List<DoubleEntry> entries = new ArrayList<>();

    public void put(String label, double value) {
        entries.add(new DoubleEntry(label, value));
    }

    @JsonCreator
    public static DoubleMap fromJson(Map<String, Double> input) {
        DoubleMap map = new DoubleMap();
        input.forEach(map::put);
        return map;
    }

    public List<DoubleEntry> getDoubleEntries() {
        return entries;
    }

    @JsonValue
    public Map<String, Double> toMap() {
        return entries.stream().collect(Collectors.toMap(e -> e.label, e -> e.value));
    }
}

public static final class DoubleEntry {

    public final String label;
    public final double value;

    public DoubleEntry(String label, double value) {
        this.label = label;
        this.value = value;
    }
}
