Problem with nomad-sdk DateTime deserialization (v 0.11.3.0) - nomad

Found a problem with DateTime deserialization (nomad-sdk version 0.11.3.0).
Server (agent) version: Nomad v1.0.1 (c9c68aa55a7275f22d2338f2df53e67ebfcb9238)
When I try to get the allocation list from the Nomad agent (via the API), I get the error below.
The complete stack trace can be found here: https://pastebin.pl/view/9bf82a78
Caused by: com.fasterxml.jackson.databind.exc.InvalidFormatException: Can not deserialize value of type java.util.Date from String "2020-12-17T11:58:59.346780177+01:00": not a valid representation (error: Failed to parse Date value '2020-12-17T11:58:59.346780177+01:00': Can not parse date "2020-12-17T11:58:59.346780177+0100": while it seems to fit format 'yyyy-MM-dd'T'HH:mm:ss.SSSZ', parsing fails (leniency? null))
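For reference, the value itself is a valid RFC 3339 timestamp; the pattern in the error message only accepts millisecond precision. A minimal sketch (not part of the SDK, class name is illustrative) showing that java.time parses the value without any custom pattern:

import java.time.OffsetDateTime;
import java.util.Date;

public class TimestampParseDemo {
    public static void main(String[] args) {
        // Nanosecond precision and a "+01:00" offset, as returned by the agent.
        String ts = "2020-12-17T11:58:59.346780177+01:00";
        OffsetDateTime odt = OffsetDateTime.parse(ts); // parses fine
        Date asDate = Date.from(odt.toInstant());      // what the SDK ultimately needs
        System.out.println(asDate);
    }
}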
Suggested/Tested workaround:
package com.hashicorp.nomad.apimodel;

import java.io.IOException;
import java.util.Date;

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.deser.std.StdDeserializer;

import org.joda.time.format.DateTimeFormatter;
import org.joda.time.format.ISODateTimeFormat;

public class CustomDateDeserializer extends StdDeserializer<Date> {

    public CustomDateDeserializer() {
        super(Date.class);
    }

    @Override
    public Date deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
        final String date = p.getText();
        // Nomad uses "0001-01-01T00:00:00Z" as its zero value for unset timestamps.
        if (date.equals("0001-01-01T00:00:00Z")) {
            return new Date();
        }
        // Joda's ISO formatter handles the fractional seconds and the "+01:00" offset.
        DateTimeFormatter isoDateTimeFormat = ISODateTimeFormat.dateTime();
        return isoDateTimeFormat.parseDateTime(date).toDate();
    }
}
// Changes to the SDK's NomadJson class: keep the existing serialization format,
// but register the custom deserializer for java.util.Date.
public abstract class NomadJson {
    static {
        OBJECT_MAPPER.setConfig(
                OBJECT_MAPPER.getSerializationConfig()
                        .with(new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"))
        );
        SimpleModule simpleModule = new SimpleModule();
        simpleModule.addDeserializer(Date.class, new CustomDateDeserializer());
        OBJECT_MAPPER.registerModule(simpleModule);
    }
}
// Added @JsonDeserialize(using = CustomDateDeserializer.class) to AllocDeploymentStatus.java
public final class AllocDeploymentStatus extends ApiObject {

    @JsonProperty("Timestamp")
    @JsonDeserialize(using = CustomDateDeserializer.class)
    public Date getTimestamp() {
        return timestamp;
    }
}

Related

How to format and store a date in Elasticsearch

I am trying to store a date value in Elasticsearch. Below is my code.
pom.xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
Model class
@Document(indexName = "employee", createIndex = true, shards = 4)
public class Employee {

    @Nullable
    @Field(type = FieldType.Date, pattern = "yyyy-MM-dd", format = DateFormat.date)
    private LocalDate joinedDate;
}
Elasticsearch index properties
"mappings": {
  "employee": {
    "properties": {
      "joinedDate": {
        "format": "date",
        "type": "date"
      }
    }
  }
}
My configuration file
@Configuration
@EnableElasticsearchRepositories("com.sample.dao")
public class ElasticSearchClientBuilder extends AbstractElasticsearchConfiguration {

    Logger logger = LoggerFactory.getLogger(ElasticSearchClientBuilder.class);

    @Override
    @Bean
    public RestHighLevelClient elasticsearchClient() {
        // Configuration for RestHighLevelClient
    }
}
Error I am getting for the above setting:
Caused by: org.elasticsearch.client.ResponseException: method [PUT], host [https://ausdlcceesdb01.us.dell.com:9200], URI [/employee/employee/a77055df-2a79-4d8d-8911-315003bfed28?timeout=1m], status line [HTTP/1.1 400 Bad Request]
Warnings: [299 Elasticsearch-7.6.2-ef48eb35cf30adf4db14086e8aabd07ef6fb113f "[types removal] Specifying types in document index requests is deprecated, use the typeless endpoints instead (/{index}/_doc/{id}, /{index}/_doc, or /{index}/_create/{id})."]
{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"failed to parse field [joinedDate] of type [date] in document with id 'a77055df-2a79-4d8d-8911-315003bfed28'. Preview of field's value: '{dayOfWeek=THURSDAY, month=JANUARY, year=2022, dayOfMonth=6, era=CE, dayOfYear=6, monthValue=1, chronology={calendarType=iso8601, id=ISO}, leapYear=false}'"}],
"type":"mapper_parsing_exception","reason":"failed to parse field [joinedDate] of type [date] in document with id 'a77055df-2a79-4d8d-8911-315003bfed28'. Preview of field's value: '{dayOfWeek=THURSDAY, month=JANUARY, year=2022, dayOfMonth=6, era=CE, dayOfYear=6, monthValue=1, chronology={calendarType=iso8601, id=ISO}, leapYear=false}'","caused_by":{"type":"illegal_state_exception","reason":"Can't get text on a START_OBJECT at 1:83"}},
"status":400}
Without the pattern in the model, my date is stored as the fields below in the index:
joinedDate.year
joinedDate.month
joinedDate.dayOfMonth
joinedDate.dayOfWeek
joinedDate.era
joinedDate.dayOfYear
joinedDate.monthValue
joinedDate.chronology
joinedDate.leapYear
Please help: how can I store yyyy-MM-dd in the index?
I think you will need something like this:
@JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSS")
@JsonSerialize(using = CustomLocalDateTimeSerializer.class)
@JsonDeserialize(using = CustomLocalDateTimeDeserializer.class)
private LocalDateTime createDate;
CustomLocalDateTimeSerializer class:
public class CustomLocalDateTimeSerializer extends StdSerializer<LocalDateTime> {

    public CustomLocalDateTimeSerializer() {
        this(null);
    }

    private CustomLocalDateTimeSerializer(Class<LocalDateTime> t) {
        super(t);
    }

    @Override
    public void serialize(LocalDateTime value, JsonGenerator gen, SerializerProvider provider) throws IOException {
        // Constants.DATE_FORMAT_SIMPLE is a pattern constant from the answerer's own code.
        gen.writeString(value.format(DateTimeFormatter.ofPattern(Constants.DATE_FORMAT_SIMPLE)));
    }
}
CustomLocalDateTimeDeserializer class:
public class CustomLocalDateTimeDeserializer extends StdDeserializer<LocalDateTime> {

    public CustomLocalDateTimeDeserializer() {
        this(null);
    }

    private CustomLocalDateTimeDeserializer(Class<LocalDateTime> t) {
        super(t);
    }

    @Override
    public LocalDateTime deserialize(JsonParser jsonParser, DeserializationContext ctxt) throws IOException {
        String date = jsonParser.getText();
        try {
            return LocalDateTime.parse(date);
        } catch (Exception ex) {
            // "log" assumes a logger on the class, e.g. Lombok's @Slf4j.
            log.debug("Error while parsing date: {} ", date, ex);
            throw new RuntimeException("Cannot Parse Date");
        }
    }
}
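For the question's LocalDate field specifically, the same idea with a date-only pattern would look roughly like this (a sketch, assuming the entity goes through Jackson as the answer above assumes; the class name is illustrative):

import java.io.IOException;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.ser.std.StdSerializer;

public class LocalDateToStringSerializer extends StdSerializer<LocalDate> {

    public LocalDateToStringSerializer() {
        super(LocalDate.class);
    }

    @Override
    public void serialize(LocalDate value, JsonGenerator gen, SerializerProvider provider) throws IOException {
        // Writes a plain "yyyy-MM-dd" string, matching the "date" format declared in the index mapping.
        gen.writeString(value.format(DateTimeFormatter.ISO_LOCAL_DATE));
    }
}

The joinedDate field would then carry @JsonSerialize(using = LocalDateToStringSerializer.class) alongside the existing @Field mapping.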

Elasticsearch Spring Boot ZonedDateTime converter not mapping correctly

I know a similar question has been asked before, like this one, but I think my problem is a bit different. I have a job board app that uses MongoDB as its primary database; when data enters Mongo it is synced to my Elasticsearch cluster automatically. All my date objects are ZonedDateTime, because even MongoDB does not have a converter for ZonedDateTime. I use this as my converter and store the date field as an object with a dateTime, zoneId and offset; below is how it looks in the database.
And the code responsible for that conversion is as follows:
My Writing converter
@WritingConverter
public class ZonedDateTimeToMongoDocumentConverter implements Converter<ZonedDateTime, Document> {

    static final String DATE_TIME = "dateTime";
    static final String ZONE = "zone";

    @Override
    public Document convert(@Nullable ZonedDateTime zonedDateTime) {
        if (zonedDateTime == null) return null;
        Document document = new Document();
        document.put(DATE_TIME, zonedDateTime.toInstant().getEpochSecond() * 1_000);
        document.put(ZONE, zonedDateTime.getZone().getId());
        document.put("offset", zonedDateTime.getOffset().toString());
        return document;
    }
}
And my reading converter
@ReadingConverter
public class MongoDocumentToZonedDateTimeConverter implements Converter<Document, ZonedDateTime> {

    @Override
    public ZonedDateTime convert(@Nullable Document document) {
        if (document == null) return null;
        // DATE_TIME and ZONE are the same keys used by the writing converter above.
        Date dateTime = document.getDate(DATE_TIME);
        String zoneId = document.getString(ZONE);
        ZoneId zone = ZoneId.of(zoneId);
        return ZonedDateTime.ofInstant(dateTime.toInstant(), zone);
    }
}
And my Codec
public class ZonedDateTimeCodec implements Codec<ZonedDateTime> {

    public static final String DATE_TIME = "dateTime";
    public static final String ZONE = "zone";

    @Override
    public void encode(final BsonWriter writer, final ZonedDateTime value, final EncoderContext encoderContext) {
        writer.writeStartDocument();
        writer.writeDateTime(DATE_TIME, value.toInstant().getEpochSecond() * 1_000);
        writer.writeString(ZONE, value.getZone().getId());
        writer.writeEndDocument();
    }

    @Override
    public ZonedDateTime decode(final BsonReader reader, final DecoderContext decoderContext) {
        reader.readStartDocument();
        long epochSecond = reader.readDateTime(DATE_TIME);
        String zoneId = reader.readString(ZONE);
        reader.readEndDocument();
        return ZonedDateTime.ofInstant(Instant.ofEpochSecond(epochSecond / 1_000), ZoneId.of(zoneId));
    }

    @Override
    public Class<ZonedDateTime> getEncoderClass() {
        return ZonedDateTime.class;
    }
}
This code works great. The issue, though, is when the data enters Elasticsearch: because of the nature of my date fields I had to opt for them to be stored as objects there as well, with the following mapping:
{
  "myDateField": {
    "type": "object",
    "properties": {
      "dateTime": {
        "type": "date"
      },
      "zone": {
        "type": "keyword"
      },
      "offSet": {
        "type": "keyword"
      }
    }
  }
}
PS: I'm new to Elasticsearch; this is in fact my first attempt at it.
Then I create my Spring ZonedDateTime Elasticsearch converters as follows.
Reading converter
@ReadingConverter
@RequiredArgsConstructor
public class StringToZonedDateTimeConverter implements Converter<String, ZonedDateTime> {

    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public ZonedDateTime convert(String source) {
        ZoneDateTime zoneDateTime = objectMapper.readValue(source, ZoneDateTime.class);
        return ZonedDateTime.ofInstant(zoneDateTime.getDateTime().toInstant(), ZoneId.of(zoneDateTime.getZone()));
    }
}
Writing converter
@WritingConverter
@RequiredArgsConstructor
public class ZonedDateTimeToStringConverter implements Converter<ZonedDateTime, String> {

    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public String convert(ZonedDateTime zonedDateTime) {
        return objectMapper.writeValueAsString(ZoneDateTime.builder()
                .dateTime(Date.from(zonedDateTime.toInstant()))
                .zone(zonedDateTime.getZone().getId())
                .offSet(zonedDateTime.getOffset().toString())
                .build());
    }
}
And my ZoneDateTime class looks like:
@SuperBuilder(toBuilder = true)
@RequiredArgsConstructor
@Getter
public class ZoneDateTime {

    private final Date dateTime;
    private final String offSet;
    private final String zone;
}
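For reference, converters like these are typically wired into Spring Data Elasticsearch through an ElasticsearchCustomConversions bean; a minimal sketch, assuming the two converter classes above (the config class name is illustrative):

import java.util.List;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.elasticsearch.core.convert.ElasticsearchCustomConversions;

import com.fasterxml.jackson.databind.ObjectMapper;

@Configuration
public class ElasticsearchConversionConfig {

    // Registers the custom ZonedDateTime converters with the mapping layer.
    @Bean
    public ElasticsearchCustomConversions elasticsearchCustomConversions(ObjectMapper objectMapper) {
        return new ElasticsearchCustomConversions(List.of(
                new ZonedDateTimeToStringConverter(objectMapper),
                new StringToZonedDateTimeConverter(objectMapper)));
    }
}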
From my understanding this should work, but Spring gives me this error:
org.springframework.core.convert.ConversionFailedException: Failed to convert from type [java.lang.String] to type [java.time.ZonedDateTime] for value '2020-08-08T14:32:22.094Z'; nested exception is com.fasterxml.jackson.core.JsonParseException: Unexpected character ('-' (code 45)): Expected space separating root-level values
at [Source: (String)"2020-08-08T14:32:22.094Z"; line: 1, column: 6]
And I'm really not sure why Elasticsearch only sends the dateTime field from the date-field object, leaving out the zone and offset; Elasticsearch ignores the fact that my date field is an object. How can I fix this?

How to properly implement a Spring Converter?

I have a Money class with factory methods for numeric and String values. I would like to use it as a property of my input Pojos.
I created some Converters for it; this is the String one:
@Component
public class StringMoneyConverter implements Converter<String, Money> {

    @Override
    public Money convert(String source) {
        return Money.from(source);
    }
}
My testing Pojo is very simple:
public class MoneyTestPojo {

    private Money value;

    // getter and setter omitted
}
I have an endpoint which expects a Pojo:
@PostMapping("/pojo")
public String savePojo(@RequestBody MoneyTestPojo pojo) {
    //...
}
Finally, this is the request body:
{
value: "100"
}
I have the following error when I try this request:
JSON parse error: Cannot construct instance of br.marcellorvalle.Money (although at least one Creator exists): no String-argument constructor/factory method to deserialize from String value ('100'); nested exception is com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot construct instance of br.marcellorvalle.Money (although at least one Creator exists): no String-argument constructor/factory method to deserialize from String value ('100')
at [Source: (PushbackInputStream); line: 8, column: 19] (through reference chain: br.marcellorvalle.MoneytestPojo["value"])
If I change Money and add a constructor which receives a String, this request works, but I really need a factory method, as I have to deliver special instances of Money in specific cases (zeros, nulls and empty strings).
Am I missing something?
Edit: As asked, here is the Money class:
public class Money {

    public static final Money ZERO = new Money(BigDecimal.ZERO);

    private static final int PRECISION = 2;
    private static final int EXTENDED_PRECISION = 16;
    private static final RoundingMode ROUNDING = RoundingMode.HALF_EVEN;

    private final BigDecimal amount;

    private Money(BigDecimal amount) {
        this.amount = amount;
    }

    public static Money from(float value) {
        return Money.from(BigDecimal.valueOf(value));
    }

    public static Money from(double value) {
        return Money.from(BigDecimal.valueOf(value));
    }

    public static Money from(String value) {
        if (Objects.isNull(value) || "".equals(value)) {
            return null;
        }
        return Money.from(new BigDecimal(value));
    }

    public static Money from(BigDecimal value) {
        if (Objects.requireNonNull(value).equals(BigDecimal.ZERO)) {
            return Money.ZERO;
        }
        return new Money(value);
    }

    // (...)
}
Annotating your factory method with @JsonCreator (from the com.fasterxml.jackson.annotation package) will resolve the issue:
@JsonCreator
public static Money from(String value) {
    if (Objects.isNull(value) || "".equals(value)) {
        return null;
    }
    return Money.from(new BigDecimal(value));
}
I just tested it, and it worked for me. The rest of your code looks fine except for the sample request (the value key should be in quotes), but I guess that's just a typo.
Update 1:
If you're unable to make changes to the Money class, I can think of another option - a custom Jackson deserializer:
public class MoneyDeserializer extends StdDeserializer<Money> {

    private static final long serialVersionUID = 0L;

    public MoneyDeserializer() {
        this(null);
    }

    public MoneyDeserializer(Class<?> vc) {
        super(vc);
    }

    @Override
    public Money deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        JsonNode node = jp.getCodec().readTree(jp);
        String value = node.textValue();
        return Money.from(value);
    }
}
Just register it with your ObjectMapper.
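A minimal sketch of that registration (the class and method names are illustrative; in Spring Boot, exposing the SimpleModule as a @Bean has the same effect on the auto-configured mapper):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.module.SimpleModule;

public class MoneyMapperConfig {

    // Builds an ObjectMapper that routes Money deserialization through MoneyDeserializer.
    public static ObjectMapper moneyAwareMapper() {
        SimpleModule module = new SimpleModule();
        module.addDeserializer(Money.class, new MoneyDeserializer());
        return new ObjectMapper().registerModule(module);
    }
}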
It seems that using the org.springframework.core.convert.converter.Converter only works if the Money class is a @PathVariable in the controller.
I finally solved it using the com.fasterxml.jackson.databind.util.StdConverter class:
I created the following Converter classes:
public class MoneyJsonConverters {

    public static class FromString extends StdConverter<String, Money> {

        @Override
        public Money convert(String value) {
            return Money.from(value);
        }
    }

    public static class ToString extends StdConverter<Money, String> {

        @Override
        public String convert(Money value) {
            return value.toString();
        }
    }
}
Then I annotated the Pojo with @JsonDeserialize and @JsonSerialize accordingly:
public class MoneyTestPojo {

    @JsonSerialize(converter = MoneyJsonConverters.ToString.class)
    @JsonDeserialize(converter = MoneyJsonConverters.FromString.class)
    private Money value;

    // getter and setter omitted
}
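A quick way to sanity-check the converters (a sketch; it assumes Money.toString() prints the plain amount and the Pojo's omitted getter/setter exist):

import com.fasterxml.jackson.databind.ObjectMapper;

public class MoneyPojoRoundTrip {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // Deserialization goes through MoneyJsonConverters.FromString -> Money.from(String).
        MoneyTestPojo pojo = mapper.readValue("{\"value\":\"100\"}", MoneyTestPojo.class);

        // Serialization goes through MoneyJsonConverters.ToString -> Money.toString().
        System.out.println(mapper.writeValueAsString(pojo));
    }
}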

Custom Object as a @RequestParam

I have a paginated endpoint that looks like this /api/clients?range=0-25.
I'd like the getClients() method in my ClientController to directly receive an instance of a custom Range object rather than having to validate a "0-25" String, but I'm having trouble figuring this out.
@Getter
final class Range {

    @Min(0)
    private Integer offset = 0;

    @Min(1)
    private Integer limit = 25;
}
@ResponseBody
@GetMapping(params = { "range" })
public ResponseEntity<?> getAllClients(@RequestParam(value = "range", required = false) QueryRange queryRange, final HttpServletResponse response) {
    ...
}
I'm not sure how to instruct the Controller to correctly deserialize the "0-25" string into the Range...
You can use a Converter<S, T>, as shown below:
@Component
public class RangeConverter implements Converter<String, Range> {

    @Override
    public Range convert(String source) {
        String[] values = source.split("-");
        return new Range(Integer.valueOf(values[0]), Integer.valueOf(values[1]));
    }
}
You could also handle invalid values according to your needs. If you use the above converter as is, the attempt to convert an invalid value such as 1-x will result in a ConversionFailedException.
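A sketch of what explicit validation could look like (the class name and messages are illustrative, and it assumes the same (offset, limit) Range constructor used by the converter above):

import org.springframework.core.convert.converter.Converter;
import org.springframework.stereotype.Component;

@Component
public class ValidatingRangeConverter implements Converter<String, Range> {

    @Override
    public Range convert(String source) {
        String[] values = source.split("-");
        if (values.length != 2) {
            throw new IllegalArgumentException("Range must look like '<offset>-<limit>', e.g. '0-25'");
        }
        try {
            return new Range(Integer.parseInt(values[0]), Integer.parseInt(values[1]));
        } catch (NumberFormatException e) {
            throw new IllegalArgumentException("Range bounds must be integers: " + source, e);
        }
    }
}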
It seems you can also do the following:
public class QueryRangeEditor extends PropertyEditorSupport {

    private static final Pattern PATTERN = Pattern.compile("^([1-9]\\d*|0)-([1-9]\\d*|0)$");

    @Override
    public void setAsText(String text) throws IllegalArgumentException {
        final QueryRange range = new QueryRange();
        Matcher matcher = PATTERN.matcher(text);
        if (matcher.find()) {
            range.setOffset(Integer.valueOf(matcher.group(1)));
            range.setLimit(Integer.valueOf(matcher.group(2)));
        } else {
            throw new IllegalArgumentException("OI"); // todo - replace
        }
        setValue(range);
    }
}
@InitBinder
public void initBinder(WebDataBinder binder) {
    binder.registerCustomEditor(QueryRange.class, new QueryRangeEditor());
}
But @cassiomolin's looks cleaner...

Rest Json Jackson Mapper Custom Object Mapper

I am having an issue with the Jackson JSON mapper which I can't figure out how to solve.
I have a Spring MVC REST application, and the endpoint results are converted to JSON using Jackson.
Some of the result objects contain a type that I want to tamper with before it gets converted.
More specifically, a result object could look like this:
ResultObject
- getDoubleMap() : DoubleMap
- getDoubleEntries() : List<DoubleEntry>
- toMap() : Map<String, Double>
What I want to do is to not have Jackson convert the DoubleMap instance directly, but rather override it like this:
Object someJacksonMapInterceptor(Object object) {
    if (object instanceof DoubleMap) {
        return ((DoubleMap) object).toMap();
    }
    return object;
}
I have tortured Google for quite a while now and have not found a simple solution. Hope someone can advise.
Many thanks in advance.
In one application, we are custom-deserializing dates; perhaps you can use it for your custom deserialization.
public class VitalSign {

    public static final String DATE_FORMAT1 = "yyyy-MM-dd'T'HH:mm:ssZ";
    public static final String DATE_FORMAT2 = "yyyy-MM-dd'T'HH:mm:ss";
    //public static final String DATE_FORMAT3 = "yyyy-MM-dd'T'HH:mm:ssTDZ";
    public static final String DATE_FORMAT4 = "MMM dd, yyyy h:mm:ss aa";

    @NotNull
    @Column(name = "observed")
    @Temporal(TemporalType.TIMESTAMP)
    @DateTimeFormat(style = "M-")
    @JsonDeserialize(using = CustomJsonDateDeserializer.class)
    private Date timestamp;

    public static class CustomJsonDateDeserializer extends JsonDeserializer<Date> {

        public CustomJsonDateDeserializer() {
            super();
        }

        @Override
        public Date deserialize(JsonParser jsonparser, DeserializationContext deserializationcontext) throws IOException {
            SimpleDateFormat[] formats = { new SimpleDateFormat(DATE_FORMAT1), new SimpleDateFormat(DATE_FORMAT2), new SimpleDateFormat(DATE_FORMAT4, Locale.US) };
            String date = jsonparser.getText();
            for (SimpleDateFormat format : formats) {
                try {
                    return format.parse(date);
                } catch (ParseException e) {
                    // fall through and try the next format
                }
            }
            throw new RuntimeException("Unparseable date " + date);
        }
    }
}
For serializing, you can just annotate your toMap() method with @JsonValue. For deserializing, if you have a static factory to create a DoubleMap from a Map<String, Double>, you can just annotate that with @JsonCreator.
private final ObjectMapper mapper = new ObjectMapper();

@Test
public void serialize_doublemap() throws Exception {
    DoubleMap map = new DoubleMap();
    map.put("red", 0.5);
    map.put("orange", 0.7);
    // equivalentTo is assumed to be a JSON-aware matcher (it is not a standard Hamcrest matcher).
    assertThat(mapper.writeValueAsString(map), equivalentTo("{ red: 0.5, orange: 0.7 }"));
}

@Test
public void deserialize_doublemap() throws Exception {
    assertThat(mapper.readValue("{ \"red\": 0.5, \"orange\": 0.7 }", DoubleMap.class).toMap(),
            equalTo(ImmutableMap.of("red", 0.5, "orange", 0.7)));
}
public static class DoubleMap {

    public List<DoubleEntry> entries = new ArrayList<>();

    public void put(String label, double value) {
        entries.add(new DoubleEntry(label, value));
    }

    @JsonCreator
    public static DoubleMap fromJson(Map<String, Double> input) {
        DoubleMap map = new DoubleMap();
        input.forEach(map::put);
        return map;
    }

    public List<DoubleEntry> getDoubleEntries() {
        return entries;
    }

    @JsonValue
    public Map<String, Double> toMap() {
        return entries.stream().collect(Collectors.toMap(e -> e.label, e -> e.value));
    }
}

public static final class DoubleEntry {

    public final String label;
    public final double value;

    public DoubleEntry(String label, double value) {
        this.label = label;
        this.value = value;
    }
}
