Convert String to LocalDateTime with zoned time - spring-boot

I receive a String field in a topic containing a date and an offset, and I need to convert this String to a LocalDateTime by applying the offset. For example, if I receive:
2021-07-20T19:00:00.000+02:00
I want to convert it to the LocalDateTime:
2021-07-20T21:00:00.000
I have a bean with a custom object mapper for this purpose:
@Configuration
public class MyConfiguration {

    @Bean
    public MyCustomObjectMapper configure() {
        final ObjectMapper mapper = new ObjectMapper();
        mapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
        mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

        final DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXXXX");
        final LocalDateTimeDeserializer dateTimeDeserializer = new LocalDateTimeDeserializer(formatter);
        final LocalDateTimeSerializer dateTimeSerializer = new LocalDateTimeSerializer(formatter);

        final JavaTimeModule javaTimeModule = new JavaTimeModule();
        javaTimeModule.addDeserializer(LocalDateTime.class, dateTimeDeserializer);
        javaTimeModule.addSerializer(LocalDateTime.class, dateTimeSerializer);
        mapper.registerModule(javaTimeModule);

        return new MyCustomObjectMapper(mapper);
    }
}
But it doesn't work as I expect: the offset is simply dropped rather than applied, and the resulting LocalDateTime is:
2021-07-20T19:00:00.000
How can I achieve this goal?

The LocalDateTime class is a date-time representation that is unaware of time zones, so it is only logical that the LocalDateTimeDeserializer ignores any time zone information in the source data.
To account for the time zone you could use the InstantDeserializer.OFFSET_DATE_TIME deserializer (DateTimeFormatter.ISO_OFFSET_DATE_TIME is in fact the format of the source date-time you have) and convert its result to a LocalDateTime in the desired zone. This can be wrapped in a custom deserializer for ease of use, e.g.:
class SmartLocalDateTimeDeserializer extends StdDeserializer<LocalDateTime> {

    private final InstantDeserializer<OffsetDateTime> delegate = InstantDeserializer.OFFSET_DATE_TIME;

    public SmartLocalDateTimeDeserializer() {
        super(LocalDateTime.class);
    }

    @Override
    public LocalDateTime deserialize(JsonParser p, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        final OffsetDateTime result = delegate.deserialize(p, ctxt);
        return result.atZoneSameInstant(ZoneId.systemDefault()).toLocalDateTime();
    }
}
...
javaTimeModule.addDeserializer(LocalDateTime.class, new SmartLocalDateTimeDeserializer());
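For completeness, a minimal usage sketch of the deserializer above; the resulting local time depends on the JVM's default zone, so the value in the comment is only an example (it assumes a default zone of UTC+04:00, which is what turns the question's 19:00+02:00 into 21:00):

// Minimal usage sketch (not from the original answer): register the custom
// deserializer and read the example value from the question.
ObjectMapper mapper = new ObjectMapper();
JavaTimeModule module = new JavaTimeModule();
module.addDeserializer(LocalDateTime.class, new SmartLocalDateTimeDeserializer());
mapper.registerModule(module);

LocalDateTime value = mapper.readValue("\"2021-07-20T19:00:00.000+02:00\"", LocalDateTime.class);
// 'value' is the same instant expressed in the JVM's default zone,
// e.g. 2021-07-20T21:00 when that zone happens to be UTC+04:00.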

Related

How do I parse snake case fields in a FeignClient response json?

I have configured a FeignClient in my Spring Boot webapp where I'm calling an external API that returns the following object.
public class Issue {

    private Assignee assignee;
    private Date createdAt;
    private Date updatedAt;
    private Date closedAt;
    private String description;
    private Date dueDate;

    public Assignee getAssignee() {
        return assignee;
    }

    public void setAssignee(Assignee assignee) {
        this.assignee = assignee;
    }

    public String getDescription() {
        return description;
    }

    public void setDescription(String description) {
        this.description = description;
    }

    public Date getDueDate() {
        return dueDate;
    }

    public void setDueDate(Date dueDate) {
        this.dueDate = dueDate;
    }

    public Date getUpdatedAt() {
        return updatedAt;
    }

    public void setUpdatedAt(Date updatedAt) {
        this.updatedAt = updatedAt;
    }

    public Date getClosedAt() {
        return closedAt;
    }

    public void setClosedAt(Date closedAt) {
        this.closedAt = closedAt;
    }

    @Override
    public String toString() {
        return (JacksonJson.toJsonString(this));
    }
}
The fields updatedAt, createdAt and closedAt are all snake case in the JSON response, and all multi-word fields show up as null. Is there any way of configuring the FeignClient's Jackson parser so that it can process snake-case field names? Note that I cannot change the default Jackson parser for my Spring Boot webapp, because I myself render JSON in camel case. I just need to configure this parser on the FeignClient that I'm using to connect to an external REST API.
I have verified that the json response returned from the api call contains valid values in each of these json fields.
Here's how I solved it: I created a custom Jackson decoder as a Spring bean.
@Configuration(proxyBeanMethods = false)
public class FeignClientDateFormatConfig {

    @Bean
    public Decoder feignDecoder() {
        HttpMessageConverter jacksonConverter = new MappingJackson2HttpMessageConverter(customObjectMapper());
        ObjectFactory<HttpMessageConverters> objectFactory = () -> new HttpMessageConverters(jacksonConverter);
        return new ResponseEntityDecoder(new SpringDecoder(objectFactory));
    }

    public ObjectMapper customObjectMapper() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.setPropertyNamingStrategy(PropertyNamingStrategy.SNAKE_CASE);
        objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        return objectMapper;
    }
}
This successfully parses all snake case properties.
Please note that this has a severe limitation: if you have multiple FeignClients and only one of them returns snake-case JSON, then you're out of luck, because this overrides the default FeignClient config. The only workaround possible with this solution is to move your FeignClient calls into a separate microservice so other FeignClient calls are not affected.
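If editing the DTO is an option, a narrower alternative (not part of the original answer) is to put Jackson's @JsonNaming annotation on the response class itself, which keeps the snake_case mapping scoped to that one type and leaves the shared ObjectMapper and other FeignClients untouched:

// Alternative sketch: scope the snake_case handling to the DTO instead of the decoder.
// @JsonNaming comes from com.fasterxml.jackson.databind.annotation.
@JsonNaming(PropertyNamingStrategy.SnakeCaseStrategy.class)
public class Issue {
    // createdAt, updatedAt, closedAt, dueDate etc. are then matched against
    // created_at, updated_at, closed_at, due_date without any Feign-level configuration
}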

Elasticsearch Spring Boot ZonedDateTime converter not mapping correctly

I know a similar question has been asked before, but I think my problem is a bit different. I have a job board app that uses MongoDB as its primary database; when data enters Mongo it is synced to my Elasticsearch cluster automatically. All my date objects are ZonedDateTime, and since even MongoDB does not have a converter for ZonedDateTime, I use my own converters and store the date field as an object with a dateTime, zoneId and offset.
And the code responsible for that conversion is as follows:
My Writing converter
@WritingConverter
public class ZonedDateTimeToMongoDocumentConverter implements Converter<ZonedDateTime, Document> {

    static final String DATE_TIME = "dateTime";
    static final String ZONE = "zone";

    @Override
    public Document convert(@Nullable ZonedDateTime zonedDateTime) {
        if (zonedDateTime == null) return null;

        Document document = new Document();
        document.put(DATE_TIME, zonedDateTime.toInstant().getEpochSecond() * 1_000);
        document.put(ZONE, zonedDateTime.getZone().getId());
        document.put("offset", zonedDateTime.getOffset().toString());
        return document;
    }
}
And my reading converter
@ReadingConverter
public class MongoDocumentToZonedDateTimeConverter implements Converter<Document, ZonedDateTime> {

    // same field names as in the writing converter
    static final String DATE_TIME = "dateTime";
    static final String ZONE = "zone";

    @Override
    public ZonedDateTime convert(@Nullable Document document) {
        if (document == null) return null;

        Date dateTime = document.getDate(DATE_TIME);
        String zoneId = document.getString(ZONE);
        ZoneId zone = ZoneId.of(zoneId);
        return ZonedDateTime.ofInstant(dateTime.toInstant(), zone);
    }
}
And my Codec
public class ZonedDateTimeCodec implements Codec<ZonedDateTime> {

    public static final String DATE_TIME = "dateTime";
    public static final String ZONE = "zone";

    @Override
    public void encode(final BsonWriter writer, final ZonedDateTime value, final EncoderContext encoderContext) {
        writer.writeStartDocument();
        writer.writeDateTime(DATE_TIME, value.toInstant().getEpochSecond() * 1_000);
        writer.writeString(ZONE, value.getZone().getId());
        writer.writeEndDocument();
    }

    @Override
    public ZonedDateTime decode(final BsonReader reader, final DecoderContext decoderContext) {
        reader.readStartDocument();
        long epochSecond = reader.readDateTime(DATE_TIME);
        String zoneId = reader.readString(ZONE);
        reader.readEndDocument();
        return ZonedDateTime.ofInstant(Instant.ofEpochSecond(epochSecond / 1_000), ZoneId.of(zoneId));
    }

    @Override
    public Class<ZonedDateTime> getEncoderClass() {
        return ZonedDateTime.class;
    }
}
This code works great. The issue, though, is when the data enters Elasticsearch: because of the nature of my date fields I had to opt for storing the date fields as objects there as well, with the following mapping:
{
  "myDateField": {
    "type": "object",
    "properties": {
      "dateTime": {
        "type": "date"
      },
      "zone": {
        "type": "keyword"
      },
      "offSet": {
        "type": "keyword"
      }
    }
  }
}
PS: I'm new to Elasticsearch; this is in fact my first attempt at it.
Then I create my Spring ZonedDateTime Elasticsearch converters as follows.
Reading converter:
@ReadingConverter
@RequiredArgsConstructor
public class StringToZonedDateTimeConverter implements Converter<String, ZonedDateTime> {

    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public ZonedDateTime convert(String source) {
        ZoneDateTime zoneDateTime = objectMapper.readValue(source, ZoneDateTime.class);
        return ZonedDateTime.ofInstant(zoneDateTime.getDateTime().toInstant(), ZoneId.of(zoneDateTime.getZone()));
    }
}
Writing converter:
@WritingConverter
@RequiredArgsConstructor
public class ZonedDateTimeToStringConverter implements Converter<ZonedDateTime, String> {

    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public String convert(ZonedDateTime zonedDateTime) {
        return objectMapper.writeValueAsString(ZoneDateTime.builder()
                .dateTime(Date.from(zonedDateTime.toInstant()))
                .zone(zonedDateTime.getZone().getId())
                .offSet(zonedDateTime.getOffset().toString())
                .build());
    }
}
And my ZoneDateTime class looks like this:
@SuperBuilder(toBuilder = true)
@RequiredArgsConstructor
@Getter
public class ZoneDateTime {
    private final Date dateTime;
    private final String offSet;
    private final String zone;
}
From my understanding this should work, but Spring gives me this error:
org.springframework.core.convert.ConversionFailedException: Failed to convert from type [java.lang.String] to type [java.time.ZonedDateTime] for value '2020-08-08T14:32:22.094Z'; nested exception is com.fasterxml.jackson.core.JsonParseException: Unexpected character ('-' (code 45)): Expected space separating root-level values
at [Source: (String)"2020-08-08T14:32:22.094Z"; line: 1, column: 6]
I'm really not sure why Elasticsearch only sends the dateTime field from the date-field object, leaving out the zone and offset; Elasticsearch ignores the fact that my date field is an object. How can I fix this?

Is there a way to configure LocalDate format for serializing and deserializing in the whole spring application?

I have the following problem, and I hope someone can give me a hand:
Context: 3 REST endpoints
Create (register)
Find (findKid)
Report (listDashboardInfo)
Requirement: use the same date format yyyyMMdd for LocalDates in the whole application
Problem: using @DateTimeFormat(pattern = DateUtils.SHORT_DATE_PATTERN) works for register and listDashboardInfo but not for findKid
These are the relevant parts of the code:
BODY
{
  "sailDate": "20191201"
}
@PostMapping(KID_PATH)
@ResponseStatus(HttpStatus.CREATED)
public KidDTO register(@RequestBody @Valid KidDTO kid) {
    return kidService.saveKid(kid);
}
GET /kid/0001::20190901
RESPONSE
{
  "sailDate": "2019-09-01"
}
@GetMapping(KID_FIND_PATH)
public CompletableFuture<KidDTO> findKid(@PathVariable String id) {
    return kidService.findKid(id);
}
GET /kid?shipCode=AL&sailDate=20190901
@GetMapping(KID_LIST_PATH)
public CompletableFuture<Slice<DashboardDTO>> listDashboardInfo(@Valid DashboardFilter filter, Pageable pageable) {
    return kidService.listKidsWithStatistics(filter, pageable);
}
@Getter
@Setter
public class DashboardFilter {
    @NotNull
    @DateTimeFormat(pattern = DateUtils.SHORT_DATE_PATTERN)
    private LocalDate sailDate;
}
@Data
public class KidDTO {
    @NotNull
    @DateTimeFormat(pattern = DateUtils.SHORT_DATE_PATTERN)
    private LocalDate sailDate;
}
Tests I did:
spring.jackson.date-format in application.properties: according to https://blog.codecentric.de/en/2017/08/parsing-of-localdate-query-parameters-in-spring-boot/ this only applies to Date, not LocalDate.
Using @JsonFormat(pattern = DateUtils.SHORT_DATE_PATTERN), listDashboardInfo doesn't recognize the format and produces an error.
On Stack Overflow I also found that Spring doesn't use Jackson to deserialize query params, so:
- I created a @ControllerAdvice with @InitBinder, but the method setAsText is never called:
@ControllerAdvice
public class GlobalDateBinder {

    @InitBinder
    public void binder(WebDataBinder binder) {
        binder.registerCustomEditor(LocalDate.class, new PropertyEditorSupport() {
            @Override
            public void setAsText(String text) throws IllegalArgumentException {
                LocalDate.parse(text, DateUtils.SHORT_DATE_FORMATTER);
            }
        });
    }
}
I also tried with a @Bean public Formatter<LocalDate> localDateFormatter(), but nothing changed:
@Bean
public FormattingConversionService conversionService() {
    DefaultFormattingConversionService conversionService =
            new DefaultFormattingConversionService(false);
    DateTimeFormatterRegistrar registrar = new DateTimeFormatterRegistrar();
    registrar.setDateFormatter(DateUtils.SHORT_DATE_FORMATTER);
    registrar.registerFormatters(conversionService);
    return conversionService;
}

@Bean
public Formatter<LocalDate> localDateFormatter() {
    return new Formatter<LocalDate>() {
        @Override
        public LocalDate parse(String text, Locale locale) {
            return LocalDate.parse(text, DateUtils.SHORT_DATE_FORMATTER);
        }

        @Override
        public String print(LocalDate object, Locale locale) {
            return DateUtils.SHORT_DATE_FORMATTER.format(object);
        }
    };
}
Does anyone have an idea of what is happening?
How can I make the response of findKid use the configured format?
How can I configure the whole application to use the same date format for both serialization and parsing/deserialization?
UPDATE:
I found here https://stackoverflow.com/questions/30871255/spring-boot-localdate-field-serialization-and-deserialization that I can use @JsonFormat for REST controllers (serialize and deserialize) and @DateTimeFormat for Model-View controllers, but using both at the same time fixed my error, and I don't understand that behavior given that I only have REST controllers. It looks like in my case @DateTimeFormat handles deserialization and @JsonFormat handles serialization. Is that the expected behavior? Is there any misconfiguration?
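For concreteness, a minimal sketch of the combination the update describes (assuming DateUtils.SHORT_DATE_PATTERN is the constant "yyyyMMdd"): @DateTimeFormat covers query/path-parameter binding, while @JsonFormat covers the JSON body and response:

// Sketch only, not a confirmed fix: both annotations on the same field.
@Data
public class KidDTO {

    @NotNull
    @DateTimeFormat(pattern = DateUtils.SHORT_DATE_PATTERN) // query/path parameter binding
    @JsonFormat(pattern = DateUtils.SHORT_DATE_PATTERN)     // JSON (de)serialization via Jackson
    private LocalDate sailDate;
}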
You can add this bean to your configuration:
@Bean
public ObjectMapper objectMapper() {
    DateTimeFormatter dateFormatter;     // create your date formatter
    DateTimeFormatter dateTimeFormatter; // create your date and time formatter

    ObjectMapper mapper = new ObjectMapper();
    SimpleModule localDateModule = new SimpleModule();
    localDateModule.addDeserializer(LocalDate.class,
            new LocalDateDeserializer(dateFormatter));
    localDateModule.addSerializer(LocalDate.class,
            new LocalDateSerializer(dateFormatter));
    localDateModule.addDeserializer(LocalDateTime.class,
            new LocalDateTimeDeserializer(dateTimeFormatter));
    localDateModule.addSerializer(LocalDateTime.class,
            new LocalDateTimeSerializer(dateTimeFormatter));
    mapper.registerModules(localDateModule);
    return mapper;
}
Just set the property spring.jackson.date-format to any format you want inside your application.properties or application.yml.
Example with application.properties:
spring.jackson.date-format=yyyyMMdd
Example with application.yml:
spring:
  jackson:
    date-format: yyyyMMdd
Source and other available properties: https://docs.spring.io/spring-boot/docs/current/reference/html/common-application-properties.html

CodecConfigurationException when saving ZonedDateTime to MongoDB with Spring Boot >= 2.0.1.RELEASE

I was able to reproduce my problem with a minimal modification of the official Spring Boot guide for Accessing Data with MongoDB, see https://github.com/thokrae/spring-data-mongo-zoneddatetime.
After adding a java.time.ZonedDateTime field to the Customer class, running the example code from the guide fails with a CodecConfigurationException:
Customer.java:
public String lastName;
public ZonedDateTime created;
public Customer() {
output:
...
Caused by: org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.time.ZonedDateTime.
at org.bson.codecs.configuration.CodecCache.getOrThrow(CodecCache.java:46) ~[bson-3.6.4.jar:na]
at org.bson.codecs.configuration.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:63) ~[bson-3.6.4.jar:na]
at org.bson.codecs.configuration.ChildCodecRegistry.get(ChildCodecRegistry.java:51) ~[bson-3.6.4.jar:na]
This can be solved by changing the Spring Boot version from 2.0.5.RELEASE to 2.0.1.RELEASE in the pom.xml:
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.0.1.RELEASE</version>
</parent>
Now the exception is gone and the Customer objects including the ZonedDateTime fields are written to MongoDB.
I filed a bug (DATAMONGO-2106) with the spring-data-mongodb project but would understand if changing this behaviour is not wanted nor has a high priority.
What is the best workaround? When duckduckgoing for the exception message I find several approaches like registering a custom codec, a custom converter or using Jackson JSR 310. I would prefer to not add custom code to my project to handle a class from the java.time package.
Persisting date time types with time zones was never supported by Spring Data MongoDB, as stated by Oliver Drotbohm himself in DATAMONGO-2106.
These are the known workarounds:
Use a date-time type without a time zone, e.g. java.time.Instant; a minimal sketch of this appears after the codec example below. (It is generally advisable to only use UTC in the backend, but I had to extend an existing code base which was following a different approach.)
Write a custom converter and register it by extending AbstractMongoConfiguration. See the branch converter in my test repository for a running example.
@Component
@WritingConverter
public class ZonedDateTimeToDocumentConverter implements Converter<ZonedDateTime, Document> {

    static final String DATE_TIME = "dateTime";
    static final String ZONE = "zone";

    @Override
    public Document convert(@Nullable ZonedDateTime zonedDateTime) {
        if (zonedDateTime == null) return null;

        Document document = new Document();
        document.put(DATE_TIME, Date.from(zonedDateTime.toInstant()));
        document.put(ZONE, zonedDateTime.getZone().getId());
        document.put("offset", zonedDateTime.getOffset().toString());
        return document;
    }
}

@Component
@ReadingConverter
public class DocumentToZonedDateTimeConverter implements Converter<Document, ZonedDateTime> {

    // same field names as in the writing converter
    static final String DATE_TIME = "dateTime";
    static final String ZONE = "zone";

    @Override
    public ZonedDateTime convert(@Nullable Document document) {
        if (document == null) return null;

        Date dateTime = document.getDate(DATE_TIME);
        String zoneId = document.getString(ZONE);
        ZoneId zone = ZoneId.of(zoneId);
        return ZonedDateTime.ofInstant(dateTime.toInstant(), zone);
    }
}

@Configuration
public class MongoConfiguration extends AbstractMongoConfiguration {

    @Value("${spring.data.mongodb.database}")
    private String database;

    @Value("${spring.data.mongodb.host}")
    private String host;

    @Value("${spring.data.mongodb.port}")
    private int port;

    @Override
    public MongoClient mongoClient() {
        return new MongoClient(host, port);
    }

    @Override
    protected String getDatabaseName() {
        return database;
    }

    @Bean
    public CustomConversions customConversions() {
        return new MongoCustomConversions(asList(
                new ZonedDateTimeToDocumentConverter(),
                new DocumentToZonedDateTimeConverter()
        ));
    }
}
Write a custom codec. At least in theory. My codec test branch is unable to unmarshal the data when using Spring Boot 2.0.5 while working fine with Spring Boot 2.0.1.
public class ZonedDateTimeCodec implements Codec<ZonedDateTime> {

    public static final String DATE_TIME = "dateTime";
    public static final String ZONE = "zone";

    @Override
    public void encode(final BsonWriter writer, final ZonedDateTime value, final EncoderContext encoderContext) {
        writer.writeStartDocument();
        writer.writeDateTime(DATE_TIME, value.toInstant().getEpochSecond() * 1_000);
        writer.writeString(ZONE, value.getZone().getId());
        writer.writeEndDocument();
    }

    @Override
    public ZonedDateTime decode(final BsonReader reader, final DecoderContext decoderContext) {
        reader.readStartDocument();
        long epochSecond = reader.readDateTime(DATE_TIME);
        String zoneId = reader.readString(ZONE);
        reader.readEndDocument();
        return ZonedDateTime.ofInstant(Instant.ofEpochSecond(epochSecond / 1_000), ZoneId.of(zoneId));
    }

    @Override
    public Class<ZonedDateTime> getEncoderClass() {
        return ZonedDateTime.class;
    }
}

@Configuration
public class MongoConfiguration extends AbstractMongoConfiguration {

    @Value("${spring.data.mongodb.database}")
    private String database;

    @Value("${spring.data.mongodb.host}")
    private String host;

    @Value("${spring.data.mongodb.port}")
    private int port;

    @Override
    public MongoClient mongoClient() {
        return new MongoClient(host + ":" + port, createOptions());
    }

    private MongoClientOptions createOptions() {
        CodecProvider pojoCodecProvider = PojoCodecProvider.builder()
                .automatic(true)
                .build();
        CodecRegistry registry = CodecRegistries.fromRegistries(
                createCustomCodecRegistry(),
                MongoClient.getDefaultCodecRegistry(),
                CodecRegistries.fromProviders(pojoCodecProvider)
        );
        return MongoClientOptions.builder()
                .codecRegistry(registry)
                .build();
    }

    private CodecRegistry createCustomCodecRegistry() {
        return CodecRegistries.fromCodecs(
                new ZonedDateTimeCodec()
        );
    }

    @Override
    protected String getDatabaseName() {
        return database;
    }
}
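And, for reference, the first workaround needs no custom code at all. A minimal sketch, based on the guide's Customer class with the remaining fields omitted (Instant values should be persisted by Spring Data MongoDB's built-in JSR-310 converters as BSON dates):

// Sketch of the first workaround: store a UTC instant instead of a zoned value.
public class Customer {

    public String lastName;
    public Instant created;   // was ZonedDateTime; convert at the edges, e.g. created.atZone(zone)
}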

Validate input before Jackson in Spring Boot

I've built a REST endpoint using Spring Boot. JSON is posted to the endpoint. Jackson converts the JSON giving me an object.
The JSON looks like this:
{
  "parameterDateUnadjusted": "2017-01-01",
  "parameterDateAdjusted": "2017-01-02"
}
Jackson converts the JSON to an object based on this class:
public class ParameterDate {

    @NotNull(message = "Parameter Date Unadjusted can not be blank or null")
    @DateTimeFormat(pattern = "yyyy-MM-dd")
    private Date parameterDateUnadjusted;

    @NotNull(message = "Parameter Date Adjusted can not be blank or null")
    @DateTimeFormat(pattern = "yyyy-MM-dd")
    private Date parameterDateAdjusted;

    private Date parameterDateAdded;
    private Date parameterDateChanged;
}
This all works fine. The issue I'm having is that I would like to validate the data before Jackson converts it. For instance, if I post:
{
  "parameterDateUnadjusted": "2017-01-01",
  "parameterDateAdjusted": "2017-01-40"
}
where parameterDateAdjusted is not a valid date (there is no month with 40 days in it), Jackson converts this to 2017-02-09. One way of getting around this is to have a class that contains only strings, let's call it ParameterDateInput, validate each field with Hibernate Validator on the parameterDateInput object, and then copy the parameterDateInput object to parameterDate, where each field has the correct type (dates are of type Date and not of type String). This doesn't look like a very elegant solution to me. Is there some other way I can solve this? How is data generally validated in Spring Boot when posted as JSON? I'd like to be able to send back a message to the user/client saying what is wrong with the data that is being posted.
How about a custom JSON deserializer where you can write down the logic you want:
@RestController
public class JacksonCustomDesRestEndpoint {

    @RequestMapping(value = "/yourEndPoint", method = RequestMethod.POST,
            consumes = MediaType.APPLICATION_JSON_VALUE, produces = MediaType.APPLICATION_JSON_VALUE)
    @ResponseBody
    public Object createRole(@RequestBody ParameterDate paramDate) {
        return paramDate;
    }
}
@JsonDeserialize(using = RoleDeserializer.class)
public class ParameterDate {
    // ......
}
public class RoleDeserializer extends JsonDeserializer<ParameterDate> {

    @Override
    public ParameterDate deserialize(JsonParser jsonParser, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        ObjectCodec oc = jsonParser.getCodec();
        JsonNode node = oc.readTree(jsonParser);
        String parameterDateUnadjusted = node.get("parameterDateUnadjusted").asText();
        // Do what you want with the date, set it on an object of type ParameterDate and return that object at the end.
        // Don't forget to fill all the properties of this object, because you do not want to lose data that came from the request.
        return something;
    }
}
There is a way to check the dates: the setLenient() method.
public static boolean isValidDate(String inDate, String format) {
    SimpleDateFormat dateFormat = new SimpleDateFormat(format);
    dateFormat.setLenient(false);
    try {
        dateFormat.parse(inDate.trim());
    } catch (ParseException pe) {
        return false;
    }
    return true;
}
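With lenient parsing disabled, the rolled-over date from the question is rejected instead of silently becoming 2017-02-09, for example:

// Illustration of the helper above with the values from the question.
isValidDate("2017-01-02", "yyyy-MM-dd"); // true
isValidDate("2017-01-40", "yyyy-MM-dd"); // false - day 40 does not exist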
Just define your own annotation to validate the value:
@Target({ FIELD, METHOD, PARAMETER, ANNOTATION_TYPE })
@Retention(RUNTIME)
@Constraint(validatedBy = MyDateFormatCheckValidator.class)
@Documented
public @interface MyDateFormatCheck {
    String pattern();
    ...
and the validator class
public class MyDateFormatCheckValidator implements ConstraintValidator<MyDateFormatCheck, String> {

    private MyDateFormatCheck check;

    @Override
    public void initialize(MyDateFormatCheck constraintAnnotation) {
        this.check = constraintAnnotation;
    }

    @Override
    public boolean isValid(String object, ConstraintValidatorContext constraintContext) {
        if (object == null) {
            return true;
        }
        return isValidDate(object, check.pattern());
    }
}
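A hypothetical usage sketch tying the pieces together: the DTO keeps the raw strings (like the ParameterDateInput idea from the question), and the custom constraint rejects impossible dates before any conversion happens; the class and field names follow the question, the rest is an assumption:

// Hypothetical DTO using the custom constraint; conversion to Date happens only after validation passes.
public class ParameterDateInput {

    @NotNull
    @MyDateFormatCheck(pattern = "yyyy-MM-dd")
    private String parameterDateUnadjusted;

    @NotNull
    @MyDateFormatCheck(pattern = "yyyy-MM-dd")
    private String parameterDateAdjusted;
}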
