Spring Data Elasticsearch Repository query: define date input parameter format

I am using elasticsearch 6.5.3 and Spring Boot 2.1.6 and spring-data-elasticsearch 3.2.0.M1.
I have defined the Elasticsearch configuration as:
@Bean
public ElasticsearchOperations elasticsearchTemplate() {
    return new ElasticsearchRestTemplate(client(), new CustomEntityMapper());
}

public static class CustomEntityMapper implements EntityMapper {

    private final ObjectMapper objectMapper;

    public CustomEntityMapper() {
        // we use this so that Elasticsearch understands LocalDate and LocalDateTime objects
        objectMapper = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES)
                .enable(DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY)
                .disable(DeserializationFeature.READ_DATE_TIMESTAMPS_AS_NANOSECONDS)
                .disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS)
                // MUST be registered BEFORE calling findAndRegisterModules
                .registerModule(new JavaTimeModule())
                .registerModule(new Jdk8Module());
        // only autodetect fields and ignore getters and setters for nonexistent fields when serializing/deserializing
        objectMapper.setVisibility(objectMapper.getSerializationConfig().getDefaultVisibilityChecker()
                .withFieldVisibility(JsonAutoDetect.Visibility.ANY)
                .withGetterVisibility(JsonAutoDetect.Visibility.NONE)
                .withSetterVisibility(JsonAutoDetect.Visibility.NONE)
                .withCreatorVisibility(JsonAutoDetect.Visibility.NONE));
        // load the other available modules as well
        objectMapper.findAndRegisterModules();
    }

    @Override
    public String mapToString(Object object) throws IOException {
        return objectMapper.writeValueAsString(object);
    }

    @Override
    public <T> T mapToObject(String source, Class<T> clazz) throws IOException {
        return objectMapper.readValue(source, clazz);
    }
}
I have a repository with a method defined as:
List<AccountDateRollSchedule> findAllByNextRollDateTimeLessThanEqual(final LocalDateTime dateTime);
And the POJO AccountDateRollSchedule defines that field as:
@Field(type = FieldType.Date, format = DateFormat.date_hour_minute)
@DateTimeFormat(pattern = "yyyy-MM-dd'T'HH:mm")
@JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd'T'HH:mm")
private LocalDateTime nextRollDateTime;
I see my index properly has that field created as declared and expected:
"nextRollDateTime": {
"type": "date",
"format": "date_hour_minute"
}
Also querying the index returns the field formatted as expected:
"nextRollDateTime" : "2019-06-27T13:34"
My repository query would translate to:
{
  "query": {
    "bool": {
      "must": {
        "range": {
          "nextRollDateTime": {
            "from": null,
            "to": "?0",
            "include_lower": true,
            "include_upper": true
          }
        }
      }
    }
  }
}
But passing any LocalDateTime input to the method does NOT respect the format defined for the field; the FULL format is always used instead. Invoking:
findAllByNextRollDateTimeLessThanEqual(LocalDateTime.now(ZoneOffset.UTC).truncatedTo(ChronoUnit.MINUTES));
gives me the following exception (any @DateTimeFormat or @JsonFormat annotation on the method parameter in the repository is ignored):
Unrecognized chars at the end of [2019-07-22T09:07:00.000]: [:00.000]
If I instead change the repository method to accept a String, and pass in a String formatted exactly as expected, it works without a problem.
Is it possible to somehow define the format used for the date parameter passed in input to the repository method or have Spring use the one configured on the field itself?
I would prefer not to wrap that method just for a simple conversion like this (I did and it works), and I would also like to avoid using a long type for the date field.
Thanks and cheers
For reference, I also opened an issue in the Spring JIRA.

These problems are one reason why we moved away from using and exposing the Jackson mapper in Spring Data Elasticsearch. From version 4.0 on, all you need on your property is this one annotation:
@Field(type = FieldType.Date, format = DateFormat.date_hour_minute)
private LocalDateTime nextRollDateTime;
This will then be used when writing the index mappings, when entities are indexed and retrieved, and also when repository methods and queries are processed.
But for the 3.2.x version you will have to use a workaround like the wrapping you mentioned.
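For example, the wrapping can live in the repository itself as a default method, so callers still pass a LocalDateTime. This is only a sketch: the repository name and ID type are assumed here, and the derived query keeps the String parameter that is known to work.

import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.List;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;

public interface AccountDateRollScheduleRepository extends ElasticsearchRepository<AccountDateRollSchedule, String> {

    // formatter matching the date_hour_minute format declared on the field
    DateTimeFormatter NEXT_ROLL_FORMAT = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm");

    // derived query; works because the parameter is already a correctly formatted String
    List<AccountDateRollSchedule> findAllByNextRollDateTimeLessThanEqual(String dateTime);

    // wrapper so callers can keep passing a LocalDateTime (Spring Data ignores default methods for query derivation)
    default List<AccountDateRollSchedule> findAllByNextRollDateTimeLessThanEqual(LocalDateTime dateTime) {
        return findAllByNextRollDateTimeLessThanEqual(NEXT_ROLL_FORMAT.format(dateTime));
    }
}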

Related

Spring Boot: Failed to save List<String>

I'm trying to save a List<String> in Spring Boot but I have failed to do it.
This is the entity:
@Column(name = "services")
@ElementCollection
@NotBlank
private List<String> services = new ArrayList<String>();
Sending the request from Postman results in this error:
No validator could be found for constraint 'javax.validation.constraints.NotBlank' validating type 'java.util.List<java.lang.String>'
I edited the save logic like this:
List<String> lstServices= new ArrayList<>();
carePost.getServices().forEach(item -> {
lstServices.add(item);
});
carePost.setServices(lstServices);
but I still get the same error.
Controller method :
@PostMapping("/addcarepost")
public ResponseEntity<?> createCarePost(@RequestBody CarePost carePost) {
    carePost.setCareserviceId(carePost.getCareserviceId());
    List<String> lstServices = new ArrayList<>();
    carePost.getService().forEach(item -> {
        lstServices.add(item);
    });
    carePost.setService(lstServices);
    carePostService.save(carePost);
    return ResponseEntity.ok(new MessageResponse("CarePost registered successfully!"));
}
ServiceImpl :
@Override
public CarePost save(CarePost cp) {
    return carePostRepository.save(cp);
}
You should not use @NotBlank because:
@NotBlank can be applied only to text values and validates that the property is not null or whitespace.
Instead you should use @NotEmpty because:
@NotEmpty validates that the property is not null or empty; it can be applied to String, Collection, Map or Array values.
If your goal is to verify that the collection contains no empty values, you can create a custom validator as shown here, or you can use List<@NotBlank String> preferences; (see the sketch below).
Putting @NotBlank (or any other validation annotation) at the property level validates the property itself; if you instead want to validate the elements of the property (in this case a collection), you have to put the annotation at the element level (as shown above).
Source: Javax Validation
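A minimal sketch of how the entity field could look with both levels of validation, assuming Bean Validation 2.0 (shipped with Spring Boot 2) for the container element constraint:

@ElementCollection
@Column(name = "services")
@NotEmpty // the list itself must not be null or empty
private List<@NotBlank String> services = new ArrayList<>(); // each element must not be blank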

Mongotemplate: How to convert result field to custom Java Type?

// collection: test
{
...
Datetime: 43665.384931
...
}
public class POJO {
    @Field("ID")
    private String id;

    @Field("Datetime")
    private Date datetime; // Where can I implement a converter to cast the double value from Mongo to the Java type Date here?
}
mongoTemplate.findOne(new Query(), POJO.class, "test")
Where can I implement a converter to cast double value from mongo to Java type Date here?
You may want to give @Field(targetType = FieldType.INT64) of the upcoming Spring Data MongoDB 2.2 release a try. It allows you to pass type information to the conversion subsystem, which uses the ConversionService to perform the required transformations.
class Pojo {
    String id;
    @Field(targetType = FieldType.INT64) Date date;
}
At the time of writing there are only converters registered for a Date -> String conversion, but none for Date -> Long, so you'll need to register the converter as well.
((GenericConversionService) mongoTemplate.getConverter().getConversionService())
        .addConverter(new Converter<Date, Long>() {
            @Override
            public Long convert(Date source) {
                return source.getTime();
            }
        });
Registering a MongoCustomConversions bean works for me.
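For example, a sketch of that approach (assuming the bean is picked up by Spring Boot's Mongo auto-configuration and applied to the MappingMongoConverter):

import java.util.Collections;
import java.util.Date;
import org.springframework.context.annotation.Bean;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;

@Bean
public MongoCustomConversions mongoCustomConversions() {
    // registers the Date -> Long converter needed by @Field(targetType = FieldType.INT64)
    return new MongoCustomConversions(Collections.singletonList(new Converter<Date, Long>() {
        @Override
        public Long convert(Date source) {
            return source.getTime();
        }
    }));
}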

Spring boot - date in response is not well formatted

I have the following entity column definition:
@Column(name = "time")
@Temporal(TemporalType.TIMESTAMP)
private java.util.Calendar time;
When I query and return my data as JSON:
modules = this.moduleStatsRepository.findAll();
JsonArray modulesArray = Application.gson.fromJson(Application.gson.toJson(modules), JsonArray.class);
JsonObject modulesJson = new JsonObject();
modulesJson.add("modules", modulesArray);
modulesJson.addProperty("triggerTimeShortSec", configurationManager.startupConfig.get("stats_trigger_time_sec"));
modulesJson.addProperty("triggerTimeLongSec", Integer.parseInt(configurationManager.startupConfig.get("stats_trigger_time_sec")) * 3);
return Application.gson.toJson(modulesJson);
the time is returned as a serialized Calendar object (all of its individual fields) rather than a readable date string, which is not really ideal.
Is there any way to customize gson settings to parse dates as ISO 8601?
Many of these things come out of the box with Jackson. With Gson it doesn't seem that there is an option to configure ISO 8601 timestamps, so you'll have to write it yourself by registering a JsonSerializer<Calendar> and perhaps also a JsonDeserializer<Calendar>.
For example, a simplified ISO 8601 string to calendar converter could look like this:
public class CalendarISO8601Serializer implements JsonSerializer<Calendar>, JsonDeserializer<Calendar> {
    private static final SimpleDateFormat FORMATTER = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'Z'");

    @Override
    public Calendar deserialize(JsonElement jsonElement, Type type, JsonDeserializationContext jsonDeserializationContext) throws JsonParseException {
        try {
            Calendar instance = Calendar.getInstance();
            instance.setTime(FORMATTER.parse(jsonElement.getAsString()));
            return instance;
        } catch (ParseException e) {
            throw new JsonParseException(e);
        }
    }

    @Override
    public JsonElement serialize(Calendar calendar, Type type, JsonSerializationContext jsonSerializationContext) {
        return new JsonPrimitive(FORMATTER.format(calendar.getTime()));
    }
}
This means you can no longer rely on the default Gson object created by Spring Boot, since I don't think it will automatically pick up the serializer as a type adapter. To solve this, you need to create your own Gson bean and add the serializer:
@Bean
public Gson gson() {
    return new GsonBuilder()
            .registerTypeHierarchyAdapter(Calendar.class, new CalendarISO8601Serializer())
            .create();
}
But considering that you're using a public static field (Application.gson), you may want to see for yourself how you want to register that adapter.
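For example, if Application.gson is simply built in a static initializer, the adapter could be registered there instead (a hypothetical sketch, since the question does not show how that field is created):

public class Application {
    // hypothetical: register the Calendar adapter on the manually created static Gson instance
    public static final Gson gson = new GsonBuilder()
            .registerTypeHierarchyAdapter(Calendar.class, new CalendarISO8601Serializer())
            .create();
}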
Just use java.util.Date as the type in your entity:
@Temporal(TemporalType.TIMESTAMP)
private Date date;

Id field handling in Spring Data Mongo for child objects

I have been working in Spring Boot with the Spring Data MongoDB project and I am seeing behavior I am not clear on. I understand that the id field will go to _id in the Mongo repository per http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#mapping.conventions.id-field. My problem is that it also seems to be happening for child entities which does not seem correct.
For example I have these classes (leaving out setters and getters for brevity) :
public class MessageBuild {
    @Id
    private String id;
    private String name;
    private TopLevelMessage.MessageType messageType;
    private TopLevelMessage message;
}

public interface TopLevelMessage {
    public enum MessageType {
        MapData
    }
}

public class MapData implements TopLevelMessage {
    private String layerType;
    private Vector<Intersection> intersections;
    private Vector<RoadSegment> roadSegments;
}

public class RoadSegment {
    private int id;
    private String name;
    private Double laneWidth;
}
When I create an object graph from these classes and save it with the appropriate MongoRepository, I end up with a document like this:
{
    "_id" : ObjectId("57c0c05568a6c4941830a626"),
    "_class" : "com.etranssystems.coreobjects.persistable.MessageBuild",
    "name" : "TestMessage",
    "messageType" : "MapData",
    "message" : {
        "layerType" : "IntersectionData",
        "roadSegments" : [
            {
                "_id" : 2001,
                "name" : "Road Segment 1",
                "laneWidth" : 3.3
            }
        ]
    }
}
In this case a child object with a field named id has its mapping converted to _id in the MongoDB repository. Not the end of the world although not expected. The biggest problem is now that this is exposed by REST MVC the _id fields are not returned from a query. I have tried to set the exposeIdsFor in my RepositoryRestConfigurerAdapter for this class and it exposes the id for the top level document but not the child ones.
So circling around the 2 questions/issues I have are:
Why are child object fields mapped to _id? My understanding is that this should only happen on the top level since things underneath are not really documents in their own right.
Shouldn't the configuration to expose id fields work for child objects in a document if it is mapping the field names?
Am I wrong to think that RoadSegment does not contain a getId() ? From Spring's documentation:
A property or field without an annotation but named id will be mapped to the _id field.
I believe Spring Data does this even to nested classes, when it finds an id field. You may either add a getId(), so that the field is named id, or annotate it with @Field:
public class RoadSegment {
    @Field("id")
    private int id;
    private String name;
    private Double laneWidth;
}
I agree that this automatic conversion of id/_id should only be done at the top level.
However, the way the Spring Data Mongo conversion is coded, all Java objects go through the exact same code to be converted into JSON (both top-level and nested objects):
public class MappingMongoConverter {
    ...
    protected void writeInternal(Object obj, final DBObject dbo, MongoPersistentEntity<?> entity) {
        ...
        if (!dbo.containsField("_id") && null != idProperty) {
            try {
                Object id = accessor.getProperty(idProperty);
                dbo.put("_id", idMapper.convertId(id));
            } catch (ConversionException ignored) {}
        }
        ...
        if (!conversions.isSimpleType(propertyObj.getClass())) {
            // The following line recursively calls writeInternal with the nested object
            writePropertyInternal(propertyObj, dbo, prop);
        } else {
            writeSimpleInternal(propertyObj, dbo, prop);
        }
    }
}
writeInternal is called on the top-level object and then called again recursively for each nested object that is not a simple type. So they all go through the same logic of adding _id.
Perhaps this is how we should read Spring's documentation:
Mongo's restrictions on Mongo Documents:
MongoDB requires that you have an _id field for all documents. If you
don’t provide one the driver will assign a ObjectId with a generated
value.
Spring Data's restrictions on java classes:
If no field or property specified above is present in the Java class then an implicit _id field will be generated by the driver but not mapped to a property or field of the Java class.

Java 8 Date Time api in JPA

What is the best way to integrate the Java 8 Date and Time API in JPA?
I have added converters:
@Converter(autoApply = true)
public class LocalDatePersistenceConverter implements AttributeConverter<LocalDate, Date> {
    @Override
    public Date convertToDatabaseColumn(LocalDate localDate) {
        return Date.valueOf(localDate);
    }

    @Override
    public LocalDate convertToEntityAttribute(Date date) {
        return date.toLocalDate();
    }
}
and
@Converter(autoApply = true)
public class LocalDateTimePersistenceConverter implements AttributeConverter<LocalDateTime, Timestamp> {
    @Override
    public Timestamp convertToDatabaseColumn(LocalDateTime entityValue) {
        return Timestamp.valueOf(entityValue);
    }

    @Override
    public LocalDateTime convertToEntityAttribute(Timestamp databaseValue) {
        return databaseValue.toLocalDateTime();
    }
}
Everything seems fine, but how should I use JPQL for querying? I am using a Spring Data JpaRepository, and the goal is to select all entities where the date is the same as the given date; the only difference is that it is saved in the entity as a LocalDateTime.
So:
public class Entity {
private LocalDateTime dateTime;
...
}
And:
@Query("select case when (count(e) > 0) then true else false end from Entity e where e.dateTime = :date")
public boolean check(@Param("date") LocalDate date);
When executing it, I just get an exception, which is understandable:
Caused by: java.lang.IllegalArgumentException: Parameter value [2014-01-01] did not match expected type [java.time.LocalDateTime (n/a)]
I have tried many ways, but it seems that none is working, is that even possible?
Hibernate has an extension library, hibernate-java8 I believe, which natively supports many of the time types.
You should use it before writing converters.
In Hibernate 5.2 you won't need this additional library; it is part of the core.
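For illustration, a sketch with a hypothetical entity showing that on Hibernate 5.2+ (or with the hibernate-java8 add-on on older versions) the java.time types map without any AttributeConverter:

@Entity
public class Meeting { // hypothetical entity name
    @Id
    @GeneratedValue
    private Long id;

    // mapped natively by Hibernate, stored as a SQL timestamp
    private LocalDateTime dateTime;
}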
To query temporal fields you should use the @Temporal annotation on the temporal fields, add the converters to persistence.xml, and also make sure you are using java.sql.Date, java.sql.Time or java.sql.Timestamp in the converters (sometimes I imported from the wrong package).
For example, this works for me:
@Temporal(TemporalType.TIMESTAMP)
@Convert(converter = InstantPersistenceConverter.class)
private Instant StartInstant;

@Temporal(TemporalType.TIME)
@Convert(converter = LocalTimePersistenceConverter.class)
private LocalTime StartTime;
and my Instant converter:
@Converter(autoApply = true)
public class InstantPersistenceConverter implements AttributeConverter<Instant, java.sql.Timestamp> {
    @Override
    public java.sql.Timestamp convertToDatabaseColumn(Instant entityValue) {
        return java.sql.Timestamp.from(entityValue);
    }

    @Override
    public Instant convertToEntityAttribute(java.sql.Timestamp databaseValue) {
        return databaseValue.toInstant();
    }
}
Did you add LocalDatePersistenceConverter and LocalDateTimePersistenceConverter to persistence.xml inside <class> elements?
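For reference, a sketch of what those entries might look like in persistence.xml (the package names are placeholders):

<persistence-unit name="my-unit">
    <class>com.example.converter.LocalDatePersistenceConverter</class>
    <class>com.example.converter.LocalDateTimePersistenceConverter</class>
    ...
</persistence-unit>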
