Change datetime string format in request URL when using Spring Data REST with QuerydslPredicateExecutor

In my domain class I have a field:
public class Reservation {
private LocalDateTime created = LocalDateTime.now();
In my repository I want to find only Reservations on a specific date (the time doesn't matter):
public interface ReservationRepository extends Repository<Reservation, Long>, QuerydslPredicateExecutor<Reservation>, QuerydslBinderCustomizer<QReservation> {
@Override
default void customize(QuerydslBindings bindings, QReservation root) {
bindings.bind(root.created).first((path, value) -> path.between(value.withMinute(0).withHour(0), value.withMinute(0).withHour(0).plusDays(1).minusSeconds(1)));
}
}
Now it works with this url:
/reservations?created=01/20/16 00:00 AM
But I want to use this datetime format instead:
2016-01-20T00:00
As I understand it, the problem is that Spring Boot uses RepositoryRestMvcConfiguration for auto-configuration, and by default TemporalAccessorParser falls back to a default DateTimeFormatter. I want to change it to
DateTimeFormatter.ISO_LOCAL_DATE_TIME

If the @DateTimeFormat annotation alone did not help, try adding a custom Converter to the project:
public class CustomStringToLocalDateTime implements Converter<String, LocalDateTime> {
@Override
public LocalDateTime convert(String source) {
// LocalDateTime.parse uses DateTimeFormatter.ISO_LOCAL_DATE_TIME by default,
// so "2016-01-20T00:00" is accepted.
return LocalDateTime.parse(source);
}
}
@Configuration
public class RepoRestConfig extends RepositoryRestConfigurerAdapter {
@Override
public void configureConversionService(ConfigurableConversionService conversionService) {
conversionService.addConverter(String.class, LocalDateTime.class, new CustomStringToLocalDateTime());
super.configureConversionService(conversionService);
}
}
This approach works in my project (except that I had to convert the string representation of a date, 'yyyy-MM-dd', to an Instant, 'yyyy-MM-ddTHH:mm:ssZ').
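For that Instant case, a minimal sketch of what such a converter could look like (the class name and the UTC start-of-day interpretation are assumptions for illustration, not from the original answer):
public class CustomStringToInstant implements Converter<String, Instant> {
@Override
public Instant convert(String source) {
// Assumes the incoming value is a plain date such as "2016-01-20"
// and interprets it as the start of that day in UTC.
return LocalDate.parse(source).atStartOfDay(ZoneOffset.UTC).toInstant();
}
}
It would be registered via configureConversionService in the same way as the LocalDateTime converter above.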

Related

JSON field deserializing to lowercase in Spring Boot

I have a Spring Boot controller:
@RestController
public class UserController {
@PostMapping
@ResponseStatus(CREATED)
public UserResponse register(@Valid @RequestBody UserRequest userRequest) {
//return ....
}
}
Below is UserRequest.java
@Data
@NoArgsConstructor
@AllArgsConstructor
@Builder
public class UserRequest {
private String email;
//other property
}
I am sending the below JSON in the request body:
{
"email" : "TEST@Example.com",
//some other fields.
}
Sometimes the client sends the email in uppercase or camel case, so while deserializing to the UserRequest object I want to change the value of the email field to lowercase, e.g. test@example.com.
Is there an easy way to do this? Can I introduce my own annotation like @ToLowerCase? How can I create my own annotation and use it at field level in UserRequest?
There is no easy way just by introducing a new annotation @ToLowerCase,
because then you would also need to implement some annotation processor
for doing the real conversion work.
But you can achieve your goal in a slightly different way.
In your UserRequest class annotate the email property
with @JsonDeserialize and specify a converter there.
@JsonDeserialize(converter = ToLowerCaseConverter.class)
private String email;
You need to implement the converter class by yourself,
but it is easy by extending it from StdConverter.
public class ToLowerCaseConverter extends StdConverter<String, String> {
@Override
public String convert(String value) {
return value.toLowerCase();
}
}
Jackson will use the setter methods in your class.
Perform the conversion to lower case in the setter.
For example
public void setEmail(String newValue)
{
email = StringUtils.lowerCase(newValue);
}
StringUtils is an Apache Commons Lang class; its lowerCase method is null-safe.
You can make a general StringDeserializer and register it in the ObjectMapper as shown below:
StringDeserializer class
public final class StringDeserializer extends StdDeserializer<String> {
public StringDeserializer() {
super((Class<String>) null);
}
@Override
public String deserialize(JsonParser parser, DeserializationContext context) throws IOException {
JsonToken token = parser.getCurrentToken();
if (token == JsonToken.VALUE_STRING) {
String text = parser.getText();
return text == null ? null : text.toLowerCase().trim();
}
return null;
}
}
JacksonConfiguration class
@Configuration
public class JacksonConfiguration {
@Autowired
void mapper(ObjectMapper mapper) {
mapper.registerModule(initModule());
}
private Module initModule() {
SimpleModule module = new SimpleModule();
module.addDeserializer(String.class, new StringDeserializer());
return module;
}
}
The above code makes Jackson deserialize every String value as lowercased and trimmed.
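As a quick sanity check (a hypothetical standalone snippet, not part of the original answer), registering the same module on a plain ObjectMapper shows the effect:
// inside a test method that declares throws Exception
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(new SimpleModule().addDeserializer(String.class, new StringDeserializer()));
UserRequest request = mapper.readValue("{\"email\": \" TEST@Example.com \"}", UserRequest.class);
// request.getEmail() now returns "test@example.com"
Keep in mind that this lowercases every String field of every deserialized type, so it is a much blunter instrument than the field-level @JsonDeserialize converter shown earlier.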

Spring Data Rest and Oracle Between 2 dates query giving nothing

Spring Data REST is not able to fetch the data between 2 dates from the database table.
Collection<XXXX> findByCreatedOnBetween(LocalDate fromDate, LocalDate todayDate);
From the bean:
private LocalDate createdOn;
I want the data between 2 dates:
SELECT
*
FROM
testing testing
WHERE
testing.created_on BETWEEN ? AND ? ;
I believe Spring Data REST accepts only the ISO 8601 date format by default (e.g. 2018-10-22).
If you want to accept the date in a different format, you need to add a converter.
@Configuration
public class RepositoryRestConfig extends RepositoryRestConfigurerAdapter {
@Autowired
CustomDateConverter customDateConverter;
@Override
public void configureConversionService(ConfigurableConversionService conversionService) {
conversionService.addConverter(customDateConverter);
super.configureConversionService(conversionService);
}
}
@Component
public class CustomDateConverter implements Converter<String, LocalDate> {
@Override
public LocalDate convert(String source) {
return LocalDate.from(DateTimeFormatter.ofPattern("dd-MMM-yy").parse(source));
}
}
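As an alternative (a sketch of my own, not part of the original answer), if the ISO format is acceptable in the URL, the repository method parameters can be annotated with @Param and @DateTimeFormat so that Spring Data REST exposes and parses them without a custom converter:
Collection<XXXX> findByCreatedOnBetween(@Param("fromDate") @DateTimeFormat(iso = DateTimeFormat.ISO.DATE) LocalDate fromDate,
@Param("todayDate") @DateTimeFormat(iso = DateTimeFormat.ISO.DATE) LocalDate todayDate);
The search could then be called as .../search/findByCreatedOnBetween?fromDate=2018-10-01&todayDate=2018-10-22.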

CodecConfigurationException when saving ZonedDateTime to MongoDB with Spring Boot >= 2.0.1.RELEASE

I was able to reproduce my problem with a minimal modification of the official Spring Boot guide for Accessing Data with MongoDB, see https://github.com/thokrae/spring-data-mongo-zoneddatetime.
After adding a java.time.ZonedDateTime field to the Customer class, running the example code from the guide fails with a CodecConfigurationException:
Customer.java:
public String lastName;
public ZonedDateTime created;
public Customer() {
output:
...
Caused by: org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.time.ZonedDateTime.
at org.bson.codecs.configuration.CodecCache.getOrThrow(CodecCache.java:46) ~[bson-3.6.4.jar:na]
at org.bson.codecs.configuration.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:63) ~[bson-3.6.4.jar:na]
at org.bson.codecs.configuration.ChildCodecRegistry.get(ChildCodecRegistry.java:51) ~[bson-3.6.4.jar:na]
This can be solved by changing the Spring Boot version from 2.0.5.RELEASE to 2.0.1.RELEASE in the pom.xml:
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.0.1.RELEASE</version>
</parent>
Now the exception is gone and the Customer objects including the ZonedDateTime fields are written to MongoDB.
I filed a bug (DATAMONGO-2106) with the spring-data-mongodb project but would understand if changing this behaviour is not wanted or does not have a high priority.
What is the best workaround? When duckduckgoing for the exception message I find several approaches like registering a custom codec, a custom converter or using Jackson JSR 310. I would prefer to not add custom code to my project to handle a class from the java.time package.
Persisting date time types with time zones was never supported by Spring Data MongoDB, as stated by Oliver Drotbohm himself in DATAMONGO-2106.
These are the known workarounds:
Use a date time type without a time zone, e.g. java.time.Instant. (It is generally advisable to only use UTC in the backend, but I had to extend an existing code base which was following a different approach.)
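For the first workaround, a rough sketch of the field change on the Customer class from the question (the extra zone field is an illustration of my own, in case the original zone still needs to be displayed):
public Instant created;       // persisted natively as a BSON date, always in UTC
public String createdZone;    // optional: keep the original zone id, e.g. "Europe/Berlin"
// converting on the way in and out:
// created = zonedDateTime.toInstant();
// ZonedDateTime restored = created.atZone(ZoneId.of(createdZone));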
Write a custom converter and register it by extending AbstractMongoConfiguration. See the branch converter in my test repository for a running example.
@Component
@WritingConverter
public class ZonedDateTimeToDocumentConverter implements Converter<ZonedDateTime, Document> {
static final String DATE_TIME = "dateTime";
static final String ZONE = "zone";
@Override
public Document convert(@Nullable ZonedDateTime zonedDateTime) {
if (zonedDateTime == null) return null;
Document document = new Document();
document.put(DATE_TIME, Date.from(zonedDateTime.toInstant()));
document.put(ZONE, zonedDateTime.getZone().getId());
document.put("offset", zonedDateTime.getOffset().toString());
return document;
}
}
@Component
@ReadingConverter
public class DocumentToZonedDateTimeConverter implements Converter<Document, ZonedDateTime> {
@Override
public ZonedDateTime convert(@Nullable Document document) {
if (document == null) return null;
// the field-name constants are defined in the writing converter above
Date dateTime = document.getDate(ZonedDateTimeToDocumentConverter.DATE_TIME);
String zoneId = document.getString(ZonedDateTimeToDocumentConverter.ZONE);
ZoneId zone = ZoneId.of(zoneId);
return ZonedDateTime.ofInstant(dateTime.toInstant(), zone);
}
}
@Configuration
public class MongoConfiguration extends AbstractMongoConfiguration {
@Value("${spring.data.mongodb.database}")
private String database;
@Value("${spring.data.mongodb.host}")
private String host;
@Value("${spring.data.mongodb.port}")
private int port;
@Override
public MongoClient mongoClient() {
return new MongoClient(host, port);
}
@Override
protected String getDatabaseName() {
return database;
}
@Bean
public CustomConversions customConversions() {
return new MongoCustomConversions(asList(
new ZonedDateTimeToDocumentConverter(),
new DocumentToZonedDateTimeConverter()
));
}
}
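With these converters registered, saving the entity from the question should work again. A rough usage sketch (the Customer constructor and repository come from the linked guide, so treat the exact signatures as assumptions):
Customer customer = new Customer("Alice", "Smith");
customer.created = ZonedDateTime.now(ZoneId.of("Europe/Berlin"));
repository.save(customer);
// stored roughly as { created: { dateTime: ISODate("..."), zone: "Europe/Berlin", offset: "..." } }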
Write a custom codec. At least in theory. My codec test branch is unable to unmarshal the data when using Spring Boot 2.0.5 while working fine with Spring Boot 2.0.1.
public class ZonedDateTimeCodec implements Codec<ZonedDateTime> {
public static final String DATE_TIME = "dateTime";
public static final String ZONE = "zone";
@Override
public void encode(final BsonWriter writer, final ZonedDateTime value, final EncoderContext encoderContext) {
writer.writeStartDocument();
writer.writeDateTime(DATE_TIME, value.toInstant().getEpochSecond() * 1_000);
writer.writeString(ZONE, value.getZone().getId());
writer.writeEndDocument();
}
@Override
public ZonedDateTime decode(final BsonReader reader, final DecoderContext decoderContext) {
reader.readStartDocument();
long epochSecond = reader.readDateTime(DATE_TIME);
String zoneId = reader.readString(ZONE);
reader.readEndDocument();
return ZonedDateTime.ofInstant(Instant.ofEpochSecond(epochSecond / 1_000), ZoneId.of(zoneId));
}
@Override
public Class<ZonedDateTime> getEncoderClass() {
return ZonedDateTime.class;
}
}
@Configuration
public class MongoConfiguration extends AbstractMongoConfiguration {
@Value("${spring.data.mongodb.database}")
private String database;
@Value("${spring.data.mongodb.host}")
private String host;
@Value("${spring.data.mongodb.port}")
private int port;
@Override
public MongoClient mongoClient() {
return new MongoClient(host + ":" + port, createOptions());
}
private MongoClientOptions createOptions() {
CodecProvider pojoCodecProvider = PojoCodecProvider.builder()
.automatic(true)
.build();
CodecRegistry registry = CodecRegistries.fromRegistries(
createCustomCodecRegistry(),
MongoClient.getDefaultCodecRegistry(),
CodecRegistries.fromProviders(pojoCodecProvider)
);
return MongoClientOptions.builder()
.codecRegistry(registry)
.build();
}
private CodecRegistry createCustomCodecRegistry() {
return CodecRegistries.fromCodecs(
new ZonedDateTimeCodec()
);
}
@Override
protected String getDatabaseName() {
return database;
}
}

Spring Data REST Custom Resource URI works for String but not Long

I have a model:
public class MyModel {
@Id private Long id;
private Long externalId;
// Getters, setters
}
I'd like to use externalId as my resource identifier:
@Configuration
static class RepositoryEntityLookupConfig extends RepositoryRestConfigurerAdapter {
@Override
public void configureRepositoryRestConfiguration(RepositoryRestConfiguration configuration) {
configuration
.withEntityLookup()
.forRepository(MyRepository.class, MyModel::getExternalId, MyRepository::findByExternalId);
}
}
If externalId is a String, this works fine. But since it's a number (Long)
public interface MyRepository extends JpaRepository<MyModel, Long> {
Optional<MyModel> findByExternalId(@Param("externalId") Long externalId);
}
when invoking /myModels/1, I get:
java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Long
at org.springframework.data.rest.core.config.EntityLookupConfiguration$RepositoriesEntityLookup.lookupEntity(EntityLookupConfiguration.java:213) ~[spring-data-rest-core-2.6.4.RELEASE.jar:na]
at org.springframework.data.rest.core.support.UnwrappingRepositoryInvokerFactory$UnwrappingRepositoryInvoker.invokeFindOne(UnwrappingRepositoryInvokerFactory.java:130) ~[spring-data-rest-core-2.6.4.RELEASE.jar:na]
at org.springframework.data.rest.webmvc.RepositoryEntityController.getItemResource(RepositoryEntityController.java:524) ~[spring-data-rest-webmvc-2.6.4.RELEASE.jar:na]
at org.springframework.data.rest.webmvc.RepositoryEntityController.getItemResource(RepositoryEntityController.java:335) ~[spring-data-rest-webmvc-2.6.4.RELEASE.jar:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_111]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_111]
...
A separate custom EntityLookupSupport<MyModel> component class works.
Am I missing something to get it working for Long using method references in my RepositoryRestConfigurerAdapter?
Try to add this to your RepositoryEntityLookupConfig class:
@Override
public void configureConversionService(ConfigurableConversionService conversionService) {
conversionService.addConverter(String.class, Long.class, Long::parseLong);
super.configureConversionService(conversionService);
}
Do you really need to set the configuration yourself? You could try to use Spring Boot auto-configuration by adding the @RepositoryRestResource annotation:
@RepositoryRestResource(collectionResourceRel = "myModels", path = "myModels")
public interface MyRepository extends JpaRepository<MyModel, Long> {
Optional<MyModel> findByExternalId(@Param("externalId") Long externalId);
}
Also add @Entity to your model class:
@Entity
public class MyModel {
@Id
private Long id;
@Column(name = "EXTERNAL_ID")
// the @Column annotation is not required if the column name already matches
private Long externalId;
// Getters, setters
}
Apparently, the default BackendIdConverter (see DefaultIdConverter) does nothing with ID conversion, and Spring Data REST cannot use the repository ID type on its own. So you have to either convert the ID yourself or configure your own ID converter bean, for example:
@Bean
public BackendIdConverter myModelBackendIdConverter() {
return new BackendIdConverter() {
@Override
public Serializable fromRequestId(final String id, final Class<?> entityType) {
return Optional.ofNullable(id).map(Long::parseLong).orElse(null);
}
@Override
public boolean supports(final Class<?> delimiter) {
return MyModel.class.isAssignableFrom(delimiter);
}
@Override
public String toRequestId(final Serializable id, final Class<?> entityType) {
return Optional.ofNullable(id).map(Object::toString).orElse(null);
}
};
}
See also:
BackendIdHandlerMethodArgumentResolver
@BackendId
The signature of the method you are trying to call seems to be:
forRepository(Class<R> type, Converter<T,ID> identifierMapping,
EntityLookupRegistrar.LookupRegistrar.Lookup<R,ID> lookup)
I don't see how MyModel::getExternalId can be doing the necessary conversion.
I would try something like the following:
@Configuration
static class RepositoryEntityLookupConfig extends RepositoryRestConfigurerAdapter {
@Override
public void configureRepositoryRestConfiguration(RepositoryRestConfiguration configuration) {
configuration
.withEntityLookup()
.forRepository(MyRepository.class, Long::parseLong, MyRepository::findByExternalId);
}
}

Spring Data Rest Repository with abstract class / inheritance

I can't get Spring Data REST working with class inheritance.
I'd like to have a single JSON Endpoint which handles all my concrete classes.
Repo:
public interface AbstractFooRepo extends KeyValueRepository<AbstractFoo, String> {}
Abstract class:
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY, property = "type")
@JsonSubTypes({
@JsonSubTypes.Type(value = MyFoo.class, name = "MY_FOO")
})
public abstract class AbstractFoo {
@Id public String id;
public String type;
}
Concrete class:
public class MyFoo extends AbstractFoo { }
Now when calling POST /abstractFoos with {"type":"MY_FOO"}, it tells me: java.lang.IllegalArgumentException: PersistentEntity must not be null!.
This seems to happen, because Spring doesn't know about MyFoo.
Is there some way to tell Spring Data REST about MyFoo without creating a Repository and a REST Endpoint for it?
(I'm using Spring Boot 1.5.1 and Spring Data REST 2.6.0)
EDIT:
Application.java:
@SpringBootApplication
@EnableMapRepositories
public class Application {
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
}
I'm using Spring Boot 1.5.1 and Spring Data Release Ingalls.
KeyValueRepository doesn't work with inheritance. It uses the class name of every saved object to find the corresponding key-value store. E.g. save(new Foo()) will place the saved object within the Foo collection, while abstractFooRepo.findAll() will look within the AbstractFoo collection and won't find any Foo objects.
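A rough sketch of the mismatch just described (class and repository names taken from the question; the keyspace behaviour is my reading of Spring Data KeyValue's default class-name-based resolution):
AbstractFoo foo = new MyFoo();
abstractFooRepo.save(foo);     // stored under the keyspace derived from MyFoo
abstractFooRepo.findAll();     // queries the keyspace derived from AbstractFoo, so foo is not returned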
Here's the working code using MongoRepository:
Application.java
Default Spring Boot Application Starter.
@SpringBootApplication
public class Application {
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
}
AbstractFoo.java
I've tested include = JsonTypeInfo.As.EXISTING_PROPERTY and include = JsonTypeInfo.As.PROPERTY. Both seem to work fine!
It's even possible to register the Jackson SubTypes with a custom JacksonModule.
IMPORTANT: @RestResource(path = "abstractFoos") is highly recommended. Otherwise the _links.self links will point to /foos and /bars instead of /abstractFoos.
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.EXISTING_PROPERTY, property = "type")
@JsonSubTypes({
@JsonSubTypes.Type(value = Foo.class, name = "MY_FOO"),
@JsonSubTypes.Type(value = Bar.class, name = "MY_BAR")
})
@Document(collection = "foo_collection")
@RestResource(path = "abstractFoos")
public abstract class AbstractFoo {
@Id public String id;
public abstract String getType();
}
AbstractFooRepo.java
Nothing special here
public interface AbstractFooRepo extends MongoRepository<AbstractFoo, String> { }
Foo.java & Bar.java
@Persistent
public class Foo extends AbstractFoo {
@Override
public String getType() {
return "MY_FOO";
}
}
@Persistent
public class Bar extends AbstractFoo {
@Override
public String getType() {
return "MY_BAR";
}
}
FooRelProvider.java
Without this part, the output of the objects would be separated into two arrays under _embedded.foos and _embedded.bars.
The supports method ensures that for all classes which extend AbstractFoo, the objects will be placed within _embedded.abstractFoos.
@Component
@Order(Ordered.HIGHEST_PRECEDENCE)
public class FooRelProvider extends EvoInflectorRelProvider {
@Override
public String getCollectionResourceRelFor(final Class<?> type) {
return super.getCollectionResourceRelFor(AbstractFoo.class);
}
@Override
public String getItemResourceRelFor(final Class<?> type) {
return super.getItemResourceRelFor(AbstractFoo.class);
}
@Override
public boolean supports(final Class<?> delimiter) {
return AbstractFoo.class.isAssignableFrom(delimiter);
}
}
EDIT
Added @Persistent to Foo.java and Bar.java. (Adding it to AbstractFoo.java doesn't work.) Without this annotation I got NullPointerExceptions when trying to use JSR 303 validation annotations within inherited classes.
Example code to reproduce the error:
public class A {
@Id public String id;
@Valid public B b;
// @JsonTypeInfo + @JsonSubTypes
public static abstract class B {
@NotNull public String s;
}
// @Persistent <- Needed!
public static class B1 extends B { }
}
Please see the discussion in this resolved JIRA task for details of what is currently supported in spring-data-rest regarding @JsonTypeInfo, and this JIRA task for what is still missing.
To summarize: only @JsonTypeInfo with include = JsonTypeInfo.As.EXISTING_PROPERTY currently works for both serialization and deserialization.
Also, you need spring-data-rest 2.5.3 (Hopper SR3) or later to get this limited support.
Please see my sample application - https://github.com/mduesterhoeft/spring-data-rest-entity-inheritance/tree/fixed-hopper-sr3-snapshot
With include = JsonTypeInfo.As.EXISTING_PROPERTY the type information is extracted from a regular property. An example helps to make this way of adding type information clear.
The abstract class:
@Entity @Inheritance(strategy = SINGLE_TABLE)
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME,
include = JsonTypeInfo.As.EXISTING_PROPERTY,
property = "type")
@JsonSubTypes({
@Type(name = "DECIMAL", value = DecimalValue.class),
@Type(name = "STRING", value = StringValue.class)})
public abstract class Value {
@Id @GeneratedValue(strategy = IDENTITY)
@Getter
private Long id;
public abstract String getType();
}
And the subclass:
@Entity @DiscriminatorValue("D")
@Getter @Setter
public class DecimalValue extends Value {
@Column(name = "DECIMAL_VALUE")
private BigDecimal value;
public String getType() {
return "DECIMAL";
}
}
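For illustration, a minimal Jackson-only sketch (my own, outside Spring Data REST) of how EXISTING_PROPERTY resolves the subtype from a request body for these classes:
ObjectMapper mapper = new ObjectMapper();
Value value = mapper.readValue("{\"type\": \"DECIMAL\", \"value\": 42.5}", Value.class);
// value is a DecimalValue instance and value.getType() returns "DECIMAL"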
