Elasticsearch/Spark serialization does not appear to play well with nested types.
For example:
public class Foo implements Serializable {
private List<Bar> bars = new ArrayList<Bar>();
// getters and setters
public static class Bar implements Serializable {
}
}
List<Foo> foos = new ArrayList<Foo>();
foos.add( new Foo());
// Note: Foo object does not contain nested Bar instances
SparkConf sc = new SparkConf();
sc.setMaster("local");
sc.setAppName("spark.app.name");
sc.set("spark.serializer", KryoSerializer.class.getName());
JavaSparkContext jsc = new JavaSparkContext(sc);
JavaRDD<Foo> javaRDD = jsc.parallelize(ImmutableList.copyOf(foos));
JavaEsSpark.saveToEs(javaRDD, INDEX_NAME+"/"+TYPE_NAME);
The above code works, and documents of type Foo are indexed in Elasticsearch.
The issue arises when the bars list in a Foo object is not empty, for instance:
Foo foo = new Foo();
Foo.Bar bar = new Foo.Bar();
foo.getBars().add(bar);
Then, when indexing to Elasticsearch, the following exception is thrown:
org.elasticsearch.hadoop.serialization.EsHadoopSerializationException:
Cannot handle type [Bar] within type [class Foo], instance [Bar ...]]
within instance [Foo@1cf628a]
using writer [org.elasticsearch.spark.serialization.ScalaValueWriter@4e635d]
at org.elasticsearch.hadoop.serialization.builder.ContentBuilder.value(ContentBuilder.java:63)
at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.doWriteObject(TemplatedBulk.java:71)
at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.write(TemplatedBulk.java:58)
at org.elasticsearch.hadoop.rest.RestRepository.writeToIndex(RestRepository.java:148)
at org.elasticsearch.spark.rdd.EsRDDWriter.write(EsRDDWriter.scala:47)
at org.elasticsearch.spark.rdd.EsSpark$$anonfun$saveToEs$1.apply(EsSpark.scala:68)
at org.elasticsearch.spark.rdd.EsSpark$$anonfun$saveToEs$1.apply(EsSpark.scala:68)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:64)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
These are the relevant Maven dependencies:
<dependency>
<groupId>com.sksamuel.elastic4s</groupId>
<artifactId>elastic4s_2.11</artifactId>
<version>1.5.5</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch-hadoop-cascading</artifactId>
<version>2.1.0.Beta4</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.1.3</version>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch-spark_2.10</artifactId>
<version>2.1.0.Beta4</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-xml</artifactId>
<version>2.11.0-M4</version>
</dependency>
What is the correct way to index nested types with Elasticsearch and Spark?
Thanks
A solution could be to build a JSON string from the object you're trying to save, using for example Json4s.
In this case your JavaEsSpark RDD would be an RDD of strings.
Then you simply have to call
JavaEsSpark.saveJsonToEs...
instead of
JavaEsSpark.saveToEs...
This workaround helped me save countless hours trying to figure out a way to serialize nested maps.
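A minimal Java sketch of that approach (the answer mentions Json4s, which is Scala; Jackson's ObjectMapper is used here instead, and the helper class name is made up). The point is that saveJsonToEs bypasses the connector's value writers entirely, so nested types stop being a problem:
import java.util.ArrayList;
import java.util.List;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

public class FooJsonIndexer {
    // Serialize each Foo (including nested Bar objects) to a JSON string, then index the strings.
    public static void saveAsJson(JavaSparkContext jsc, List<Foo> foos, String resource)
            throws JsonProcessingException {
        ObjectMapper mapper = new ObjectMapper();
        List<String> docs = new ArrayList<String>();
        for (Foo foo : foos) {
            docs.add(mapper.writeValueAsString(foo)); // each element is a complete JSON document
        }
        JavaRDD<String> jsonRDD = jsc.parallelize(docs);
        // saveJsonToEs expects ready-made JSON documents, so no value writer is involved.
        JavaEsSpark.saveJsonToEs(jsonRDD, resource);
    }
}
It would be called as FooJsonIndexer.saveAsJson(jsc, foos, INDEX_NAME + "/" + TYPE_NAME) with the variables from the question.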
Looking at the ScalaValueWriter and JdkValueWriter code, we can see that only certain types are directly supported. Most likely the inner class is not a JavaBean or another supported type.
One day ScalaValueWriter and JdkValueWriter may support user-defined types (like Bar in our example) beyond just Java types like String, int, etc.
In the meantime, there is the following workaround. Instead of having Foo expose a List of Bar objects, internally transform the List to a Map<String, Object> and expose that.
Something like this:
private List<Map<String, Object>> bars = new ArrayList<Map<String, Object>>();
public List<Map<String, Object>> getBars() {
return bars;
}
public void setBars(List<Bar> bars) {
for (Bar bar : bars) {
this.bars.add(bar.getAsMap());
}
}
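For completeness, one possible shape of the getAsMap() helper referenced above; the Bar fields shown are placeholders, since the real ones are not in the question (java.util.Map/HashMap imports assumed):
public static class Bar implements Serializable {
    private String name;  // placeholder field
    private int count;    // placeholder field

    public Map<String, Object> getAsMap() {
        // Flatten this Bar into a Map, which the connector's value writers do support.
        Map<String, Object> map = new HashMap<String, Object>();
        map.put("name", name);
        map.put("count", count);
        return map;
    }
}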
I suggest working with com.google.gson.Gson:
String foosJson = new Gson().toJson(foos);
Then,
Map map = new HashMap<>();
...
...
JavaRDD<Map<String, ?>> javaRDD = sc.parallelize(ImmutableList.of(map));
JavaEsSpark.saveToEs(javaRDD, INDEX_NAME + "/" + TYPE_NAME);
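If it helps, one way to fill in the elided steps is to round-trip each Foo through Gson so nested Bar objects become plain nested Maps. This is only a sketch reusing foos, jsc, INDEX_NAME and TYPE_NAME from the question; the extra import needed is com.google.gson.reflect.TypeToken:
Gson gson = new Gson();
List<Map<String, Object>> docs = new ArrayList<Map<String, Object>>();
for (Foo foo : foos) {
    // toJson then fromJson turns the whole object graph into nested Maps/Lists.
    Map<String, Object> doc = gson.fromJson(
        gson.toJson(foo),
        new TypeToken<Map<String, Object>>() {}.getType());
    docs.add(doc);
}
JavaRDD<Map<String, Object>> mapRDD = jsc.parallelize(docs);
JavaEsSpark.saveToEs(mapRDD, INDEX_NAME + "/" + TYPE_NAME);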
Related
I'm developing a Spring Boot + Impala app using Apache Ignite as a cache store.
The problem is that IgniteRepository.save(key, entity) only runs an UPDATE query instead of an INSERT.
pom.xml
<ignite.version>2.14.0</ignite.version>
<dependency>
<groupId>org.apache.ignite</groupId>
<artifactId>ignite-spring-data-ext</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.ignite</groupId>
<artifactId>ignite-core</artifactId>
<version>${ignite.version}</version>
</dependency>
<dependency>
<groupId>org.apache.ignite</groupId>
<artifactId>ignite-indexing</artifactId>
<version>${ignite.version}</version>
</dependency>
<dependency>
<groupId>org.apache.ignite</groupId>
<artifactId>ignite-spring</artifactId>
<version>${ignite.version}</version>
</dependency>
Ignite Configuration:
IgniteConfiguration cfg = new IgniteConfiguration();
cfg.setIgniteInstanceName("springDataNode");
cfg.setPeerClassLoadingEnabled(true);
CacheConfiguration ccfg = new CacheConfiguration("XYZCache");
ccfg.setIndexedTypes(Long.class, XYZ.class);
ccfg.setCacheMode(CacheMode.PARTITIONED);
ccfg.setAtomicityMode(CacheAtomicityMode.ATOMIC);
ccfg.setReadThrough(true);
ccfg.setWriteThrough(true);
ccfg.setWriteBehindEnabled(true);
CacheJdbcPojoStoreFactory<Long, XYZ> factory = new CacheJdbcPojoStoreFactory<>();
factory.setDataSourceBean("ImpalaDataSource");
JdbcType jdbcType = new JdbcType();
jdbcType.setCacheName("XYZCache");
jdbcType.setKeyType(Long.class);
jdbcType.setValueType(XYZ.class);
jdbcType.setDatabaseTable("schema.table");
jdbcType.setKeyFields(new JdbcTypeField(Types.BIGINT, "id", Long.class, "id"));
jdbcType.setValueFields(
new JdbcTypeField(Types.VARCHAR, "comments", String.class, "comments"),
new JdbcTypeField(Types.BIGINT, "id", Long.class, "id")
);
factory.setTypes(jdbcType);
ccfg.setCacheStoreFactory(factory);
cfg.setCacheConfiguration(ccfg);
return IgniteSpring.start(cfg, applicationContext);
Ignite Repository:
@RepositoryConfig(cacheName = "XYZCache")
public interface XYZRepository extends IgniteRepository<XYZ, Long> {
@Query("select * FROM XYZ WHERE comments=?")
List<XYZ> test(String comments);
@Query("insert into XYZ (id, comments) values (?, ?)")
List<XYZ> customSave(Long id, String comments);
}
POJO:
@Data
public class XYZ implements Serializable {
private static final long serialVersionUID = -2677636393779376050L;
@QuerySqlField
private Long id;
@QuerySqlField
private String comments;
}
Calling code:
xyzRepository.save(id, xyz);
xyzRepository.customSave(id, comments);
Both methods throw an error because an UPDATE query is run (instead of an INSERT), which is not supported in Impala and is also not what I intend to do:
Caused by:
org.apache.ignite.internal.processors.cache.CachePartialUpdateCheckedException:
Failed to update keys (retry update if possible).: [1671548234688] at
org.apache.ignite.internal.processors.cache.GridCacheUtils.convertToCacheException(GridCacheUtils.java:1251)
~[ignite-core-2.14.0.jar:2.14.0]
Caused by: org.apache.ignite.IgniteCheckedException: Failed update
entry in database [table=schema.table, entry=Entry [key=1671548234688,
val=pkg.XYZ [idHash=1354181174, hash=991365654, id=1671548234688,
comments=test]]] at
org.apache.ignite.internal.processors.cache.store.GridCacheStoreManagerAdapter.put(GridCacheStoreManagerAdapter.java:593)
at org.apache.ignite.internal.processors.cache.GridCacheMapEntry$AtomicCacheUpdateClosure.update(GridCacheMapEntry.java:6154)
at org.apache.ignite.internal.processors.cache.GridCacheMapEntry$AtomicCacheUpdateClosure.call(GridCacheMapEntry.java:5918)
at org.apache.ignite.internal.processors.cache.GridCacheMapEntry$AtomicCacheUpdateClosure.call(GridCacheMapEntry.java:5603)
at org.apache.ignite.internal.processors.cache.persistence.tree.BPlusTree$Invoke.invokeClosure(BPlusTree.java:4254)
at org.apache.ignite.internal.processors.cache.persistence.tree.BPlusTree$Invoke.access$5700(BPlusTree.java:4148)
at org.apache.ignite.internal.processors.cache.persistence.tree.BPlusTree.invokeDown(BPlusTree.java:2226)
at org.apache.ignite.internal.processors.cache.persistence.tree.BPlusTree.invoke(BPlusTree.java:2116)
... 146 common frames omitted
Caused by: javax.cache.integration.CacheWriterException: Failed update entry in database [table=schema.table, entry=Entry
[key=1671548234688, val=pkg.XYZ [idHash=1354181174, hash=991365654,
id=1671548234688, comments=test]]]
at org.apache.ignite.cache.store.jdbc.CacheAbstractJdbcStore.writeUpsert(CacheAbstractJdbcStore.java:978)
at org.apache.ignite.cache.store.jdbc.CacheAbstractJdbcStore.write(CacheAbstractJdbcStore.java:1029)
at org.apache.ignite.internal.processors.cache.store.GridCacheStoreManagerAdapter.put(GridCacheStoreManagerAdapter.java:585)
... 153 common frames omitted
Caused by: com.cloudera.impala.support.exceptions.GeneralException:
[Cloudera]ImpalaJDBCDriver ERROR processing query/statement.
Error Code: 0, SQL state: TStatus(statusCode:ERROR_STATUS,
sqlState:HY000, errorMessage:AnalysisException: Impala does not
support modifying a non-Kudu table: schema.table ), Query: UPDATE
schema.table SET table.comments = 'test' WHERE (table.id =
1671548234688). ... 163 common frames omitted
What is the issue here? Why is UPDATE being forced by Apache Ignite? How can I change this behavior?
I also implemented the Persistable interface and overrode isNew() to return true, but it didn't work.
PS: Select queries are working fine (findAll, findById, etc.), including the custom test() method. So there is no data source configuration issue and I am able to connect to Impala.
This is likely because the dialect you are using does not have merge support set up.
See CacheAbstractJdbcStore.writeUpsert() to understand the flow; this is per the stack trace you posted:
org.apache.ignite.cache.store.jdbc.CacheAbstractJdbcStore.writeUpsert(CacheAbstractJdbcStore.java:978) at
org.apache.ignite.cache.store.jdbc.CacheAbstractJdbcStore.write(CacheAbstractJdbcStore.java:1029) at
org.apache.ignite.internal.processors.cache.store.GridCacheStoreManagerAdapter.put(GridCacheStoreManagerAdapter.java:585) ...
Alternatively, you can write your own data store factory.
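A rough sketch of that second option: a custom CacheStore that always issues an INSERT, since Impala cannot UPDATE a non-Kudu table. Table, column, and getter names are taken from the question; treat this as an illustration rather than a tested implementation:
import java.sql.Connection;
import java.sql.PreparedStatement;
import javax.cache.Cache;
import javax.cache.integration.CacheLoaderException;
import javax.cache.integration.CacheWriterException;
import javax.sql.DataSource;
import org.apache.ignite.cache.store.CacheStoreAdapter;

public class ImpalaInsertOnlyStore extends CacheStoreAdapter<Long, XYZ> {

    private final DataSource dataSource; // e.g. the "ImpalaDataSource" bean

    public ImpalaInsertOnlyStore(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Override
    public XYZ load(Long key) throws CacheLoaderException {
        // Load omitted for brevity; implement with a SELECT if read-through is needed.
        return null;
    }

    @Override
    public void write(Cache.Entry<? extends Long, ? extends XYZ> entry) throws CacheWriterException {
        // Always INSERT, because Impala cannot UPDATE a non-Kudu table.
        String sql = "INSERT INTO schema.table (id, comments) VALUES (?, ?)";
        try (Connection conn = dataSource.getConnection();
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setLong(1, entry.getValue().getId());
            ps.setString(2, entry.getValue().getComments());
            ps.executeUpdate();
        } catch (Exception e) {
            throw new CacheWriterException("Insert failed for key " + entry.getKey(), e);
        }
    }

    @Override
    public void delete(Object key) throws CacheWriterException {
        // Deletes are not supported on a non-Kudu Impala table; decide how to handle them.
    }
}
It could then be wired in through ccfg.setCacheStoreFactory(...) with a javax.cache.configuration.Factory that builds this store, in place of the CacheJdbcPojoStoreFactory.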
My project is a simple Spring Boot application which doesn't have a main/@SpringBootApplication class. It is used as a dependency library for other modules. I am trying to write unit tests for the classes in this project, as shown below, and am getting the error pasted below. Any quick help is much appreciated.
pom dependencies:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
<!-- exclude junit 4 -->
<exclusions>
<exclusion>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- junit 5 -->
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-engine</artifactId>
<scope>test</scope>
</dependency>
As this project doesn't have a main class, I use the configuration class below to get the Spring application context.
@Configuration
public class TestServiceConfig {
@Bean
public TestService productService() {
return Mockito.mock(TestService.class);
}
@Bean
public MongoDriverService productMongo() {
return Mockito.mock(MongoDriverService.class);
}
}
Below is my test class, which throws an exception. The actual Java class has a method called getPlanCode (which takes 6 arguments) and returns void. In this method the mongo object is used for connecting to the DB, so I used @InjectMocks on the service object.
public class ValidationServiceTest {
@Mock
MongoDriverService mongo;
@InjectMocks
TestService service;
@Test
@DisplayName("Test Get Plan Code positive")
public void getPlanCodeTest() {
doNothing().when(service).getPlanCode(anyString(), anyString(), any(Batch.class), any(BatchFile.class), any(Document.class), anyString());
service.getPlanCode(anyString(), anyString(), any(Batch.class), any(BatchFile.class), any(Document.class), anyString());
verify(service, times(1)).getPlanCode(anyString(), anyString(), any(Batch.class), any(BatchFile.class), any(Document.class), anyString());
}
}
Below is the exception
12:51:33.829 [main] DEBUG org.springframework.test.context.support.AbstractDirtiesContextTestExecutionListener - After test method: context [DefaultTestContext#45b4c3a9 testClass = DefaultMedicareBFTAccumsValidationServiceTest, testInstance = com.anthem.rxsmart.service.standalone.batchvalidation.DefaultMedicareBFTAccumsValidationServiceTest#14dda234, testMethod = getPlanCodeTest#DValidationServiceTest, testException = org.mockito.exceptions.misusing.NotAMockException:
Argument passed to when() is not a mock!
Example of correct stubbing:
service is not a mock, since you are using @InjectMocks (I assume you are using @RunWith(MockitoJUnitRunner.class) or @ExtendWith, but you are hiding that for whatever reason).
What @InjectMocks does is create a new instance of TestService and literally inject mocks into it (its mocked dependencies). So service is a real object, not a mock.
IMO this test makes no sense, as you are supposed to test your implementation of a single unit's contract, not to test mocks...
Your test case and assertions are pointless, as they amount to "call method A and check that I just called method A", while you should check and validate, for example, the return value of a call, or whether certain methods on the mocks were called, e.g. whether Mongo was queried with the proper arguments. I just hope it is a really bad example, not a real test scenario.
Also, the test setup is wrong: you show us that you want to use a @Configuration class with @Bean methods, but then you use @Mock in the test, which creates brand new mocks for you. In other words, that config is not used at all.
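To make that concrete, here is a minimal sketch of a JUnit 5 + Mockito setup that exercises the real service and verifies the mock instead. The constructor and argument details of TestService and MongoDriverService are assumptions based on the question and the poster's follow-up below, not verified code:
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.anyString;
import static org.mockito.Mockito.verify;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

// The extension is what actually creates the @Mock instances and performs @InjectMocks.
@ExtendWith(MockitoExtension.class)
class ValidationServiceTest {

    @Mock
    MongoDriverService mongo;

    @InjectMocks
    TestService service; // a real TestService with the mocked MongoDriverService injected

    @Test
    void getPlanCodeQueriesMongo() {
        // Call the real method with concrete (placeholder) arguments; matchers belong
        // only in stubbing/verification, not in the actual invocation.
        service.getPlanCode("planId", "source", null, null, null, "user");

        // Assert on behaviour: the collaborator was used, e.g. Mongo was queried.
        verify(mongo).aggregateIterable(anyString(), any());
    }
}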
Posting this answer just for developers who are at the same level of understanding.
@Test
@DisplayName("Test Get Plan Code positive")
public void getPlanCodeTest() {
service = new ValidationService(mongo);
Mockito.when(mongo.aggregateIterable("test", pipeline)).thenReturn(tierFilterDocs);
service.getPlanCode("", "", null, batchFile, null, "");
verify(mongo, times(1)).aggregateIterable("test", pipeline);
}
I have updated my test case so it serves the purpose now. There is no need for the configuration file any more, as I am mocking the object in the test class itself.
My Spring Boot application wants to use WebClient to make an HTTP request (XML request body) and receive an XML response. Hence I created another Spring Boot application with jackson-dataformat-xml and created an endpoint that receives and returns XML, as below.
spring-boot-version=2.2.5
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-xml</artifactId>
</dependency>
@PostMapping(value = "/api",
consumes = MediaType.APPLICATION_XML_VALUE,
produces = MediaType.APPLICATION_XML_VALUE)
public ResponseEntity<MyXmlResponse> trip(@RequestBody MyXmlRequest request) throws Exception {
MyXmlResponse response = new MyXmlResponse();
response.setStatus("SUCCESS");
response.setTripID(request.getTripID());
return ResponseEntity.ok().body(response);
}
It works perfectly, and obviously no JAXB annotations are required since I use jackson-dataformat-xml. Also, the request XML can be case-insensitive.
Now, in my first application, I want to consume this API via WebClient. I read that Spring WebFlux does not support jackson-dataformat-xml yet, hence I have to annotate my classes with JAXB annotations.
spring-boot-version=2.2.5
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
webClient.post()
.uri(URI.create("url-to-api-endpoint"))
.body(Mono.just(myXmlRequest), MyXmlRequest.class)
.exchange()
.doOnSuccess(response -> {
HttpStatus statusCode = response.statusCode();
log.info("Status code of external system request {}", statusCode);
})
.doOnError(onError -> {
log.error("Error on connecting to external system {}", onError.getMessage());
})
.flatMap(response -> response.bodyToMono(MyXmlResponse.class))
.subscribe(this::handleResponse);
The above code throws an exception as follows:
org.springframework.web.reactive.function.UnsupportedMediaTypeException: Content type 'application/xml' not supported for bodyType=com.example.MyXmlRequest
at org.springframework.web.reactive.function.BodyInserters.unsupportedError(BodyInserters.java:391)
I fixed this problem by annotating the request class with @XmlRootElement, as follows:
@Getter @Setter @NoArgsConstructor @ToString
@XmlRootElement()
public class MyXmlRequest {
private String attribute1;
}
On the next attempt I got another error, as follows:
reactor.core.Exceptions$ErrorCallbackNotImplemented: org.springframework.web.reactive.function.UnsupportedMediaTypeException: Content type 'application/xml' not supported for bodyType=com.example.MyXmlResponse
Caused by: org.springframework.web.reactive.function.UnsupportedMediaTypeException: Content type 'application/xml' not supported for bodyType=com.example.MyXmlResponse
This could be solved by annotating MyXmlResponse with @XmlRootElement, as follows:
@Getter @Setter @NoArgsConstructor @ToString
@XmlRootElement()
public class MyXmlResponse {
private String attr1;
private String attr2;
}
This time I get an UnmarshalException, as follows:
reactor.core.Exceptions$ErrorCallbackNotImplemented: org.springframework.core.codec.DecodingException: Could not unmarshal XML to class com.example.MyXmlResponse; nested exception is javax.xml.bind.UnmarshalException
- with linked exception:
[com.sun.istack.internal.SAXParseException2; lineNumber: 1; columnNumber: 15; unexpected element (uri:"", local:"MyXmlResponse"). Expected elements are <{}myXmlResponse>]
Caused by: org.springframework.core.codec.DecodingException: Could not unmarshal XML to class com.example.MyXmlResponse; nested exception is javax.xml.bind.UnmarshalException
- with linked exception:
I fixed it by passing additional attributes to the annotation, as follows:
@XmlRootElement(name = "MyXmlResponse", namespace = "")
public class MyXmlResponse {
In the future, my XML structures are going to be tremendously complex. I want to know whether I am doing this the right way.
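Not an answer to the mapping question, but one small thing that may help as the XML structures grow: being explicit about the media types on the WebClient call, so codec selection does not depend on inference. This is only a sketch reusing the names from the question (retrieve() is used here instead of exchange() purely to keep it short):
webClient.post()
    .uri(URI.create("url-to-api-endpoint"))
    .contentType(MediaType.APPLICATION_XML)   // declare the request body type explicitly
    .accept(MediaType.APPLICATION_XML)        // and the expected response type
    .body(Mono.just(myXmlRequest), MyXmlRequest.class)
    .retrieve()
    .bodyToMono(MyXmlResponse.class)
    .subscribe(this::handleResponse);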
I can't force Spring to use the Kotlin module for Jackson.
The problem is that Jackson can't parse JSON into a data class.
//Exception
2018-02-23 13:29:09.046 ERROR 24730 --- [nio-9300-exec-1] o.a.c.c.C.[.[.[.[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [/services] threw exception [Request processing failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [*.model.User]: Constructor threw exception; nested exception is java.lang.IllegalArgumentException: Parameter specified as non-null is null: method *.model.User.<init>, parameter name] with root cause
java.lang.IllegalArgumentException: Parameter specified as non-null is null: method *.User.<init>, parameter name
//JSON
{
"name": "name",
"surname": "surname",
"email": "email",
"password": "pswd"
}
//Model
@Entity
@Table
data class User(
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
var userId: Long?,
var name: String,
var surname: String,
var email: String,
var password: String,
(...)
): Resource() {
(...)
}
I tried to configure Jackson, but it didn't help much. What's strange is that inside the @Bean method where I configure the ObjectMapper, everything works fine.
Also, when I added default values for non-nullable fields, they weren't overwritten.
@Configuration
class JacksonConfig {
@Bean
fun mappingJackson2HttpMessageConverter(): MappingJackson2HttpMessageConverter {
val mapper = ObjectMapper().registerKotlinModule()
mapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false)
var user = mapper.readValue<User>("{\n" +
"\t\"name\": \"name\",\n" +
"\t\"surname\": \"surname\",\n" +
"\t\"email\": \"email\",\n" +
"\t\"password\": \"pswd\"\n" +
"}")
return MappingJackson2HttpMessageConverter(mapper)
}
}
What might be important is that model is in the different project than the application itself.
Kotlin version is 1.20
Jackson dependencies:
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>${jackson.version}</version> <!--2.9.4-->
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.module</groupId>
<artifactId>jackson-module-kotlin</artifactId>
<version>${jackson.version}</version>
</dependency>
And what I have tried up to this moment:
All of these answers
A similar problem, but probably a different case anyway
And some others, not worth mentioning.
I had a structure like:
abstract class AbstractController (...) {
fun save(@RequestMapping entity: T) {
(...)
}
And to enable it I used
class DeriveredController (...) {
@PostMapping
override fun save(entity: Derivered) {
(...)
}
}
The problem was the lack of @RequestMapping on the derived function.
I've got a strange problem and I hope you will help me solve it.
I am trying to pass a list of objects, where each object contains a LocalDate field (Joda-Time library), from a test service to my controller.
This is the method from my service. It returns a list of objects. Look at the dates printed out in the loop.
@RequestMapping("/getListaRecept")
@ResponseBody
public ListaRecept sendAnswer(){
ListaRecept listaReceptFiltered = prescriptionCreator.createListaRecept();
for(Recepta r : listaReceptFiltered.getListaRecept()){
System.out.println(r.toString());
}
return listaReceptFiltered;
}
The dates are correct:
Recepta{id=3, nazwa='nurofen', status=NOT_REALIZED, date=2017-07-27}
Recepta{id=1, nazwa='ibuprom', status=ANNULED, date=2014-12-25}
Recepta{id=2, nazwa='apap', status=REALIZED, date=2016-08-18}
And now I'm invoking this method from my Spring Boot app using RestTemplate, and the received list is then printed out:
private final RestTemplate restTemplate;
public SgrService2(RestTemplateBuilder restTemplateBuilder) {
this.restTemplate = restTemplateBuilder.build();
this.restTemplate.getMessageConverters()
.add(0, new StringHttpMessageConverter(Charset.forName("UTF-16")));
}
public ListaRecept getList() {
for(Recepta r : this.restTemplate.getForObject("http://localhost:8090/getListaRecept",
ListaRecept.class).getListaRecept()){
System.out.println(r.toString());
}
return this.restTemplate.getForObject("http://localhost:8090/getListaRecept",
ListaRecept.class);
}
As you can see, all dates were replaced with the current date :/
Recepta{id=3, nazwa='nurofen', status=NOT_REALIZED, date=2017-09-30}
Recepta{id=1, nazwa='ibuprom', status=ANNULED, date=2017-09-30}
Recepta{id=2, nazwa='apap', status=REALIZED, date=2017-09-30}
I have no idea what is going on...
Here are the pom dependencies:
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.9.9</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
<version>2.9.0</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.9.1</version>
</dependency>
Thank you in advance for your help
It seems to me that you are using the wrong Jackson module. Instead of jsr310 (which is for the Java 8 date types), try using the artifact jackson-datatype-joda and registering the JodaModule.
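If it helps, a minimal sketch of registering the module in a Spring Boot app (the class name is a placeholder, and the com.fasterxml.jackson.datatype:jackson-datatype-joda artifact is assumed to be on the classpath in place of, or alongside, jackson-datatype-jsr310). Spring Boot picks up any Jackson Module bean and registers it with the auto-configured ObjectMapper, so both the service that serializes the list and the app that reads it through RestTemplate would need this:
import com.fasterxml.jackson.datatype.joda.JodaModule;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JodaJacksonConfig {

    // Exposed as a bean so Spring Boot's Jackson auto-configuration registers it
    // with the ObjectMapper used by the HTTP message converters.
    @Bean
    public JodaModule jodaModule() {
        return new JodaModule();
    }
}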