Apache Camel with Spring DSL marshal to JSON issue - spring-boot

I have the following config:
<dataFormats>
    <json id="orderModel" library="Jackson" objectMapper="com.camel.CustomObjectMapper"
          unmarshalTypeName="com.orders.OrderModel"/>
    <json id="salesOrder" library="Jackson" objectMapper="com.camel.CustomObjectMapper"
          unmarshalTypeName="com.camel.model.salesorder.SalesOrder"/>
</dataFormats>
<route id="orderTranslateToSalesOrder">
    <from ref="orderPlaced"/>
    <unmarshal ref="orderModel"/>
    <process ref="customerProcessor"/>
    <process ref="salesOrderConverter"/>
    <marshal ref="salesOrder"/>
    <inOnly ref="orderCreate"/>
    <process ref="history"/>
</route>
I read from orderPlaced, which is a RabbitMQ queue, then unmarshal the body to an OrderModel and run two processors, where the second processor changes the body type from OrderModel to SalesOrder. But when marshaling the message I get this error:
com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "SalesOrder" (class com.orders.OrderModel), not marked as ignorable
at [Source: java.io.ByteArrayInputStream@4eac8add; line: 1, column: 16] (through reference chain: com.orders.OrderModel["SalesOrder"])
at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:62)
at com.fasterxml.jackson.databind.DeserializationContext.handleUnknownProperty(DeserializationContext.java:834)
at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:1093)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1489)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownVanilla(BeanDeserializerBase.java:1467)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:282)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:140)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3814)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2924)
at org.apache.camel.component.jackson.JacksonDataFormat.unmarshal(JacksonDataFormat.java:185)
at org.apache.camel.processor.UnmarshalProcessor.process(UnmarshalProcessor.java:69)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:76)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:548)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:138)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:101)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:97)
at org.apache.camel.component.rabbitmq.RabbitConsumer.doHandleDelivery(RabbitConsumer.java:99)
at org.apache.camel.component.rabbitmq.RabbitConsumer.handleDelivery(RabbitConsumer.java:74)
at com.rabbitmq.client.impl.ConsumerDispatcher$5.run(ConsumerDispatcher.java:149)
at com.rabbitmq.client.impl.ConsumerWorkService$WorkPoolRunnable.run(ConsumerWorkService.java:100)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Suppressed: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "SalesOrder" (class com.orders.OrderModel), not marked as ignorable
at [Source: java.io.ByteArrayInputStream@4a931757; line: 1, column: 16] (through reference chain: com.orders.OrderModel["SalesOrder"])
... 25 more
Although I am telling the marshal step to use the salesOrder data format, for some reason it ends up using the orderModel data format instead, and I cannot determine why.
This is what the SalesOrderConverter does at the end:
exchange.getOut().setHeaders(exchange.getIn().getHeaders());
exchange.getOut().setBody(salesOrder, SalesOrder.class);

Unrecognized field "SalesOrder" because you haven't mapped this field in your pojo. if you don't want to map then also you should include in your pojo and have annotation #JsonIgnore
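For illustration, a minimal sketch of that, along with the class-level @JsonIgnoreProperties alternative; the field shown here is hypothetical:

import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;

// Alternative: ignore any unknown incoming field on this type.
@JsonIgnoreProperties(ignoreUnknown = true)
public class OrderModel {

    // Map the field but exclude it from (de)serialization,
    // so Jackson no longer treats it as unrecognized.
    @JsonIgnore
    private String salesOrder;

    // other fields plus getters/setters omitted
}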

If your ExchangePattern is set to InOnly (which it looks like it is), the Out-part of your exchange will be discarded.
Change
exchange.getOut().setHeaders(exchange.getIn().getHeaders());
exchange.getOut().setBody(salesOrder, SalesOrder.class);
To this:
exchange.getIn().setBody(salesOrder, SalesOrder.class);
That way you won't have to copy your headers from the In-part either as they will already be there.
You can read more about how and when to use getIn/getOut here.
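As a rough sketch, the end of the converter would then look like this (the class and conversion method names are assumptions based on the question):

import org.apache.camel.Exchange;
import org.apache.camel.Processor;

public class SalesOrderConverter implements Processor {

    @Override
    public void process(Exchange exchange) throws Exception {
        OrderModel order = exchange.getIn().getBody(OrderModel.class);
        SalesOrder salesOrder = convert(order); // hypothetical mapping step

        // Set the body on the In-part: the headers stay where they are,
        // and nothing is lost when the exchange pattern is InOnly.
        exchange.getIn().setBody(salesOrder);
    }

    private SalesOrder convert(OrderModel order) {
        return new SalesOrder(); // placeholder for the real mapping
    }
}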

Related

Stream from MSK to RDS PostgreSQL with MSK Connector

I've been going in circles with this for a few days now. I'm sending data to Kafka using kafkajs. Each time I produce a message, I assign a UUID to the message.key value, and the message.value is set to an event like this and then stringified:
// the producer is written in typescript
const event = {
    eventtype: "event1",
    eventversion: "1.0.1",
    sourceurl: "https://some-url.com/source"
};
// stringified because the kafkajs producer only accepts `string` or `Buffer`
const stringifiedEvent = JSON.stringify(event);
I start my connect-standalone JDBC Sink Connector with the following configurations:
# connect-standalone.properties
name=local-jdbc-sink-connector
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
dialect.name=PostgreSqlDatabaseDialect
connection.url=jdbc:postgresql://postgres:5432/eventservice
connection.password=postgres
connection.user=postgres
auto.create=true
auto.evolve=true
topics=topic1
tasks.max=1
insert.mode=upsert
pk.mode=record_key
pk.fields=id
# worker.properties
offset.storage.file.filename=/tmp/connect.offsets
offset.flush.interval.ms=10000
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
value.converter.schema.registry.url=http://schema-registry:8081
key.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.schemas.enable=false
bootstrap.servers=localhost:9092
group.id=jdbc-sink-connector-worker
worker.id=jdbc-sink-worker-1
offset.storage.topic=connect-offsets
offset.storage.replication.factor=1
config.storage.topic=connect-configs
config.storage.replication.factor=1
status.storage.topic=connect-status
status.storage.replication.factor=1
When I start the connector with connect-standalone worker.properties connect-standalone.properties, it spins up and connects to PostgreSQL with no issue. However, when I produce an event, it fails with this error message:
WorkerSinkTask{id=local-jdbc-sink-connector-0} Task threw an uncaught and unrecoverable exception.
Task is being killed and will not recover until manually restarted. Error: Sink connector 'local-jdbc-sink-
connector' is configured with 'delete.enabled=false' and 'pk.mode=record_key' and therefore requires records
with a non-null Struct value and non-null Struct schema, but found record at (topic='topic1',partition=0,offset=0,timestamp=1676309784254) with a HashMap value and null value schema.
(org.apache.kafka.connect.runtime.WorkerSinkTask:609)
With this stack trace:
org.apache.kafka.connect.errors.ConnectException: Sink connector 'local-jdbc-sink-connector' is configured with
'delete.enabled=false' and 'pk.mode=record_key' and therefore requires records with a non-null Struct value and
non-null Struct schema, but found record at (topic='txningestion2',partition=0,offset=0,timestamp=1676309784254)
with a HashMap value and null value schema.
at io.confluent.connect.jdbc.sink.RecordValidator.lambda$requiresValue$2(RecordValidator.java:86)
at io.confluent.connect.jdbc.sink.RecordValidator.lambda$and$1(RecordValidator.java:41)
at io.confluent.connect.jdbc.sink.BufferedRecords.add(BufferedRecords.java:81)
at io.confluent.connect.jdbc.sink.JdbcDbWriter.write(JdbcDbWriter.java:74)
at io.confluent.connect.jdbc.sink.JdbcSinkTask.put(JdbcSinkTask.java:85)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:189)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:244)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
I've been going back and forth trying to get it to read my messages, but I'm not sure what is going wrong. One solution just leads to another error, and the solution for the new error leads back to the previous error. What is the correct configuration? How do I resolve this?
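For reference, here is what the error is asking for: with value.converter set to JsonConverter and value.converter.schemas.enable=false, the value arrives as a schemaless map, while the sink's pk.mode=record_key path requires a Struct with a schema. With schemas.enable=true, the JsonConverter convention is that each message value carries an inline schema/payload envelope. A sketch of the enveloped form of the event above (the envelope shape is the JsonConverter convention; the schema name is made up):

{
    "schema": {
        "type": "struct",
        "name": "event",
        "optional": false,
        "fields": [
            { "field": "eventtype", "type": "string" },
            { "field": "eventversion", "type": "string" },
            { "field": "sourceurl", "type": "string" }
        ]
    },
    "payload": {
        "eventtype": "event1",
        "eventversion": "1.0.1",
        "sourceurl": "https://some-url.com/source"
    }
}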

NiFi exception while reading Avro to JSON

In NiFi I'm first converting JSON to Avro, and then converting the Avro back to JSON. The Avro-to-JSON step fails with the exception below:
2019-07-23 12:48:04,043 ERROR [Timer-Driven Process Thread-9] o.a.n.processors.avro.ConvertAvroToJSON ConvertAvroToJSON[id=1db0939d-016c-1000-caa3-80d0993c3468] ConvertAvroToJSON[id=1db0939d-016c-1000-caa3-80d0993c3468] failed to process session due to org.apache.avro.AvroRuntimeException: Malformed data. Length is negative: -40; Processor Administratively Yielded for 1 sec: org.apache.avro.AvroRuntimeException: Malformed data. Length is negative: -40
org.apache.avro.AvroRuntimeException: Malformed data. Length is negative: -40
at org.apache.avro.io.BinaryDecoder.doReadBytes(BinaryDecoder.java:336)
at org.apache.avro.io.BinaryDecoder.readString(BinaryDecoder.java:263)
at org.apache.avro.io.ResolvingDecoder.readString(ResolvingDecoder.java:201)
at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:430)
at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:422)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:180)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:152)
at org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:240)
at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:230)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:174)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:152)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:144)
at org.apache.nifi.processors.avro.ConvertAvroToJSON$1.process(ConvertAvroToJSON.java:161)
at org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:2887)
at org.apache.nifi.processors.avro.ConvertAvroToJSON.onTrigger(ConvertAvroToJSON.java:148)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1162)
at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:209)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Below is the template file:
https://community.hortonworks.com/storage/attachments/109978-avro-to-json-and-json-to-avro.xml
The flow that I have drawn is: [screenshot of the flow not included]
The input json is:
{
    "name": "test",
    "company": {
        "exp": "1.5"
    }
}
The converted avro data is:
Objavro.schema {"type":"record","name":"MyClass","namespace":"com.acme.avro","fields":[{"name":"name","type":"string"},{"name":"company","type":{"type":"record","name":"company","fields":[{"name":"exp","type":"string"}]}}]}avro.codecdeflate�s™ÍRól&D³DV`•ÔÃ6ã(I-.a3Ô3�s™ÍRól&D³DV`•ÔÃ6
For an Avro data file the schema will be embedded, and since you want to write all the fields in CSV format we don't need to set up the registry. If you are writing only specific columns rather than all of them, or other formats such as JSON, then a schema registry is required. - Shu
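To illustrate the embedded-schema point (this sketch is mine, not part of the original comment; the file name is made up), an Avro data file can be read in plain Java without supplying a schema, because the reader takes it from the file header:

import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.avro.file.DataFileStream;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;

public class ReadAvroDataFile {
    public static void main(String[] args) throws Exception {
        try (InputStream in = new FileInputStream("input.avro");
             DataFileStream<GenericRecord> reader =
                     new DataFileStream<>(in, new GenericDatumReader<GenericRecord>())) {
            // The schema comes from the file header, not from a registry.
            System.out.println("Embedded schema: " + reader.getSchema());
            for (GenericRecord record : reader) {
                System.out.println(record); // prints a JSON-style rendering
            }
        }
    }
}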
See also:
NIFI Malformed Data.Length is negative
More about internals

Could not read JSON: Can not deserialize instance of org.springframework.xd.rest.domain.JobExecutionInfoResource[] out of START_OBJECT token

Getting this error when running the below code in Eclipse:
SpringXDTemplate xdTemplate = new SpringXDTemplate(new URI("http://my.ip:9393"));
List<JobExecutionInfoResource> listJobExecutions = xdTemplate.jobOperations().listJobExecutions();
I am using spring-xd version 1.3.1, and below is the stack trace of the error:
Exception in thread "main" org.springframework.http.converter.HttpMessageNotReadableException: Could not read JSON: Can not deserialize instance of org.springframework.xd.rest.domain.JobExecutionInfoResource[] out of START_OBJECT token
at [Source: sun.net.www.protocol.http.HttpURLConnection$HttpInputStream@4478de9f; line: 1, column: 1]; nested exception is com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of org.springframework.xd.rest.domain.JobExecutionInfoResource[] out of START_OBJECT token
at [Source: sun.net.www.protocol.http.HttpURLConnection$HttpInputStream@4478de9f; line: 1, column: 1]
at org.springframework.http.converter.json.MappingJackson2HttpMessageConverter.readJavaType(MappingJackson2HttpMessageConverter.java:228)
at org.springframework.http.converter.json.MappingJackson2HttpMessageConverter.read(MappingJackson2HttpMessageConverter.java:220)
at org.springframework.web.client.HttpMessageConverterExtractor.extractData(HttpMessageConverterExtractor.java:95)
at org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:553)
at org.springframework.web.client.RestTemplate.execute(RestTemplate.java:506)
at org.springframework.web.client.RestTemplate.getForObject(RestTemplate.java:243)
at org.springframework.xd.rest.client.impl.JobTemplate.listJobExecutions(JobTemplate.java:145)
at com.citiustech.hscale.dataIngestion.App.main(App.java:20)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of org.springframework.xd.rest.domain.JobExecutionInfoResource[] out of START_OBJECT token
at [Source: sun.net.www.protocol.http.HttpURLConnection$HttpInputStream@4478de9f; line: 1, column: 1]
at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:164)
at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:691)
at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:685)
at com.fasterxml.jackson.databind.deser.std.ObjectArrayDeserializer.handleNonArray(ObjectArrayDeserializer.java:222)
at com.fasterxml.jackson.databind.deser.std.ObjectArrayDeserializer.deserialize(ObjectArrayDeserializer.java:133)
at com.fasterxml.jackson.databind.deser.std.ObjectArrayDeserializer.deserialize(ObjectArrayDeserializer.java:18)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:2993)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2158)
at org.springframework.http.converter.json.MappingJackson2HttpMessageConverter.readJavaType(MappingJackson2HttpMessageConverter.java:225)
... 7 more
Actually it was a jar issue. In the code I was using the rest-client jar 1.0.4 while my Spring XD version was 1.3.1.
Moreover, the latest rest-client jar (1.3.1) is not available in Maven, but the same jar is available in the deployment folder of spring-xd:
https://mvnrepository.com/artifact/org.springframework.xd/spring-xd-rest-client
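If Maven still needs to resolve the artifact, one option is to install the jar shipped in the spring-xd distribution into the local repository; a sketch, with an illustrative file path and version string:

mvn install:install-file \
    -Dfile=/path/to/spring-xd/lib/spring-xd-rest-client-1.3.1.RELEASE.jar \
    -DgroupId=org.springframework.xd \
    -DartifactId=spring-xd-rest-client \
    -Dversion=1.3.1.RELEASE \
    -Dpackaging=jar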

Coherence exception in NamedCache.putAll after updating Coherence to 12.1.3

I use Coherence in my project with the following configuration.
Everything was OK with Coherence 12.1.2, but after updating Coherence to version 12.1.3 I have a problem. When I put, for example, 1000 items into the cache one by one using NamedCache.put() there is no error, but when I put 1000 items into the cache with a single NamedCache.putAll() call, Coherence raises an exception.
My project is in Java and is deployed on JBoss.
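A minimal sketch of the two call patterns being compared (cache name, keys, and values are made up):

import java.util.HashMap;
import java.util.Map;
import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;

public class PutAllRepro {
    public static void main(String[] args) {
        NamedCache cache = CacheFactory.getCache("my-map"); // hypothetical cache name

        // One by one: no error on either version.
        for (long i = 0; i < 1000; i++) {
            cache.put(i, "value-" + i);
        }

        // One bulk call: raises the exception below on 12.1.3.
        Map<Long, String> batch = new HashMap<Long, String>();
        for (long i = 0; i < 1000; i++) {
            batch.put(i, "value-" + i);
        }
        cache.putAll(batch);
    }
}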
cache configuration:
<distributed-scheme>
    <scheme-name>my-map</scheme-name>
    <service-name>MyMap</service-name>
    <serializer>
        <instance>
            <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
            <init-params>
                <init-param>
                    <param-type>String</param-type>
                    <param-value>pof-config.xml</param-value>
                </init-param>
            </init-params>
        </instance>
    </serializer>
    <thread-count>100</thread-count>
    <local-storage>true</local-storage>
    <backup-count>0</backup-count>
    <backing-map-scheme>
        <read-write-backing-map-scheme>
            <internal-cache-scheme>
                <ramjournal-scheme/>
            </internal-cache-scheme>
            <cachestore-scheme>
                <class-scheme>
                    <class-name>myPackage.loader</class-name>
                </class-scheme>
            </cachestore-scheme>
        </read-write-backing-map-scheme>
    </backing-map-scheme>
    <autostart>true</autostart>
</distributed-scheme>
Raised exception details:
Exception in thread "pool-7-thread-1" com.tangosol.net.RequestIncompleteException: Partial failure
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$BinaryMap.validatePartialResponse(PartitionedCache.CDB:56)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$BinaryMap.putAll(PartitionedCache.CDB:123)
at com.oracle.common.collections.ConverterCollections$ConverterMap.putAll(ConverterCollections.java:1553)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.putAll(PartitionedCache.CDB:5)
at com.tangosol.coherence.component.util.SafeNamedCache.putAll(SafeNamedCache.CDB:1)
at mypackage.CacheEnabledDataProvider.putOnCache(CacheEnabledDataProvider.java:66)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: Portable(com.tangosol.util.WrapperException): (Wrapped: Failed request execution for myMap service on Member(Id=1, Timestamp=2014-10-20 18:00:25.737, Address=192.168.70.101:8088, MachineId=5642, Location=site:,machine:PS,process:9876,member:Mem_1, Role=CoherenceServer) (Wrapped: Failed to store keys="9112791815, 9165497418, 9199193873, 9192139970, 9392020128, ") Assertion failed:) Assertion failed:
at com.tangosol.util.Base.ensureRuntimeException(Base.java:289)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.tagException(Grid.CDB:50)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onPartialCommit(PartitionedCache.CDB:7)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onPutAllRequest(PartitionedCache.CDB:85)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$PutAllRequest$PutJob.run(PartitionedCache.CDB:1)
at com.tangosol.coherence.component.util.DaemonPool$WrapperTask.run(DaemonPool.CDB:1)
at com.tangosol.coherence.component.util.DaemonPool$WrapperTask.run(DaemonPool.CDB:32)
at com.tangosol.coherence.component.util.DaemonPool$Daemon.onNotify(DaemonPool.CDB:65)
at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:51)
at java.lang.Thread.run(Unknown Source)
at <process boundary>
at com.tangosol.io.pof.ThrowablePofSerializer.deserialize(ThrowablePofSerializer.java:57)
at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3316)
at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2604)
at com.tangosol.io.pof.ConfigurablePofContext.deserialize(ConfigurablePofContext.java:376)
at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:1)
at com.tangosol.coherence.component.net.Message.readObject(Message.CDB:1)
at com.tangosol.coherence.component.net.message.responseMessage.DistributedPartialResponse.read(DistributedPartialResponse.CDB:12)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$PartialValueResponse.read(PartitionedCache.CDB:4)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.deserializeMessage(Grid.CDB:20)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:21)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:51)
at java.lang.Thread.run(Thread.java:745)
Caused by: Portable(com.tangosol.util.AssertionException): Assertion failed:
at com.tangosol.util.Base.azzertFailed(Base.java:209)
at com.tangosol.util.Base.azzert(Base.java:166)
at com.tangosol.net.cache.ReadWriteBackingMap$CacheStoreWrapper.storeAllInternal(ReadWriteBackingMap.java:5943)
at com.tangosol.net.cache.ReadWriteBackingMap$StoreWrapper.storeAll(ReadWriteBackingMap.java:5067)
at com.tangosol.net.cache.ReadWriteBackingMap.putAll(ReadWriteBackingMap.java:840)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$Storage.putAllPrimaryResource(PartitionedCache.CDB:7)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$Storage.postPutAll(PartitionedCache.CDB:27)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$Storage.putAll(PartitionedCache.CDB:14)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onPutAllRequest(PartitionedCache.CDB:62)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$PutAllRequest$PutJob.run(PartitionedCache.CDB:1)
at com.tangosol.coherence.component.util.DaemonPool$WrapperTask.run(DaemonPool.CDB:1)
at com.tangosol.coherence.component.util.DaemonPool$WrapperTask.run(DaemonPool.CDB:32)
at com.tangosol.coherence.component.util.DaemonPool$Daemon.onNotify(DaemonPool.CDB:65)
at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:51)
at java.lang.Thread.run(Unknown Source)
at <process boundary>
at com.tangosol.io.pof.ThrowablePofSerializer.deserialize(ThrowablePofSerializer.java:57)
at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3316)
at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2604)
at com.tangosol.io.pof.PortableException.readExternal(PortableException.java:150)
at com.tangosol.io.pof.ThrowablePofSerializer.deserialize(ThrowablePofSerializer.java:59)
at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3316)
at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2604)
at com.tangosol.io.pof.ConfigurablePofContext.deserialize(ConfigurablePofContext.java:376)
at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:1)
at com.tangosol.coherence.component.net.Message.readObject(Message.CDB:1)
at com.tangosol.coherence.component.net.message.responseMessage.DistributedPartialResponse.read(DistributedPartialResponse.CDB:12)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$PartialValueResponse.read(PartitionedCache.CDB:4)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.deserializeMessage(Grid.CDB:20)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:21)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:51)
at java.lang.Thread.run(Thread.java:745)
I found that if the loader class defined in the cachestore-scheme element implements CacheStore instead of CacheLoader and leaves store() and storeAll() empty, no exception is raised. According to the Oracle documents, implementing CacheStore is not necessary for a read-only cache store and implementing CacheLoader is sufficient. This might be a bug.
I am not sure whether implementing CacheStore with empty store() and storeAll() methods, when we only need a read-only cache store, carries a performance penalty or not.
I would only implement CacheLoader (not CacheStore) if you are not writing back to the database.
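For reference, a minimal read-only loader along the lines the answer suggests (class name and lookup logic are made up):

import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import com.tangosol.net.cache.CacheLoader;

// A read-only loader: it implements CacheLoader only, so there are no
// store()/storeAll() methods for the read-write backing map to call.
public class Loader implements CacheLoader {

    public Object load(Object key) {
        return lookupFromDatabase(key); // hypothetical database lookup
    }

    public Map loadAll(Collection keys) {
        Map result = new HashMap();
        for (Object key : keys) {
            result.put(key, load(key));
        }
        return result;
    }

    private Object lookupFromDatabase(Object key) {
        return null; // placeholder
    }
}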

org.springframework.oxm.UnmarshallingFailureException: JAXB unmarshalling exception unexpected element (uri:"", local:"html")

The other day I spent three hours trying to fix the following error and couldn't find the answer here, so now that I've found it I've decided to share it with you.
I'm working with Java 6, Spring MVC, and JAXB, and while I was creating a new web service I got this error:
Caused by: org.springframework.oxm.UnmarshallingFailureException: JAXB unmarshalling exception; nested exception is javax.xml.bind.UnmarshalException: unexpected element (uri:"", local:"html"). Expected elements are <undisclosed>
at org.springframework.oxm.jaxb.Jaxb2Marshaller.convertJaxbException(Jaxb2Marshaller.java:879)
at org.springframework.oxm.jaxb.Jaxb2Marshaller.unmarshal(Jaxb2Marshaller.java:755)
at org.springframework.oxm.jaxb.Jaxb2Marshaller.unmarshal(Jaxb2Marshaller.java:732)
... 29 more
Caused by: javax.xml.bind.UnmarshalException: unexpected element (uri:"", local:"html"). Expected elements are
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallingContext.handleEvent(UnmarshallingContext.java:556)
at com.sun.xml.bind.v2.runtime.unmarshaller.Loader.reportError(Loader.java:199)
at com.sun.xml.bind.v2.runtime.unmarshaller.Loader.reportError(Loader.java:194)
at com.sun.xml.bind.v2.runtime.unmarshaller.Loader.reportUnexpectedChildElement(Loader.java:71)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallingContext$DefaultRootLoader.childElement(UnmarshallingContext.java:962)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallingContext._startElement(UnmarshallingContext.java:399)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallingContext.startElement(UnmarshallingContext.java:380)
at com.sun.xml.bind.v2.runtime.unmarshaller.SAXConnector.startElement(SAXConnector.java:101)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.startElement(AbstractSAXParser.java:501)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.scanStartElement(XMLNSDocumentScannerImpl.java:400)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl$NSContentDriver.scanRootElementHook(XMLNSDocumentScannerImpl.java:626)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:3103)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:922)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:648)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:140)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:511)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:808)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:737)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:119)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1205)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallerImpl.unmarshal0(UnmarshallerImpl.java:195)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallerImpl.unmarshal(UnmarshallerImpl.java:168)
at javax.xml.bind.helpers.AbstractUnmarshallerImpl.unmarshal(AbstractUnmarshallerImpl.java:120)
at javax.xml.bind.helpers.AbstractUnmarshallerImpl.unmarshal(AbstractUnmarshallerImpl.java:103)
at org.springframework.oxm.jaxb.Jaxb2Marshaller.unmarshal(Jaxb2Marshaller.java:751)
... 31 more
The error was that I forgot to add @ResponseBody to the response type, in the signature of the method.
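A sketch of the shape of the fix (the controller, mapping, and return type are all made up):

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
public class MyWebService {

    // Without @ResponseBody the return value is resolved as a view name,
    // so the caller can end up receiving an HTML page instead of the
    // marshalled payload; that is why the client-side unmarshaller
    // complains about an unexpected "html" root element.
    @RequestMapping("/orders")
    @ResponseBody
    public OrderResponse getOrders() { // OrderResponse: hypothetical JAXB-annotated type
        return new OrderResponse();
    }
}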
