Getting MapperParsingException while indexing - Elasticsearch

I am new to Elasticsearch. I was trying to index an attachment but am getting an error. I executed the following steps.
I installed the mapper-attachments plugin and converted a text file to base64 with the openssl command:
openssl enc -base64 -in test3.txt -out t3.file
After that I created the mapping:
[root@n1 testcase]# curl -XPUT 'http://localhost:9200/indextryes/?pretty=1' -d '
{ "mappings" : { "doc" : { "properties" : {"file" : {"type" : "attachment"}}}}}'
{
  "ok" : true,
  "acknowledged" : true
}
When I try to index it, I get the following error message:
[root@n1 testcase]# curl -X POST "localhost:9200/indextryes/text" -d @t3.file
{"error":"MapperParsingException[failed to parse]; nested: ElasticSearchParseException[Failed to derive xcontent from (offset=0, length=64): [98, 71, 86, 48, 99, 121, 66, 107, 98, 121, 66, 104, 98, 109, 57, 48, 97, 71, 86, 121, 73, 72, 82, 108, 99, 51, 81, 103, 89, 87, 53, 107, 73, 72, 90, 108, 99, 109, 108, 109, 101, 83, 66, 108, 98, 71, 70, 122, 100, 71, 108, 106, 99, 50, 86, 104, 99, 109, 78, 111, 76, 103, 61, 61]]; ","status":400}
Thanks for the help.
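A note on the error for context: the byte values in the exception decode back to the base64 string itself (they are the ASCII codes of "bGV0cyBkbyBhbm90..."), which means the request body was the raw contents of t3.file rather than a JSON document, so Elasticsearch cannot derive any xcontent from it. A minimal sketch of a JSON-wrapped request against the attachment mapping above (note it targets the mapped type doc; the shell substitution for inlining the file is just one possible approach):
[root@n1 testcase]# curl -XPOST 'localhost:9200/indextryes/doc' -d '{"file" : "'"$(tr -d '\n' < t3.file)"'"}'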

Related

Spring Cloud Data Flow urlExpression of httpclient cannot be set properly

I parse image URLs from JSON produced by Twitter with Spring Cloud Data Flow, and I'd like to download the images with httpclient.
Here is the pipeline:
twitterstream --twitter.credentials.consumerKey=*** --twitter.credentials.consumerSecret=*** --twitter.credentials.accessToken=*** --twitter.credentials.accessTokenSecret=*** | splitter --expression=#jsonPath(payload,'$.entities.media[*].media_url') | httpclient --httpclient.httpMethod=GET --httpclient.urlExpression=payload | log
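For reference, a minimal sketch of the JSON fragment that the splitter's #jsonPath expression targets (structure inferred from the expression itself; the URL is taken from the log line below):
{"entities": {"media": [{"media_url": "http://pbs.twimg.com/media/EMBcr-XVAAARY7x.png"}]}}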
If I exclude the httpclient, the following appears in the log sink, so I suppose the URL extraction is successful and httpclient receives the URL:
2019-12-21 12:46:23.120 INFO 1 --- [container-0-C-1] log-sink : http://pbs.twimg.com/media/EMBcr-XVAAARY7x.png
I get the following exception from the httpclient (URI is not absolute):
2019-12-21 19:17:06.741 ERROR 1 --- [container-0-C-1] o.s.kafka.listener.LoggingErrorHandler : Error while processing: ConsumerRecord(topic = test.splitter, partition = 0, offset = 74, CreateTime = 1576954352460, serialized key size = -1, serialized value size = 87, headers = RecordHeaders(headers = [RecordHeader(key = sequenceNumber, value = [49]), RecordHeader(key = sequenceSize, value = [49]), RecordHeader(key = deliveryAttempt, value = [49]), RecordHeader(key = scst_nativeHeadersPresent, value = [116, 114, 117, 101]), RecordHeader(key = correlationId, value = [34, 56, 98, 57, 100, 56, 99, 57, 99, 45, 49, 100, 52, 50, 45, 55, 97, 101, 54, 45, 50, 54, 49, 55, 45, 50, 98, 102, 52, 101, 51, 100, 99, 98, 57, 55, 50, 34]), RecordHeader(key = contentType, value = [34, 97, 112, 112, 108, 105, 99, 97, 116, 105, 111, 110, 47, 106, 115, 111, 110, 34]), RecordHeader(key = spring_json_header_types, value = [123, 34, 115, 101, 113, 117, 101, 110, 99, 101, 78, 117, 109, 98, 101, 114, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 73, 110, 116, 101, 103, 101, 114, 34, 44, 34, 115, 99, 115, 116, 95, 110, 97, 116, 105, 118, 101, 72, 101, 97, 100, 101, 114, 115, 80, 114, 101, 115, 101, 110, 116, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 66, 111, 111, 108, 101, 97, 110, 34, 44, 34, 115, 101, 113, 117, 101, 110, 99, 101, 83, 105, 122, 101, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 73, 110, 116, 101, 103, 101, 114, 34, 44, 34, 100, 101, 108, 105, 118, 101, 114, 121, 65, 116, 116, 101, 109, 112, 116, 34, 58, 34, 106, 97, 118, 97, 46, 117, 116, 105, 108, 46, 99, 111, 110, 99, 117, 114, 114, 101, 110, 116, 46, 97, 116, 111, 109, 105, 99, 46, 65, 116, 111, 109, 105, 99, 73, 110, 116, 101, 103, 101, 114, 34, 44, 34, 99, 111, 114, 114, 101, 108, 97, 116, 105, 111, 110, 73, 100, 34, 58, 34, 106, 97, 118, 97, 46, 117, 116, 105, 108, 46, 85, 85, 73, 68, 34, 44, 34, 99, 111, 110, 116, 101, 110, 116, 84, 121, 112, 101, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 125])], isReadOnly = false), key = null, value = [B#73d60921)
org.springframework.integration.transformer.MessageTransformationException: Failed to transform Message; nested exception is org.springframework.messaging.MessageHandlingException: nested exception is java.lang.IllegalArgumentException: URI is not absolute, failedMessage=GenericMessage [payload=byte[87], headers={sequenceNumber=1, sequenceSize=1, deliveryAttempt=3, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=null, kafka_receivedTopic=test.splitter, kafka_offset=74, scst_nativeHeadersPresent=true, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer#103e165f, correlationId=8b9d8c9c-1d42-7ae6-2617-2bf4e3dcb972, kafka_receivedPartitionId=0, contentType=application/json, kafka_receivedTimestamp=1576954352460}]
at org.springframework.integration.transformer.MessageTransformingHandler.handleRequestMessage(MessageTransformingHandler.java:115) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:123) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:169) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:115) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:132) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:105) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:73) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:453) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:401) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:187) ~[spring-messaging-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:166) ~[spring-messaging-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:47) ~[spring-messaging-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:109) ~[spring-messaging-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.integration.endpoint.MessageProducerSupport.sendMessage(MessageProducerSupport.java:205) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter.sendMessageIfAny(KafkaMessageDrivenChannelAdapter.java:369) ~[spring-integration-kafka-3.1.0.RELEASE.jar!/:3.1.0.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter.access$400(KafkaMessageDrivenChannelAdapter.java:74) ~[spring-integration-kafka-3.1.0.RELEASE.jar!/:3.1.0.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:431) ~[spring-integration-kafka-3.1.0.RELEASE.jar!/:3.1.0.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:402) ~[spring-integration-kafka-3.1.0.RELEASE.jar!/:3.1.0.RELEASE]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.lambda$onMessage$0(RetryingMessageListenerAdapter.java:120) ~[spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:287) ~[spring-retry-1.2.4.RELEASE.jar!/:na]
at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:211) ~[spring-retry-1.2.4.RELEASE.jar!/:na]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:114) ~[spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:40) ~[spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:1278) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:1261) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:1222) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:1203) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:1123) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:938) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:751) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:700) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_192]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_192]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_192]
Caused by: org.springframework.messaging.MessageHandlingException: nested exception is java.lang.IllegalArgumentException: URI is not absolute
at org.springframework.integration.handler.LambdaMessageProcessor.processMessage(LambdaMessageProcessor.java:111) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.transformer.AbstractMessageProcessingTransformer.transform(AbstractMessageProcessingTransformer.java:113) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.transformer.MessageTransformingHandler.handleRequestMessage(MessageTransformingHandler.java:109) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
... 33 common frames omitted
Caused by: java.lang.IllegalArgumentException: URI is not absolute
at java.net.URI.toURL(URI.java:1088) ~[na:1.8.0_192]
at org.springframework.http.client.SimpleClientHttpRequestFactory.createRequest(SimpleClientHttpRequestFactory.java:145) ~[spring-web-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.http.client.support.HttpAccessor.createRequest(HttpAccessor.java:87) ~[spring-web-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:731) ~[spring-web-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.web.client.RestTemplate.exchange(RestTemplate.java:637) ~[spring-web-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.cloud.stream.app.httpclient.processor.HttpclientProcessorFunctionConfiguration.lambda$httpRequest$0(HttpclientProcessorFunctionConfiguration.java:102) ~[spring-cloud-starter-stream-processor-httpclient-2.1.2.RELEASE.jar!/:2.1.2.RELEASE]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_192]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_192]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_192]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_192]
at org.springframework.integration.handler.LambdaMessageProcessor.processMessage(LambdaMessageProcessor.java:102) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
... 35 common frames omitted
I think the problem is with the --httpclient.urlExpression=payload parameter, where I try to reference the URL via the payload keyword. What's the problem exactly?
UPDATE:
The problem is that the payload arrives as a byte array, i.e. a series of ASCII character codes, rather than a String. I solved the issue by setting --httpclient.urlExpression='new String(payload)', which converts the bytes to a string, but I don't think that's the best solution, so I'm still waiting for a better one.
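For reference, a sketch of the corrected pipeline with that expression (credentials elided as in the original definition):
twitterstream --twitter.credentials.consumerKey=*** --twitter.credentials.consumerSecret=*** --twitter.credentials.accessToken=*** --twitter.credentials.accessTokenSecret=*** | splitter --expression=#jsonPath(payload,'$.entities.media[*].media_url') | httpclient --httpclient.httpMethod=GET --httpclient.urlExpression='new String(payload)' | log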

springboot-kafka java 8 time serialization

Currently working with spring-boot 2.0.4 with spring-kafka 2.1.8.RELEASE.
I wanted to simplify the interchange a bit by sending objects to the Kafka template and using JSON as the format. However, some of the messages that need to be deserialized contain java.time.LocalDateTime. So my setup is:
Config (application.yml):
spring:
  jackson:
    serialization:
      write_dates_as_timestamps: false
  kafka:
    consumer:
      group-id: foo
      enable-auto-commit: true
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        spring.json.trusted.packages: my.package
    producer:
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      properties:
        spring.json.trusted.packages: my.package
      retries: 3
      acks: all
As for the Jackson dependencies, which are supposed to be needed for this to work, my dependency tree is:
[INFO] | | +- com.fasterxml.jackson.core:jackson-databind:jar:2.9.6:compile
[INFO] | | | +- com.fasterxml.jackson.core:jackson-annotations:jar:2.9.0:compile
[INFO] | | | \- com.fasterxml.jackson.core:jackson-core:jar:2.9.6:compile
[INFO] | | \- com.fasterxml.jackson.datatype:jackson-datatype-jsr310:jar:2.9.6:compile
[INFO] | | +- com.fasterxml.jackson.datatype:jackson-datatype-jdk8:jar:2.9.6:compile
[INFO] | | \- com.fasterxml.jackson.module:jackson-module-parameter-names:jar:2.9.6:compile
This, however, produces the following error:
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition Foo-0 at offset 4. If needed, please seek past the record to continue consumption.
Caused by: org.apache.kafka.common.errors.SerializationException: Can't deserialize data [[123, 34, 105, 100, 34, 58, 34, 97, 50, 99, 50, 56, 99, 99, 101, 97, 49, 98, 98, 52, 51, 97, 97, 56, 53, 50, 49, 53, 99, 101, 49, 54, 57, 48, 52, 51, 51, 98, 51, 45, 50, 34, 44, 34, 97, 117, 116, 104, 111, 114, 34, 58, 34, 97, 110, 116, 111, 110, 105, 111, 34, 44, 34, 99, 114, 101, 97, 116, 101, 100, 34, 58, 123, 34, 104, 111, 117, 114, 34, 58, 49, 56, 44, 34, 109, 105, 110, 117, 116, 101, 34, 58, 52, 48, 44, 34, 115, 101, 99, 111, 110, 100, 34, 58, 53, 49, 44, 34, 110, 97, 110, 111, 34, 58, 51, 50, 53, 48, 48, 48, 48, 48, 48, 44, 34, 100, 97, 121, 79, 102, 89, 101, 97, 114, 34, 58, 50, 52, 48, 44, 34, 100, 97, 121, 79, 102, 87, 101, 101, 107, 34, 58, 34, 84, 85, 69, 83, 68, 65, 89, 34, 44, 34, 109, 111, 110, 116, 104, 34, 58, 34, 65, 85, 71, 85, 83, 84, 34, 44, 34, 100, 97, 121, 79, 102, 77, 111, 110, 116, 104, 34, 58, 50, 56, 44, 34, 121, 101, 97, 114, 34, 58, 50, 48, 49, 56, 44, 34, 109, 111, 110, 116, 104, 86, 97, 108, 117, 101, 34, 58, 56, 44, 34, 99, 104, 114, 111, 110, 111, 108, 111, 103, 121, 34, 58, 123, 34, 99, 97, 108, 101, 110, 100, 97, 114, 84, 121, 112, 101, 34, 58, 34, 105, 115, 111, 56, 54, 48, 49, 34, 44, 34, 105, 100, 34, 58, 34, 73, 83, 79, 34, 125, 125, 44, 34, 97, 103, 103, 114, 101, 103, 97, 116, 101, 73, 100, 34, 58, 34, 97, 50, 99, 50, 56, 99, 99, 101, 97, 49, 98, 98, 52, 51, 97, 97, 56, 53, 50, 49, 53, 99, 101, 49, 54, 57, 48, 52, 51, 51, 98, 51, 34, 44, 34, 118, 101, 114, 115, 105, 111, 110, 34, 58, 48, 44, 34, 112, 114, 105, 122, 101, 73, 110, 102, 111, 34, 58, 123, 34, 110, 117, 109, 98, 101, 114, 79, 102, 87, 105, 110, 110, 101, 114, 115, 34, 58, 49, 44, 34, 112, 114, 105, 122, 101, 80, 111, 111, 108, 34, 58, 49, 48, 44, 34, 112, 114, 105, 122, 101, 84, 97, 98, 108, 101, 34, 58, 91, 49, 48, 93, 125, 125]] from topic [Foo]
Caused by: com.fasterxml.jackson.databind.exc.MismatchedInputException: Expected array or string.
at [Source: (byte[])"{"id":"a2c28ccea1bb43aa85215ce1690433b3-2","author":"foo","created":{"hour":18,"minute":40,"second":51,"nano":325000000,"dayOfYear":240,"dayOfWeek":"TUESDAY","month":"AUGUST","dayOfMonth":28,"year":2018,"monthValue":8,"chronology":{"calendarType":"iso8601","id":"ISO"}},"aggregateId":"a2c28ccea1bb43aa85215ce1690433b3","version":0,"prizeInfo":{"numberOfWinners":1,"prizePool":10,"prizeTable":[10]}}"; line: 1, column: 73] (through reference chain: my.package.Foo["created"])
at com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1342) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.DeserializationContext.handleUnexpectedToken(DeserializationContext.java:1138) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.datatype.jsr310.deser.JSR310DeserializerBase._handleUnexpectedToken(JSR310DeserializerBase.java:99) ~[jackson-datatype-jsr310-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer.deserialize(LocalDateTimeDeserializer.java:141) ~[jackson-datatype-jsr310-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer.deserialize(LocalDateTimeDeserializer.java:39) ~[jackson-datatype-jsr310-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.impl.FieldProperty.deserializeAndSet(FieldProperty.java:136) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:369) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:159) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:1611) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1234) ~[jackson-databind-2.9.6.jar:2.9.6]
at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:228) ~[spring-kafka-2.1.8.RELEASE.jar:2.1.8.RELEASE]
at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:923) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.access$2600(Fetcher.java:93) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.fetchRecords(Fetcher.java:1100) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.access$1200(Fetcher.java:949) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:570) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:531) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1154) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1111) ~[kafka-clients-1.0.2.jar:na]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:699) ~[spring-kafka-2.1.8.RELEASE.jar:2.1.8.RELEASE]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_131]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_131]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
Due to this I have tried the following, but none has worked so far:
1. A custom ObjectMapper declared as a bean:
@Bean
public ObjectMapper objectMapper() {
    ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.registerModule(new JavaTimeModule());
    objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
    return objectMapper;
}
2. A serializer annotation on the LocalDateTime fields.
To be sure that I have the correct object mapper settings and the necessary dependencies, I created a REST controller that simulates the response as JSON, returning an object with date-time fields from a REST endpoint; this returns correctly. Sample:
[
  {
    "playerId": "foo",
    "points": 10,
    "entryDateTime": "2018-08-19T09:30:20.051"
  },
  {
    "playerId": "bar",
    "points": 3,
    "entryDateTime": "2018-08-27T09:30:20.051"
  }
]
Using the Json(De)Serializer constructor with the ObjectMapper parameter worked for me. I was having trouble (de)serializing a POJO that had a java.time.Instant field, so after hours of troubleshooting this same org.apache.kafka.common.errors.SerializationException***, I finally realized (with the help of answers such as those here) that the issue is not Spring, but Kafka's own serialization. Given the ObjectMapper bean I had, I resolved it by autowiring the mapper into the JsonSerializer and JsonDeserializer parameters of my Kafka producer and consumer set-ups.
@Configuration
public class JacksonConfig {
    @Bean
    @Primary
    public ObjectMapper objectMapper(Jackson2ObjectMapperBuilder builder) {
        ObjectMapper objectMapper = builder.build();
        objectMapper.registerModule(new JavaTimeModule());
        objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
        return objectMapper;
    }
}
@Configuration
public class KafkaProducerConfig {
    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;
    @Autowired
    private ObjectMapper objectMapper;
    @Bean
    public KafkaTemplate<String, Order> orderKafkaTemplate() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        ProducerFactory<String, Order> producerFactory =
                new DefaultKafkaProducerFactory<>(props, new StringSerializer(), new JsonSerializer<Order>(objectMapper));
        return new KafkaTemplate<>(producerFactory);
    }
}
@Configuration
public class KafkaConsumerConfig {
    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;
    @Value(value = "${kafka.consumer.groupId}")
    private String groupId;
    @Autowired
    private ObjectMapper objectMapper;
    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Order> orderKafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Order> factory = new ConcurrentKafkaListenerContainerFactory<>();
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        ConsumerFactory<String, Order> consumerFactory =
                new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(Order.class, objectMapper));
        factory.setConsumerFactory(consumerFactory);
        return factory;
    }
}
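For illustration, a minimal sketch of a listener consuming through the factory above (the topic name "orders" and the method are hypothetical, not part of the original answer):
@KafkaListener(topics = "orders", containerFactory = "orderKafkaListenerContainerFactory")
public void listen(Order order) {
    // createdOn arrives as a proper Instant because the custom ObjectMapper
    // (with its JavaTimeModule) was wired into the JsonDeserializer above
    System.out.println(order.getCreatedOn());
}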
(Pojo shown for further clarity)
public class Order {
    private long accountId;
    private long assetId;
    private long quantity;
    private long price;
    private Instant createdOn = Instant.now();
    // no-args constructor, constructor with params for all fields except createdOn, and getters/setters for all fields omitted
}
***Often the cause was: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of 'java.time.Instant' (no Creators, like default construct, exist): cannot deserialize from object value (no delegate- or property-based Creator) at [Source: (byte[])"{"accountId":1,"assetId":2,"quantity":100,"price":1000,"createdOn":{"epochSecond":1558570217,"nano":728000000}}"...
When you set the serializers/deserializers using properties, Kafka instantiates them, not Spring. Kafka knows nothing about Spring or the customized ObjectMapper.
You need to override Boot's default producer/consumer factories and use the alternate constructors (or setters) to add the serializers/deserializers.
See the documentation.
Important
Only simple configuration can be performed with properties; for more advanced configuration (such as using a custom ObjectMapper in the serializer/deserializer), you should use the producer/consumer factory constructors that accept a pre-built serializer and deserializer. For example, with Spring Boot, to override the default factories:
@Bean
public ConsumerFactory<Foo, Bar> kafkaConsumerFactory(KafkaProperties properties,
        JsonDeserializer customDeserializer) {
    return new DefaultKafkaConsumerFactory<>(properties.buildConsumerProperties(),
            customDeserializer, customDeserializer);
}
@Bean
public ProducerFactory<Foo, Bar> kafkaProducerFactory(KafkaProperties properties,
        JsonSerializer customSerializer) {
    return new DefaultKafkaProducerFactory<>(properties.buildProducerProperties(),
            customSerializer, customSerializer);
}
Setters are also provided, as an alternative to using these constructors.
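A minimal sketch of the setter alternative, assuming a spring-kafka version that provides these setters and the same bean names as the constructor example above:
@Bean
public ConsumerFactory<Foo, Bar> kafkaConsumerFactory(KafkaProperties properties,
        JsonDeserializer customDeserializer) {
    DefaultKafkaConsumerFactory<Foo, Bar> factory =
            new DefaultKafkaConsumerFactory<>(properties.buildConsumerProperties());
    factory.setKeyDeserializer(customDeserializer);
    factory.setValueDeserializer(customDeserializer);
    return factory;
}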
You can extend Spring Kafka's JsonSerializer (and, for the consumer side, JsonDeserializer in the same way):
public class JsonSerializerWithJTM<T> extends JsonSerializer<T> {
    public JsonSerializerWithJTM() {
        super();
        objectMapper.registerModule(new JavaTimeModule());
        // whatever you want to configure here
    }
}
Use such a class in Kafka's configuration instead of the original one; for example, the deserializer counterpart of the class above on the consumer side:
spring:
  kafka:
    consumer:
      value-deserializer: com.foo.JsonDeserializerWithJTM

Message conversion in SCDF on Kafka and NonTrustedHeaders

I am having a hard time figuring out how to get a simple SCDF pipeline functional.
I am using a local setup:
{
  "versionInfo": {
    "implementation": {
      "name": "spring-cloud-dataflow-server-local",
      "version": "1.6.0.BUILD-SNAPSHOT"
    },
    "core": {
      "name": "Spring Cloud Data Flow Core",
      "version": "1.6.0.BUILD-SNAPSHOT"
    },
    "dashboard": {
      "name": "Spring Cloud Dataflow UI",
      "version": "1.6.0.M1"
    },
    "shell": {
      "name": "Spring Cloud Data Flow Shell",
      "version": "1.6.0.BUILD-SNAPSHOT",
      "url": "https://repo.spring.io/libs-snapshot/org/springframework/cloud/spring-cloud-dataflow-shell/1.6.0.BUILD-SNAPSHOT/spring-cloud-dataflow-shell-1.6.0.BUILD-SNAPSHOT.jar"
    }
  },
  "featureInfo": {
    "analyticsEnabled": true,
    "streamsEnabled": true,
    "tasksEnabled": true,
    "skipperEnabled": false
  },
  "securityInfo": {
    "isAuthenticationEnabled": false,
    "isAuthorizationEnabled": false,
    "isFormLogin": false,
    "isAuthenticated": false,
    "username": null,
    "roles": []
  },
  "runtimeEnvironment": {
    "appDeployer": {
      "platformSpecificInfo": {},
      "deployerImplementationVersion": "1.3.7.RELEASE",
      "deployerName": "LocalAppDeployer",
      "deployerSpiVersion": "1.3.2.RELEASE",
      "javaVersion": "1.8.0_45",
      "platformApiVersion": "Mac OS X 10.13.4",
      "platformClientVersion": "10.13.4",
      "platformHostVersion": "10.13.4",
      "platformType": "Local",
      "springBootVersion": "1.5.14.RELEASE",
      "springVersion": "4.3.18.RELEASE"
    },
    "taskLauncher": {
      "platformSpecificInfo": {},
      "deployerImplementationVersion": "1.3.7.RELEASE",
      "deployerName": "LocalTaskLauncher",
      "deployerSpiVersion": "1.3.2.RELEASE",
      "javaVersion": "1.8.0_45",
      "platformApiVersion": "Mac OS X 10.13.4",
      "platformClientVersion": "10.13.4",
      "platformHostVersion": "10.13.4",
      "platformType": "Local",
      "springBootVersion": "1.5.14.RELEASE",
      "springVersion": "4.3.18.RELEASE"
    }
  }
}
The pipeline is pretty simple:
http --port=9191 | transform --expression=payload.toUpperCase() | log
When I trigger the http endpoint with cURL like this:
curl -v -H"Referer: http://localhost:8080" -H"Content-Type: text/plain" -XPOST localhost:9191/ -d 'test'
I see the following error message in the logfile of the transform processor:
2018-07-11 09:56:59.758 ERROR 66396 --- [container-0-C-1] o.s.kafka.listener.LoggingErrorHandler : Error while processing: ConsumerRecord(topic = edded.http, partition = 0, offset = 0, CreateTime = 1531295816669, serialized key size = -1, serialized value size = 17, headers = RecordHeaders(headers = [RecordHeader(key = referer, value = [34, 104, 116, 116, 112, 58, 47, 47, 108, 111, 99, 97, 108, 104, 111, 115, 116, 58, 56, 48, 56, 48, 34]), RecordHeader(key = content-length, value = [49, 55]), RecordHeader(key = http_requestMethod, value = [34, 80, 79, 83, 84, 34]), RecordHeader(key = host, value = [34, 108, 111, 99, 97, 108, 104, 111, 115, 116, 58, 57, 49, 57, 49, 34]), RecordHeader(key = http_requestUrl, value = [34, 104, 116, 116, 112, 58, 47, 47, 108, 111, 99, 97, 108, 104, 111, 115, 116, 58, 57, 49, 57, 49, 47, 34]), RecordHeader(key = contentType, value = [123, 34, 116, 121, 112, 101, 34, 58, 34, 116, 101, 120, 116, 34, 44, 34, 115, 117, 98, 116, 121, 112, 101, 34, 58, 34, 112, 108, 97, 105, 110, 34, 44, 34, 112, 97, 114, 97, 109, 101, 116, 101, 114, 115, 34, 58, 123, 34, 99, 104, 97, 114, 115, 101, 116, 34, 58, 34, 85, 84, 70, 45, 56, 34, 125, 44, 34, 113, 117, 97, 108, 105, 116, 121, 86, 97, 108, 117, 101, 34, 58, 49, 46, 48, 44, 34, 99, 104, 97, 114, 115, 101, 116, 34, 58, 34, 85, 84, 70, 45, 56, 34, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 84, 121, 112, 101, 34, 58, 102, 97, 108, 115, 101, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 83, 117, 98, 116, 121, 112, 101, 34, 58, 102, 97, 108, 115, 101, 44, 34, 99, 111, 110, 99, 114, 101, 116, 101, 34, 58, 116, 114, 117, 101, 125]), RecordHeader(key = user-agent, value = [34, 77, 111, 122, 105, 108, 108, 97, 47, 53, 46, 48, 32, 40, 99, 111, 109, 112, 97, 116, 105, 98, 108, 101, 59, 32, 77, 83, 73, 69, 32, 57, 46, 48, 59, 32, 87, 105, 110, 100, 111, 119, 115, 32, 78, 84, 32, 54, 46, 49, 59, 32, 84, 114, 105, 100, 101, 110, 116, 47, 53, 46, 48, 41, 34]), RecordHeader(key = accept, value = [123, 34, 116, 121, 112, 101, 34, 58, 34, 42, 34, 44, 34, 115, 117, 98, 116, 121, 112, 101, 34, 58, 34, 42, 34, 44, 34, 112, 97, 114, 97, 109, 101, 116, 101, 114, 115, 34, 58, 123, 125, 44, 34, 113, 117, 97, 108, 105, 116, 121, 86, 97, 108, 117, 101, 34, 58, 49, 46, 48, 44, 34, 99, 104, 97, 114, 115, 101, 116, 34, 58, 110, 117, 108, 108, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 84, 121, 112, 101, 34, 58, 116, 114, 117, 101, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 83, 117, 98, 116, 121, 112, 101, 34, 58, 116, 114, 117, 101, 44, 34, 99, 111, 110, 99, 114, 101, 116, 101, 34, 58, 102, 97, 108, 115, 101, 125]), RecordHeader(key = spring_json_header_types, value = [123, 34, 114, 101, 102, 101, 114, 101, 114, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 99, 111, 110, 116, 101, 110, 116, 45, 108, 101, 110, 103, 116, 104, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 76, 111, 110, 103, 34, 44, 34, 104, 116, 116, 112, 95, 114, 101, 113, 117, 101, 115, 116, 77, 101, 116, 104, 111, 100, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 104, 111, 115, 116, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 104, 116, 116, 112, 95, 114, 101, 113, 117, 101, 115, 116, 85, 114, 108, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 99, 111, 110, 116, 101, 110, 116, 84, 121, 112, 101, 34, 58, 34, 111, 114, 103, 46, 115, 112, 114, 105, 110, 103, 102, 
114, 97, 109, 101, 119, 111, 114, 107, 46, 104, 116, 116, 112, 46, 77, 101, 100, 105, 97, 84, 121, 112, 101, 34, 44, 34, 117, 115, 101, 114, 45, 97, 103, 101, 110, 116, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 97, 99, 99, 101, 112, 116, 34, 58, 34, 111, 114, 103, 46, 115, 112, 114, 105, 110, 103, 102, 114, 97, 109, 101, 119, 111, 114, 107, 46, 104, 116, 116, 112, 46, 77, 101, 100, 105, 97, 84, 121, 112, 101, 34, 125])], isReadOnly = false), key = null, value = [B#4bc28689)
org.springframework.messaging.MessageHandlingException: nested exception is org.springframework.expression.spel.SpelEvaluationException: EL1004E: Method call: Method toUpperCase() cannot be found on type byte[]
at org.springframework.integration.handler.MethodInvokingMessageProcessor.processMessage(MethodInvokingMessageProcessor.java:107) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.handler.ServiceActivatingHandler.handleRequestMessage(ServiceActivatingHandler.java:93) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:109) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:158) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:116) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:132) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:105) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:73) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:445) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:394) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:181) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:160) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:47) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:108) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
at org.springframework.integration.endpoint.MessageProducerSupport.sendMessage(MessageProducerSupport.java:203) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter.access$300(KafkaMessageDrivenChannelAdapter.java:70) ~[spring-integration-kafka-3.0.3.RELEASE.jar!/:3.0.3.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:387) ~[spring-integration-kafka-3.0.3.RELEASE.jar!/:3.0.3.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:364) ~[spring-integration-kafka-3.0.3.RELEASE.jar!/:3.0.3.RELEASE]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.lambda$onMessage$0(RetryingMessageListenerAdapter.java:120) ~[spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter$$Lambda$659/1406308390.doWithRetry(Unknown Source) ~[na:na]
at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:287) ~[spring-retry-1.2.2.RELEASE.jar!/:na]
at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:211) ~[spring-retry-1.2.2.RELEASE.jar!/:na]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:114) ~[spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:40) ~[spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:1071) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:1051) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:998) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:866) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:724) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_45]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_45]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
Since I've provided the Content-Type header in the HTTP request, and after reading this blog post, I assumed that during message conversion the payload of the message (I understand the default wire format for Kafka is byte[]) would be converted to a String representation. However, the type of the Message.payload that TransformProcessorConfiguration.transform receives is still byte[].
Does this behavior have something to do with the fact that the Content-Type header appears as a NonTrustedHeaderType in the MessagingMessageConverter.toMessage() call? Stepping through with the debugger shows the following for the contentType header:
headerValue = {"type":"text","subtype":"plain","parameters":{"charset":"UTF-8"},"qualityValue":1.0,"charset":"UTF-8","wildcardType":false,"wildcardSubtype":false,"concrete":true}
untrustedType = "org.springframework.http.MediaType"
This is the list of rawHeaders that the MessagingMessageConverter resolves:
"referer"->"http://localhost:8080"
"content-length"->"17"
"http_requestMethod"->"POST"
"kafka_timestampType"->"CREATE_TIME"
"kafka_receivedMessageKey"->"null"
"kafka_receivedTopic"->"edded.http"
"accept"->"NonTrustedHeaderType
"kafka_offset"->"1"
"scst_nativeHeadersPresent"->"true"
"kafka_consumer"->
"host"->"localhost:9191"
"http_requestUrl"->"http://localhost:9191/"
"kafka_receivedPartitionId"->"0"
"contentType"->"NonTrustedHeaderType
"kafka_receivedTimestamp"->"1531296520235"
"user-agent"->"Mozilla/5.0
Another potentially related issue that I found is described here. However, I have no clue how to control the mapper's trustedPackages via binder properties, if that is related to my problem at all.
I also tried setting app.*.spring.cloud.stream.bindings.input.producer.headerMode=raw in the deployment properties, but it did not have any effect.
Thanks!
Actually, the blog you pointed to should not lead to the assumption that there will be conversion based on the content-type header. Conversion is done only based on the type required by the handler, and if that type is generic (i.e., Object) or byte[], no conversion will be performed. What is the signature of the TransformProcessorConfiguration.transform(..) method? Also, if you are attempting to do any kind of SpEL evaluation on the payload, you must assume that it is always a byte[], since conversion happens only when a handler method is about to be invoked; so if you are using an expression or condition on the payload and assuming String, don't.
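Following that advice, one way to make the original pipeline work (not from the answer above, but consistent with it: convert the byte[] yourself inside the expression) would be:
http --port=9191 | transform --expression="new String(payload).toUpperCase()" | log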

How to play streaming MP3 messages (uint8 bytes) generated by ROS audio_capture in HTML with the Web Audio API

I am trying to play streaming MP3 messages (uint8 bytes) generated by ROS audio_capture in HTML using the Web Audio API. The messages are published by ROS audio_capture as uint8 bytes; I am grabbing each message and trying to play it in the web browser.
Below is a sample MP3 message:
data: [255, 243, 152, 68, 193, 22, 185, 255, 114, 223, 97, 38, 124, 172, 27, 246, 218, 254, 194, 80, 189, 84, 180, 209, 56, 254, 114, 135, 214, 30, 151, 78, 75, 159, 104, 148, 93, 149, 152, 155, 66, 68, 77, 137, 197, 34, 132, 145, 8, 138, 163, 10, 159, 17, 160, 23, 78, 4, 43, 174, 73, 201, 40, 63, 17, 28, 209, 153, 70, 81, 180, 42, 180, 226, 199, 155, 79, 189, 189, 51, 135, 145, 183, 139, 206, 179, 81, 185, 55, 158, 158, 89, 168, 133, 27, 112, 101, 139, 143, 24, 73, 193, 5, 201, 195, 32, 81, 205, 100, 189, 224, 214, 122, 225, 61, 221, 74, 155, 134, 73, 229, 200, 24, 71, 83, 118, 86, 245, 166, 149, 53, 187, 113, 23, 17, 124, 43, 40, 232, 206, 60, 171, 168, 41, 158, 44, 112, 218, 54, 123, 29, 123, 163, 99, 79, 94, 174, 49, 242, 214, 50, 238, 153, 95, 57, 221, 223, 129, 247, 35, 157, 50, 77, 52, 69, 119, 110, 70, 44, 83, 84, 85, 141, 220, 190, 12, 85, 81, 24, 2, 122, 169, 36, 150, 238, 0, 232, 21, 230, 42, 96, 104, 226, 58, 78, 169, 244, 172, 228, 145, 76, 173, 170, 129, 62, 173, 50, 73, 91, 246, 17, 201, 6, 36, 101, 70, 69, 42, 146, 182, 124, 54, 11, 164, 78, 43, 44, 166, 188, 233, 96, 170, 171, 123, 66, 202, 107, 172, 152, 150, 143, 54, 68, 85, 70, 15, 3, 147, 69, 205, 25, 64, 210, 175, 185, 48, 165, 166, 94, 101, 85, 3, 36, 66, 99, 232, 153, 102, 196, 75, 152, 133, 79, 206, 202, 226, 158, 140, 158, 69, 221, 1, 70, 217, 193, 204, 138, 48, 138, 76, 127, 88, 211, 52, 173, 249, 85, 63, 201, 204, 210, 86, 42, 28, 135, 255, 187, 122, 220, 51, 27, 45, 238, 104, 73, 101, 165, 63, 190, 110, 102, 247, 66, 218, 251, 79, 131, 181, 44, 92, 167, 133, 160, 155, 228, 102, 119, 151, 159, 105, 85, 235, 145, 60, 252, 125, 120, 73, 211, 98]
Below is the raw data received from the audio source before base64 decoding; let me know if it's not a valid MP3 format.
//OYRM0VceN1L2EjfqzL/sJWwwydElS3xJYcJKpYaLXtfXvddaz6w5T+8w8/p6du6oWOCT6zoDp+zx9GnhKa8j0MGzMgZEs3bs5Z+x0InDcmNueXnI5J+eVwfp9emvuEZ2Oy2mNa7P3YYm+eK90yG6ktKsaInN/SpHvPlpxWo92+lBsUa9GHQagbEyziEbey/u8pi28oYYYW6B5mt0jZk1Ob7IFLsMIKSlOT3W2WXmcPS1Mhc0KIUiDkQ4IelkEbh6+OfQ/2llZ0karuqgLm5UZmFiYKwpE4wJMp0skT+cEmnaGUBIwvvpdXGyMCIEEEJAuQRTlJZtZoV0xGzMpG5tM+6ae6QKo5vTV2cmIBtiktmtOaqyQgxndQE/iCQQLCgpB8HVYUHBB0ldKDBBtGQKKkMKijjKVRoZIC/TYqVBS48IOBsoQsTiGpCygM3vbJNoZglxABBQerkwRDi7ipiISrDbitccaj
I hold it in a JavaScript array initially, then convert it to an ArrayBuffer and pass the ArrayBuffer to decodeAudioData(), where I get the error below:
Error in promise : Unable to decode Audio Data
Below is the code:
// Assumes an AudioContext created elsewhere, e.g.:
// var context = new (window.AudioContext || window.webkitAudioContext)();
var buf;

function playByteArray(byteArray) {
    var arrayBuffer = new ArrayBuffer(byteArray.length);
    var bufferView = new Uint8Array(arrayBuffer);
    for (var i = 0; i < byteArray.length; i++) {
        bufferView[i] = byteArray[i];
    }
    context.decodeAudioData(arrayBuffer, function(buffer) {
        buf = buffer;
        play();
    });
}

// Play the loaded file
function play() {
    // Create a source node from the buffer
    var source = context.createBufferSource();
    source.buffer = buf;
    // Connect to the final output node (the speakers)
    source.connect(context.destination);
    // Play immediately
    source.start(0);
}
Thanks in advance for help.
Your chunk of audio data probably isn't complete. In any case, you should consider using MediaSource Extensions for this task. That way, you can stream from the source and let the browser handle the rest.
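A minimal sketch of that approach, assuming the MP3 bytes arrive in order as plain byte arrays (the audio element id "player" and the appendChunk hook are hypothetical; a production version would also queue appends while sourceBuffer.updating is true):
var audio = document.getElementById('player');
var mediaSource = new MediaSource();
audio.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
    var sourceBuffer = mediaSource.addSourceBuffer('audio/mpeg');
    // Call this with each ROS message's byte array as it arrives,
    // e.g. audioTopic.subscribe(function (msg) { appendChunk(msg.data); });
    window.appendChunk = function (byteArray) {
        sourceBuffer.appendBuffer(new Uint8Array(byteArray));
    };
    audio.play();
});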

Elasticsearch + Oracle JDBC River

Maybe this is a simple question, but I'm new to Elasticsearch. I installed ES 1.4; Oracle 10g is up and running; the JDBC plugin loaded in ES without issue. Marvel is also working. I tried to create a river with this statement in Marvel/Sense:
PUT _river/mydata/_meta
{
  "type": "jdbc",
  "jdbc": {
    "driver": "oracle.jdbc.OracleDriver",
    "url": “jdbc:oracle:thin:@host:1521:SID",
    "user": “oracleusr",
    "password": “ oraclepass",
    "index": “myindex",
    "type": “mytype",
    "sql": "select * from aTable"
  }
}
And I get this error all the time:
{
"error": "MapperParsingException[failed to parse]; nested: ElasticsearchParseException[Failed to derive xcontent from (offset=0, length=323): [80, 85, 84, 32, 95, 114, 105, 118, 101, 114, 47, 109, 52, 99, 47, 95, 109, 101, 116, 97, 32, 10, 123, 10, 32, 32, 34, 116, 121, 112, 101, 34, 58, 32, 34, 106, 100, 98, 99, 34, 44, 10, 32, 32, 34, 106, 100, 98, 99, 34, 58, 32, 123, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 100, 114, 105, 118, 101, 114, 34, 58, 32, 34, 111, 114, 97, 99, 108, 101, 46, 106, 100, 98, 99, 46, 79, 114, 97, 99, 108, 101, 68, 114, 105, 118, 101, 114, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 117, 114, 108, 34, 58, 32, -30, -128, -100, 106, 100, 98, 99, 58, 111, 114, 97, 99, 108, 101, 58, 116, 104, 105, 110, 58, 64, 49, 48, 46, 49, 57, 52, 46, 49, 55, 46, 49, 55, 51, 58, 49, 53, 50, 49, 58, 109, 52, 99, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 117, 115, 101, 114, 34, 58, 32, -30, -128, -100, 109, 52, 99, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 112, 97, 115, 115, 119, 111, 114, 100, 34, 58, 32, -30, -128, -100, 32, 109, 52, 99, 49, 50, 48, 57, 48, 53, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 105, 110, 100, 101, 120, 34, 58, 32, -30, -128, -100, 109, 52, 99, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 116, 121, 112, 101, 34, 58, 32, -30, -128, -100, 112, 97, 105, 115, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 115, 113, 108, 34, 58, 32, 34, 115, 101, 108, 101, 99, 116, 32, 42, 32, 102, 114, 111, 109, 32, 77, 52, 67, 80, 65, 73, 83, 34, 10, 32, 32, 32, 32, 32, 32, 125, 10, 125, 10]]; ",
"status": 400
}
On the CLI I get:
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:562)
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:490)
at org.elasticsearch.index.shard.service.InternalIndexShard.prepareIndex(InternalIndexShard.java:413)
at org.elasticsearch.action.index.TransportIndexAction.shardOperationOnPrimary(TransportIndexAction.java:189)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:511)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:419)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.ElasticsearchParseException: Failed to derive xcontent from (offset=0, length=323)
I first thought it was a connection problem, but from checking around it seems to be a problem with the mapping between ES and Oracle. Has anyone done an ES + Oracle river integration? Any help would be really appreciated.
I found a solution that worked out really straightforward. Instead of putting the command in Marvel/Sense, what I did is keep a JSON config file like this:
{
  "type" : "jdbc",
  "jdbc" : {
    "strategy" : "oneshot",
    "driver" : "oracle.jdbc.OracleDriver",
    "url" : "jdbc:oracle:thin:@host:1521:DNS",
    "user" : "user",
    "password" : "password",
    "sql" : "select * from aTable",
    "poll" : "1h",
    "scale" : 0,
    "autocommit" : false,
    "fetchsize" : 100,
    "max_rows" : 0,
    "max_retries" : 3,
    "max_retries_wait" : "10s"
  },
  "index" : {
    "index" : "aIndex",
    "type" : "aType",
    "bulk_size" : 100
  }
}
and then call it with curl like this:
curl -XPUT 'http://127.0.0.1:9200/_river/jdbcriver/_meta' -d @config.json
I don't know if it was a parser issue or what, but this way it worked like a charm. (Judging by the error above, the byte sequence -30, -128, -100 is the UTF-8 encoding of a curly opening quote “, so the body pasted into Marvel/Sense most likely contained smart quotes that the JSON parser rejected; going through a plain-text config file avoids them.)
Thanks
