Message conversion in SCDF on Kafka and NonTrustedHeaders

I am having a hard time figuring out how to get a simple SCDF pipeline functional.
I am using a local setup:
{
  "versionInfo": {
    "implementation": {
      "name": "spring-cloud-dataflow-server-local",
      "version": "1.6.0.BUILD-SNAPSHOT"
    },
    "core": {
      "name": "Spring Cloud Data Flow Core",
      "version": "1.6.0.BUILD-SNAPSHOT"
    },
    "dashboard": {
      "name": "Spring Cloud Dataflow UI",
      "version": "1.6.0.M1"
    },
    "shell": {
      "name": "Spring Cloud Data Flow Shell",
      "version": "1.6.0.BUILD-SNAPSHOT",
      "url": "https://repo.spring.io/libs-snapshot/org/springframework/cloud/spring-cloud-dataflow-shell/1.6.0.BUILD-SNAPSHOT/spring-cloud-dataflow-shell-1.6.0.BUILD-SNAPSHOT.jar"
    }
  },
  "featureInfo": {
    "analyticsEnabled": true,
    "streamsEnabled": true,
    "tasksEnabled": true,
    "skipperEnabled": false
  },
  "securityInfo": {
    "isAuthenticationEnabled": false,
    "isAuthorizationEnabled": false,
    "isFormLogin": false,
    "isAuthenticated": false,
    "username": null,
    "roles": []
  },
  "runtimeEnvironment": {
    "appDeployer": {
      "platformSpecificInfo": {},
      "deployerImplementationVersion": "1.3.7.RELEASE",
      "deployerName": "LocalAppDeployer",
      "deployerSpiVersion": "1.3.2.RELEASE",
      "javaVersion": "1.8.0_45",
      "platformApiVersion": "Mac OS X 10.13.4",
      "platformClientVersion": "10.13.4",
      "platformHostVersion": "10.13.4",
      "platformType": "Local",
      "springBootVersion": "1.5.14.RELEASE",
      "springVersion": "4.3.18.RELEASE"
    },
    "taskLauncher": {
      "platformSpecificInfo": {},
      "deployerImplementationVersion": "1.3.7.RELEASE",
      "deployerName": "LocalTaskLauncher",
      "deployerSpiVersion": "1.3.2.RELEASE",
      "javaVersion": "1.8.0_45",
      "platformApiVersion": "Mac OS X 10.13.4",
      "platformClientVersion": "10.13.4",
      "platformHostVersion": "10.13.4",
      "platformType": "Local",
      "springBootVersion": "1.5.14.RELEASE",
      "springVersion": "4.3.18.RELEASE"
    }
  }
}
The pipeline is pretty simple:
http --port=9191 | transform --expression=payload.toUpperCase() | log
When I trigger the http endpoint with cURL like this:
curl -v -H"Referer: http://localhost:8080" -H"Content-Type: text/plain" -XPOST localhost:9191/ -d 'test'
I see the following error message in the logfile of the transform processor:
2018-07-11 09:56:59.758 ERROR 66396 --- [container-0-C-1] o.s.kafka.listener.LoggingErrorHandler : Error while processing: ConsumerRecord(topic = edded.http, partition = 0, offset = 0, CreateTime = 1531295816669, serialized key size = -1, serialized value size = 17, headers = RecordHeaders(headers = [RecordHeader(key = referer, value = [34, 104, 116, 116, 112, 58, 47, 47, 108, 111, 99, 97, 108, 104, 111, 115, 116, 58, 56, 48, 56, 48, 34]), RecordHeader(key = content-length, value = [49, 55]), RecordHeader(key = http_requestMethod, value = [34, 80, 79, 83, 84, 34]), RecordHeader(key = host, value = [34, 108, 111, 99, 97, 108, 104, 111, 115, 116, 58, 57, 49, 57, 49, 34]), RecordHeader(key = http_requestUrl, value = [34, 104, 116, 116, 112, 58, 47, 47, 108, 111, 99, 97, 108, 104, 111, 115, 116, 58, 57, 49, 57, 49, 47, 34]), RecordHeader(key = contentType, value = [123, 34, 116, 121, 112, 101, 34, 58, 34, 116, 101, 120, 116, 34, 44, 34, 115, 117, 98, 116, 121, 112, 101, 34, 58, 34, 112, 108, 97, 105, 110, 34, 44, 34, 112, 97, 114, 97, 109, 101, 116, 101, 114, 115, 34, 58, 123, 34, 99, 104, 97, 114, 115, 101, 116, 34, 58, 34, 85, 84, 70, 45, 56, 34, 125, 44, 34, 113, 117, 97, 108, 105, 116, 121, 86, 97, 108, 117, 101, 34, 58, 49, 46, 48, 44, 34, 99, 104, 97, 114, 115, 101, 116, 34, 58, 34, 85, 84, 70, 45, 56, 34, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 84, 121, 112, 101, 34, 58, 102, 97, 108, 115, 101, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 83, 117, 98, 116, 121, 112, 101, 34, 58, 102, 97, 108, 115, 101, 44, 34, 99, 111, 110, 99, 114, 101, 116, 101, 34, 58, 116, 114, 117, 101, 125]), RecordHeader(key = user-agent, value = [34, 77, 111, 122, 105, 108, 108, 97, 47, 53, 46, 48, 32, 40, 99, 111, 109, 112, 97, 116, 105, 98, 108, 101, 59, 32, 77, 83, 73, 69, 32, 57, 46, 48, 59, 32, 87, 105, 110, 100, 111, 119, 115, 32, 78, 84, 32, 54, 46, 49, 59, 32, 84, 114, 105, 100, 101, 110, 116, 47, 53, 46, 48, 41, 34]), RecordHeader(key = accept, value = [123, 34, 116, 121, 112, 101, 34, 58, 34, 42, 34, 44, 34, 115, 117, 98, 116, 121, 112, 101, 34, 58, 34, 42, 34, 44, 34, 112, 97, 114, 97, 109, 101, 116, 101, 114, 115, 34, 58, 123, 125, 44, 34, 113, 117, 97, 108, 105, 116, 121, 86, 97, 108, 117, 101, 34, 58, 49, 46, 48, 44, 34, 99, 104, 97, 114, 115, 101, 116, 34, 58, 110, 117, 108, 108, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 84, 121, 112, 101, 34, 58, 116, 114, 117, 101, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 83, 117, 98, 116, 121, 112, 101, 34, 58, 116, 114, 117, 101, 44, 34, 99, 111, 110, 99, 114, 101, 116, 101, 34, 58, 102, 97, 108, 115, 101, 125]), RecordHeader(key = spring_json_header_types, value = [123, 34, 114, 101, 102, 101, 114, 101, 114, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 99, 111, 110, 116, 101, 110, 116, 45, 108, 101, 110, 103, 116, 104, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 76, 111, 110, 103, 34, 44, 34, 104, 116, 116, 112, 95, 114, 101, 113, 117, 101, 115, 116, 77, 101, 116, 104, 111, 100, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 104, 111, 115, 116, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 104, 116, 116, 112, 95, 114, 101, 113, 117, 101, 115, 116, 85, 114, 108, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 99, 111, 110, 116, 101, 110, 116, 84, 121, 112, 101, 34, 58, 34, 111, 114, 103, 46, 115, 112, 114, 105, 110, 103, 102, 
114, 97, 109, 101, 119, 111, 114, 107, 46, 104, 116, 116, 112, 46, 77, 101, 100, 105, 97, 84, 121, 112, 101, 34, 44, 34, 117, 115, 101, 114, 45, 97, 103, 101, 110, 116, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 97, 99, 99, 101, 112, 116, 34, 58, 34, 111, 114, 103, 46, 115, 112, 114, 105, 110, 103, 102, 114, 97, 109, 101, 119, 111, 114, 107, 46, 104, 116, 116, 112, 46, 77, 101, 100, 105, 97, 84, 121, 112, 101, 34, 125])], isReadOnly = false), key = null, value = [B#4bc28689)
org.springframework.messaging.MessageHandlingException: nested exception is org.springframework.expression.spel.SpelEvaluationException: EL1004E: Method call: Method toUpperCase() cannot be found on type byte[]
at org.springframework.integration.handler.MethodInvokingMessageProcessor.processMessage(MethodInvokingMessageProcessor.java:107) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.handler.ServiceActivatingHandler.handleRequestMessage(ServiceActivatingHandler.java:93) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:109) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:158) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:116) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:132) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:105) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:73) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:445) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:394) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:181) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:160) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:47) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:108) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
at org.springframework.integration.endpoint.MessageProducerSupport.sendMessage(MessageProducerSupport.java:203) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter.access$300(KafkaMessageDrivenChannelAdapter.java:70) ~[spring-integration-kafka-3.0.3.RELEASE.jar!/:3.0.3.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:387) ~[spring-integration-kafka-3.0.3.RELEASE.jar!/:3.0.3.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:364) ~[spring-integration-kafka-3.0.3.RELEASE.jar!/:3.0.3.RELEASE]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.lambda$onMessage$0(RetryingMessageListenerAdapter.java:120) ~[spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter$$Lambda$659/1406308390.doWithRetry(Unknown Source) ~[na:na]
at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:287) ~[spring-retry-1.2.2.RELEASE.jar!/:na]
at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:211) ~[spring-retry-1.2.2.RELEASE.jar!/:na]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:114) ~[spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:40) ~[spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:1071) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:1051) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:998) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:866) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:724) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_45]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_45]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
Since I've provided the Content-Type header in the HTTP request, and after reading this blog post, I assumed that during message conversion the payload of the message (I understand the default wire format for Kafka is byte[]) would be converted to a String representation. However, the type of the Message.payload that TransformProcessorConfiguration.transform receives is still byte[].
Does this behavior have something to do with the fact that the Content-Type header appears as a NonTrustedHeaderType in the MessagingMessageConverter.toMessage() call? Stepping through with the debugger shows the following for the contentType header:
headerValue = {"type":"text","subtype":"plain","parameters":{"charset":"UTF-8"},"qualityValue":1.0,"charset":"UTF-8","wildcardType":false,"wildcardSubtype":false,"concrete":true}
untrustedType = "org.springframework.http.MediaType"
This is the list of rawHeaders that the MessagingMessageConverter resolves:
"referer"->"http://localhost:8080"
"content-length"->"17"
"http_requestMethod"->"POST"
"kafka_timestampType"->"CREATE_TIME"
"kafka_receivedMessageKey"->"null"
"kafka_receivedTopic"->"edded.http"
"accept"->"NonTrustedHeaderType
"kafka_offset"->"1"
"scst_nativeHeadersPresent"->"true"
"kafka_consumer"->
"host"->"localhost:9191"
"http_requestUrl"->"http://localhost:9191/"
"kafka_receivedPartitionId"->"0"
"contentType"->"NonTrustedHeaderType
"kafka_receivedTimestamp"->"1531296520235"
"user-agent"->"Mozilla/5.0
Another potentially related issue that I found is described here. However, I have no clue how to control the mapper's trustedPackages via binder properties, if that is at all related to my problem.
I also tried setting app.*.spring.cloud.stream.bindings.input.producer.headerMode=raw in the deployment properties, but it did not have any effect.
Thanks!

Actually, the blog post you pointed to should not lead to the assumption that conversion happens based on the content-type header. Conversion is performed only based on the type required by the handler, and if that type is generic (i.e., Object) or byte[], no conversion is performed. What is the signature of the TransformProcessorConfiguration.transform(..) method? Also, if you are attempting any kind of SpEL evaluation on the payload, you must assume that it is always a byte[], since conversion happens only when a handler method is about to be invoked; so if you use an expression or condition on the payload and assume String, don't.
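To illustrate the signature point (a minimal sketch of a custom Spring Cloud Stream processor, not the actual TransformProcessorConfiguration source): declaring the payload parameter as String forces the framework to convert the incoming byte[] using the contentType header, whereas a byte[] or Object parameter receives the raw bytes.
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.integration.annotation.Transformer;

@EnableBinding(Processor.class)
public class UpperCaseProcessor {

    // The String parameter type triggers payload conversion before invocation;
    // byte[] or Object here would leave the payload untouched.
    @Transformer(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
    public String transform(String payload) {
        return payload.toUpperCase();
    }
}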

Related

Spring Cloud Data Flow urlExpression of httpclient cannot be set properly

I parse image URLs from the JSON produced by Twitter with Spring Cloud Data Flow, and I'd like to download the images with httpclient.
Here is the pipeline:
twitterstream --twitter.credentials.consumerKey=*** --twitter.credentials.consumerSecret=*** --twitter.credentials.accessToken=*** --twitter.credentials.accessTokenSecret=*** | splitter --expression=#jsonPath(payload,'$.entities.media[*].media_url') | httpclient --httpclient.httpMethod=GET --httpclient.urlExpression=payload | log
If I exclude the httpclient, the following line appears in the log, so I suppose the URL extraction is successful and the httpclient receives the URL.
2019-12-21 12:46:23.120 INFO 1 --- [container-0-C-1] log-sink : http://pbs.twimg.com/media/EMBcr-XVAAARY7x.png
I get the following exception from the httpclient (URI is not absolute):
2019-12-21 19:17:06.741 ERROR 1 --- [container-0-C-1] o.s.kafka.listener.LoggingErrorHandler : Error while processing: ConsumerRecord(topic = test.splitter, partition = 0, offset = 74, CreateTime = 1576954352460, serialized key size = -1, serialized value size = 87, headers = RecordHeaders(headers = [RecordHeader(key = sequenceNumber, value = [49]), RecordHeader(key = sequenceSize, value = [49]), RecordHeader(key = deliveryAttempt, value = [49]), RecordHeader(key = scst_nativeHeadersPresent, value = [116, 114, 117, 101]), RecordHeader(key = correlationId, value = [34, 56, 98, 57, 100, 56, 99, 57, 99, 45, 49, 100, 52, 50, 45, 55, 97, 101, 54, 45, 50, 54, 49, 55, 45, 50, 98, 102, 52, 101, 51, 100, 99, 98, 57, 55, 50, 34]), RecordHeader(key = contentType, value = [34, 97, 112, 112, 108, 105, 99, 97, 116, 105, 111, 110, 47, 106, 115, 111, 110, 34]), RecordHeader(key = spring_json_header_types, value = [123, 34, 115, 101, 113, 117, 101, 110, 99, 101, 78, 117, 109, 98, 101, 114, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 73, 110, 116, 101, 103, 101, 114, 34, 44, 34, 115, 99, 115, 116, 95, 110, 97, 116, 105, 118, 101, 72, 101, 97, 100, 101, 114, 115, 80, 114, 101, 115, 101, 110, 116, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 66, 111, 111, 108, 101, 97, 110, 34, 44, 34, 115, 101, 113, 117, 101, 110, 99, 101, 83, 105, 122, 101, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 73, 110, 116, 101, 103, 101, 114, 34, 44, 34, 100, 101, 108, 105, 118, 101, 114, 121, 65, 116, 116, 101, 109, 112, 116, 34, 58, 34, 106, 97, 118, 97, 46, 117, 116, 105, 108, 46, 99, 111, 110, 99, 117, 114, 114, 101, 110, 116, 46, 97, 116, 111, 109, 105, 99, 46, 65, 116, 111, 109, 105, 99, 73, 110, 116, 101, 103, 101, 114, 34, 44, 34, 99, 111, 114, 114, 101, 108, 97, 116, 105, 111, 110, 73, 100, 34, 58, 34, 106, 97, 118, 97, 46, 117, 116, 105, 108, 46, 85, 85, 73, 68, 34, 44, 34, 99, 111, 110, 116, 101, 110, 116, 84, 121, 112, 101, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 125])], isReadOnly = false), key = null, value = [B#73d60921)
org.springframework.integration.transformer.MessageTransformationException: Failed to transform Message; nested exception is org.springframework.messaging.MessageHandlingException: nested exception is java.lang.IllegalArgumentException: URI is not absolute, failedMessage=GenericMessage [payload=byte[87], headers={sequenceNumber=1, sequenceSize=1, deliveryAttempt=3, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=null, kafka_receivedTopic=test.splitter, kafka_offset=74, scst_nativeHeadersPresent=true, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer#103e165f, correlationId=8b9d8c9c-1d42-7ae6-2617-2bf4e3dcb972, kafka_receivedPartitionId=0, contentType=application/json, kafka_receivedTimestamp=1576954352460}]
at org.springframework.integration.transformer.MessageTransformingHandler.handleRequestMessage(MessageTransformingHandler.java:115) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:123) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:169) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:115) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:132) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:105) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:73) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:453) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:401) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:187) ~[spring-messaging-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:166) ~[spring-messaging-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:47) ~[spring-messaging-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:109) ~[spring-messaging-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.integration.endpoint.MessageProducerSupport.sendMessage(MessageProducerSupport.java:205) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter.sendMessageIfAny(KafkaMessageDrivenChannelAdapter.java:369) ~[spring-integration-kafka-3.1.0.RELEASE.jar!/:3.1.0.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter.access$400(KafkaMessageDrivenChannelAdapter.java:74) ~[spring-integration-kafka-3.1.0.RELEASE.jar!/:3.1.0.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:431) ~[spring-integration-kafka-3.1.0.RELEASE.jar!/:3.1.0.RELEASE]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:402) ~[spring-integration-kafka-3.1.0.RELEASE.jar!/:3.1.0.RELEASE]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.lambda$onMessage$0(RetryingMessageListenerAdapter.java:120) ~[spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:287) ~[spring-retry-1.2.4.RELEASE.jar!/:na]
at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:211) ~[spring-retry-1.2.4.RELEASE.jar!/:na]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:114) ~[spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:40) ~[spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:1278) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:1261) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:1222) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:1203) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:1123) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:938) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:751) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:700) [spring-kafka-2.2.8.RELEASE.jar!/:2.2.8.RELEASE]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_192]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_192]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_192]
Caused by: org.springframework.messaging.MessageHandlingException: nested exception is java.lang.IllegalArgumentException: URI is not absolute
at org.springframework.integration.handler.LambdaMessageProcessor.processMessage(LambdaMessageProcessor.java:111) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.transformer.AbstractMessageProcessingTransformer.transform(AbstractMessageProcessingTransformer.java:113) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
at org.springframework.integration.transformer.MessageTransformingHandler.handleRequestMessage(MessageTransformingHandler.java:109) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
... 33 common frames omitted
Caused by: java.lang.IllegalArgumentException: URI is not absolute
at java.net.URI.toURL(URI.java:1088) ~[na:1.8.0_192]
at org.springframework.http.client.SimpleClientHttpRequestFactory.createRequest(SimpleClientHttpRequestFactory.java:145) ~[spring-web-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.http.client.support.HttpAccessor.createRequest(HttpAccessor.java:87) ~[spring-web-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:731) ~[spring-web-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.web.client.RestTemplate.exchange(RestTemplate.java:637) ~[spring-web-5.1.9.RELEASE.jar!/:5.1.9.RELEASE]
at org.springframework.cloud.stream.app.httpclient.processor.HttpclientProcessorFunctionConfiguration.lambda$httpRequest$0(HttpclientProcessorFunctionConfiguration.java:102) ~[spring-cloud-starter-stream-processor-httpclient-2.1.2.RELEASE.jar!/:2.1.2.RELEASE]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_192]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_192]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_192]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_192]
at org.springframework.integration.handler.LambdaMessageProcessor.processMessage(LambdaMessageProcessor.java:102) ~[spring-integration-core-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
... 35 common frames omitted
I think the problem is with the --httpclient.urlExpression=payload parameter, where I try to reference the URL via the payload keyword. What exactly is the problem?
UPDATE:
I think the problem is that the payload is a byte array of ASCII character codes. How can I create a string from that?
The problem was that the payload is a byte array of ASCII codes, and I solved the issue by setting --httpclient.urlExpression='new String(payload)'. That converts the bytes to a string, but I don't think it's the best solution, so I'm waiting for a better one.
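For reference, the stream definition from above with only the urlExpression changed:
twitterstream --twitter.credentials.consumerKey=*** --twitter.credentials.consumerSecret=*** --twitter.credentials.accessToken=*** --twitter.credentials.accessTokenSecret=*** | splitter --expression=#jsonPath(payload,'$.entities.media[*].media_url') | httpclient --httpclient.httpMethod=GET --httpclient.urlExpression='new String(payload)' | log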

springboot-kafka java 8 time serialization

Currently I am working with Spring Boot 2.0.4 and spring-kafka 2.1.8.RELEASE.
I wanted to simplify the interchange a bit by sending objects to the Kafka template, using JSON as the format. Some of the messages that need to be deserialized, however, contain java.time.LocalDateTime. So my setup is:
Config (application.yml):
spring:
  jackson:
    serialization:
      write_dates_as_timestamps: false
  kafka:
    consumer:
      group-id: foo
      enable-auto-commit: true
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        spring.json.trusted.packages: my.package
    producer:
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      properties:
        spring.json.trusted.packages: my.package
      retries: 3
      acks: all
As for the Jackson dependencies that are supposed to be needed for this to work, my dependency tree is:
[INFO] | | +- com.fasterxml.jackson.core:jackson-databind:jar:2.9.6:compile
[INFO] | | | +- com.fasterxml.jackson.core:jackson-annotations:jar:2.9.0:compile
[INFO] | | | \- com.fasterxml.jackson.core:jackson-core:jar:2.9.6:compile
[INFO] | | \- com.fasterxml.jackson.datatype:jackson-datatype-jsr310:jar:2.9.6:compile
[INFO] | | +- com.fasterxml.jackson.datatype:jackson-datatype-jdk8:jar:2.9.6:compile
[INFO] | | \- com.fasterxml.jackson.module:jackson-module-parameter-names:jar:2.9.6:compile
This, however, produces the following error:
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition Foo-0 at offset 4. If needed, please seek past the record to continue consumption.
Caused by: org.apache.kafka.common.errors.SerializationException: Can't deserialize data [[123, 34, 105, 100, 34, 58, 34, 97, 50, 99, 50, 56, 99, 99, 101, 97, 49, 98, 98, 52, 51, 97, 97, 56, 53, 50, 49, 53, 99, 101, 49, 54, 57, 48, 52, 51, 51, 98, 51, 45, 50, 34, 44, 34, 97, 117, 116, 104, 111, 114, 34, 58, 34, 97, 110, 116, 111, 110, 105, 111, 34, 44, 34, 99, 114, 101, 97, 116, 101, 100, 34, 58, 123, 34, 104, 111, 117, 114, 34, 58, 49, 56, 44, 34, 109, 105, 110, 117, 116, 101, 34, 58, 52, 48, 44, 34, 115, 101, 99, 111, 110, 100, 34, 58, 53, 49, 44, 34, 110, 97, 110, 111, 34, 58, 51, 50, 53, 48, 48, 48, 48, 48, 48, 44, 34, 100, 97, 121, 79, 102, 89, 101, 97, 114, 34, 58, 50, 52, 48, 44, 34, 100, 97, 121, 79, 102, 87, 101, 101, 107, 34, 58, 34, 84, 85, 69, 83, 68, 65, 89, 34, 44, 34, 109, 111, 110, 116, 104, 34, 58, 34, 65, 85, 71, 85, 83, 84, 34, 44, 34, 100, 97, 121, 79, 102, 77, 111, 110, 116, 104, 34, 58, 50, 56, 44, 34, 121, 101, 97, 114, 34, 58, 50, 48, 49, 56, 44, 34, 109, 111, 110, 116, 104, 86, 97, 108, 117, 101, 34, 58, 56, 44, 34, 99, 104, 114, 111, 110, 111, 108, 111, 103, 121, 34, 58, 123, 34, 99, 97, 108, 101, 110, 100, 97, 114, 84, 121, 112, 101, 34, 58, 34, 105, 115, 111, 56, 54, 48, 49, 34, 44, 34, 105, 100, 34, 58, 34, 73, 83, 79, 34, 125, 125, 44, 34, 97, 103, 103, 114, 101, 103, 97, 116, 101, 73, 100, 34, 58, 34, 97, 50, 99, 50, 56, 99, 99, 101, 97, 49, 98, 98, 52, 51, 97, 97, 56, 53, 50, 49, 53, 99, 101, 49, 54, 57, 48, 52, 51, 51, 98, 51, 34, 44, 34, 118, 101, 114, 115, 105, 111, 110, 34, 58, 48, 44, 34, 112, 114, 105, 122, 101, 73, 110, 102, 111, 34, 58, 123, 34, 110, 117, 109, 98, 101, 114, 79, 102, 87, 105, 110, 110, 101, 114, 115, 34, 58, 49, 44, 34, 112, 114, 105, 122, 101, 80, 111, 111, 108, 34, 58, 49, 48, 44, 34, 112, 114, 105, 122, 101, 84, 97, 98, 108, 101, 34, 58, 91, 49, 48, 93, 125, 125]] from topic [Foo]
Caused by: com.fasterxml.jackson.databind.exc.MismatchedInputException: Expected array or string.
at [Source: (byte[])"{"id":"a2c28ccea1bb43aa85215ce1690433b3-2","author":"foo","created":{"hour":18,"minute":40,"second":51,"nano":325000000,"dayOfYear":240,"dayOfWeek":"TUESDAY","month":"AUGUST","dayOfMonth":28,"year":2018,"monthValue":8,"chronology":{"calendarType":"iso8601","id":"ISO"}},"aggregateId":"a2c28ccea1bb43aa85215ce1690433b3","version":0,"prizeInfo":{"numberOfWinners":1,"prizePool":10,"prizeTable":[10]}}"; line: 1, column: 73] (through reference chain: my.package.Foo["created"])
at com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1342) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.DeserializationContext.handleUnexpectedToken(DeserializationContext.java:1138) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.datatype.jsr310.deser.JSR310DeserializerBase._handleUnexpectedToken(JSR310DeserializerBase.java:99) ~[jackson-datatype-jsr310-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer.deserialize(LocalDateTimeDeserializer.java:141) ~[jackson-datatype-jsr310-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer.deserialize(LocalDateTimeDeserializer.java:39) ~[jackson-datatype-jsr310-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.impl.FieldProperty.deserializeAndSet(FieldProperty.java:136) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:369) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:159) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:1611) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1234) ~[jackson-databind-2.9.6.jar:2.9.6]
at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:228) ~[spring-kafka-2.1.8.RELEASE.jar:2.1.8.RELEASE]
at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:923) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.access$2600(Fetcher.java:93) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.fetchRecords(Fetcher.java:1100) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.access$1200(Fetcher.java:949) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:570) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:531) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1154) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1111) ~[kafka-clients-1.0.2.jar:na]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:699) ~[spring-kafka-2.1.8.RELEASE.jar:2.1.8.RELEASE]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_131]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_131]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
Because of this I have tried the following, but none has worked so far:
1. A custom ObjectMapper declared as a bean:
@Bean
public ObjectMapper objectMapper() {
    ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.registerModule(new JavaTimeModule());
    objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
    return objectMapper;
}
2. Serializer annotations on the LocalDateTime fields
To be sure that I have the correct ObjectMapper settings and the necessary dependencies, I created a REST controller that simulates the response as JSON, with an endpoint returning an object with date-time fields; this returns correctly. Sample:
[
  {
    "playerId": "foo",
    "points": 10,
    "entryDateTime": "2018-08-19T09:30:20.051"
  },
  {
    "playerId": "bar",
    "points": 3,
    "entryDateTime": "2018-08-27T09:30:20.051"
  }
]
Using the Json(De)Serializer constructor with the ObjectMapper param worked for me. I was having trouble (de)serializing a POJO that had a java.time.Instant field, so after hours of troubleshooting this same org.apache.kafka.common.errors.SerializationException***, I finally realized (with the help of answers such as those on here) that the issue is not Spring, but Kafka's own serialization. Given the ObjectMapper bean I had, I resolved it by autowiring the mapper into the JsonSerializer and JsonDeserializer parameters of my Kafka producer and consumer set-ups.
@Configuration
public class JacksonConfig {
    @Bean
    @Primary
    public ObjectMapper objectMapper(Jackson2ObjectMapperBuilder builder) {
        ObjectMapper objectMapper = builder.build();
        objectMapper.registerModule(new JavaTimeModule());
        objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
        return objectMapper;
    }
}

@Configuration
public class KafkaProducerConfig {
    @Value(value="${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    @Autowired
    private ObjectMapper objectMapper;

    @Bean
    public KafkaTemplate<String, Order> orderKafkaTemplate(){
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        ProducerFactory<String, Order> producerFactory = new DefaultKafkaProducerFactory<>(props, new StringSerializer(), new JsonSerializer<Order>(objectMapper));
        return new KafkaTemplate<>(producerFactory);
    }
}

@Configuration
public class KafkaConsumerConfig {
    @Value(value="${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    @Value(value="${kafka.consumer.groupId}")
    private String groupId;

    @Autowired
    private ObjectMapper objectMapper;

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Order> orderKafkaListenerContainerFactory(){
        ConcurrentKafkaListenerContainerFactory<String, Order> factory = new ConcurrentKafkaListenerContainerFactory<>();
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        ConsumerFactory<String, Order> consumerFactory = new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(Order.class, objectMapper));
        factory.setConsumerFactory(consumerFactory);
        return factory;
    }
}
(Pojo shown for further clarity)
public class Order {
    private long accountId;
    private long assetId;
    private long quantity;
    private long price;
    private Instant createdOn = Instant.now();
    // no args constructor, constructor with params for all fields except createdOn, and getters/setters for all fields omitted
}
***often the cause was: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of 'java.time.Instant' (no Creators, like default construct, exist): cannot deserialize from object value (no delegate- or property-based Creator) at [Source: (byte[])"{"accountId":1,"assetId":2,"quantity":100,"price":1000,"createdOn":{"epochSecond":1558570217,"nano":728000000}}"...
When you set the serializers/deserializers using properties, Kafka instantiates them, not Spring. Kafka knows nothing about Spring or the customized ObjectMapper.
You need to override Boot's default producer/consumer factories and use the alternate constructors (or setters) to add the serializers/deserializers.
See the documentation.
Important
Only simple configuration can be performed with properties; for more advanced configuration (such as using a custom ObjectMapper in the serializer/deserializer), you should use the producer/consumer factory constructors that accept a pre-built serializer and deserializer. For example, with Spring Boot, to override the default factories:
@Bean
public ConsumerFactory<Foo, Bar> kafkaConsumerFactory(KafkaProperties properties,
        JsonDeserializer customDeserializer) {
    return new DefaultKafkaConsumerFactory<>(properties.buildConsumerProperties(),
            customDeserializer, customDeserializer);
}

@Bean
public ProducerFactory<Foo, Bar> kafkaProducerFactory(KafkaProperties properties,
        JsonSerializer customSerializer) {
    return new DefaultKafkaProducerFactory<>(properties.buildProducerProperties(),
            customSerializer, customSerializer);
}
Setters are also provided, as an alternative to using these constructors.
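For completeness, a sketch of the setter-based variant of the consumer factory above; the setter names (setKeyDeserializer/setValueDeserializer) are assumed from current spring-kafka versions and may not exist in older releases:
@Bean
public ConsumerFactory<Foo, Bar> kafkaConsumerFactory(KafkaProperties properties,
        JsonDeserializer customDeserializer) {
    DefaultKafkaConsumerFactory<Foo, Bar> factory =
            new DefaultKafkaConsumerFactory<>(properties.buildConsumerProperties());
    // Same effect as passing the deserializers to the constructor.
    factory.setKeyDeserializer(customDeserializer);
    factory.setValueDeserializer(customDeserializer);
    return factory;
}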
You can extend Spring Kafka's JsonDeserializer (and, analogously, its JsonSerializer for the producer side):
public class JsonDeserializerWithJTM<T> extends JsonDeserializer<T> {
    public JsonDeserializerWithJTM() {
        super();
        objectMapper.registerModule(new JavaTimeModule());
        // whatever you want to configure here
    }
}
Use this class in Kafka's configuration instead of the original one:
spring:
  kafka:
    consumer:
      value-deserializer: com.foo.JsonDeserializerWithJTM

A fast value lookup in vectors: MATLAB code comparison

I am using MATLAB to look up values in two vectors, OCT_EXP and OCT_LOG, from two input values u and v; the output val is computed as:
if (( u == 0 )||( v == 0 ))
    val = 0;
else
    val = OCT_EXP( OCT_LOG(u) + OCT_LOG(v) + 1);
end
I tried three ways: a normal (non-vectorized) way, a vectorized way, and a mex way. I expected the mex way to be the fastest, followed by the vectorized way. However, when I measure the time consumption, the first (non-vectorized) way is the fastest and the vectorized way is the slowest. What is happening in my code? Thanks, all.
I want to consider the speed of this function because it will be called many times: 300,000 times.
The first way:
function val = gfmult_no_vec( u, v )
OCT_EXP = [ 1, 2, 4, 8, 16, 32, 64, 128, 29, 58, 116, 232, 205, 135, 19, 38,...
76, 152, 45, 90, 180, 117, 234, 201, 143, 3, 6, 12, 24, 48, 96, 192, 157,...
39, 78, 156, 37, 74, 148, 53, 106, 212, 181, 119, 238, 193, 159, 35,...
70, 140, 5, 10, 20, 40, 80, 160, 93, 186, 105, 210, 185, 111, 222,...
161, 95, 190, 97, 194, 153, 47, 94, 188, 101, 202, 137, 15, 30, 60,...
120, 240, 253, 231, 211, 187, 107, 214, 177, 127, 254, 225, 223, 163,...
91, 182, 113, 226, 217, 175, 67, 134, 17, 34, 68, 136, 13, 26, 52,...
104, 208, 189, 103, 206, 129, 31, 62, 124, 248, 237, 199, 147, 59,...
118, 236, 197, 151, 51, 102, 204, 133, 23, 46, 92, 184, 109, 218,...
169, 79, 158, 33, 66, 132, 21, 42, 84, 168, 77, 154, 41, 82, 164, 85,...
170, 73, 146, 57, 114, 228, 213, 183, 115, 230, 209, 191, 99, 198,...
145, 63, 126, 252, 229, 215, 179, 123, 246, 241, 255, 227, 219, 171,...
75, 150, 49, 98, 196, 149, 55, 110, 220, 165, 87, 174, 65, 130, 25,...
50, 100, 200, 141, 7, 14, 28, 56, 112, 224, 221, 167, 83, 166, 81,...
162, 89, 178, 121, 242, 249, 239, 195, 155, 43, 86, 172, 69, 138, 9,...
18, 36, 72, 144, 61, 122, 244, 245, 247, 243, 251, 235, 203, 139, 11,...
22, 44, 88, 176, 125, 250, 233, 207, 131, 27, 54, 108, 216, 173, 71,...
142, 1, 2, 4, 8, 16, 32, 64, 128, 29, 58, 116, 232, 205, 135, 19, 38,...
76, 152, 45, 90, 180, 117, 234, 201, 143, 3, 6, 12, 24, 48, 96, 192,...
157, 39, 78, 156, 37, 74, 148, 53, 106, 212, 181, 119, 238, 193, 159,...
35, 70, 140, 5, 10, 20, 40, 80, 160, 93, 186, 105, 210, 185, 111,...
222, 161, 95, 190, 97, 194, 153, 47, 94, 188, 101, 202, 137, 15, 30,...
60, 120, 240, 253, 231, 211, 187, 107, 214, 177, 127, 254, 225, 223,...
163, 91, 182, 113, 226, 217, 175, 67, 134, 17, 34, 68, 136, 13, 26,...
52, 104, 208, 189, 103, 206, 129, 31, 62, 124, 248, 237, 199, 147,...
59, 118, 236, 197, 151, 51, 102, 204, 133, 23, 46, 92, 184, 109, 218,...
169, 79, 158, 33, 66, 132, 21, 42, 84, 168, 77, 154, 41, 82, 164, 85,...
170, 73, 146, 57, 114, 228, 213, 183, 115, 230, 209, 191, 99, 198,...
145, 63, 126, 252, 229, 215, 179, 123, 246, 241, 255, 227, 219, 171,...
75, 150, 49, 98, 196, 149, 55, 110, 220, 165, 87, 174, 65, 130, 25,...
50, 100, 200, 141, 7, 14, 28, 56, 112, 224, 221, 167, 83, 166, 81,...
162, 89, 178, 121, 242, 249, 239, 195, 155, 43, 86, 172, 69, 138, 9,...
18, 36, 72, 144, 61, 122, 244, 245, 247, 243, 251, 235, 203, 139, 11,...
22, 44, 88, 176, 125, 250, 233, 207, 131, 27, 54, 108, 216, 173, 71,...
142 ];
OCT_LOG = [ 0, 1, 25, 2, 50, 26, 198, 3, 223, 51, 238, 27, 104, 199, 75, 4,...
100, 224, 14, 52, 141, 239, 129, 28, 193, 105, 248, 200, 8, 76, 113, 5,...
138, 101, 47, 225, 36, 15, 33, 53, 147, 142, 218, 240, 18, 130, 69,...
29, 181, 194, 125, 106, 39, 249, 185, 201, 154, 9, 120, 77, 228, 114,...
166, 6, 191, 139, 98, 102, 221, 48, 253, 226, 152, 37, 179, 16, 145,...
34, 136, 54, 208, 148, 206, 143, 150, 219, 189, 241, 210, 19, 92,...
131, 56, 70, 64, 30, 66, 182, 163, 195, 72, 126, 110, 107, 58, 40,...
84, 250, 133, 186, 61, 202, 94, 155, 159, 10, 21, 121, 43, 78, 212,...
229, 172, 115, 243, 167, 87, 7, 112, 192, 247, 140, 128, 99, 13, 103,...
74, 222, 237, 49, 197, 254, 24, 227, 165, 153, 119, 38, 184, 180,...
124, 17, 68, 146, 217, 35, 32, 137, 46, 55, 63, 209, 91, 149, 188,...
207, 205, 144, 135, 151, 178, 220, 252, 190, 97, 242, 86, 211, 171,...
20, 42, 93, 158, 132, 60, 57, 83, 71, 109, 65, 162, 31, 45, 67, 216,...
183, 123, 164, 118, 196, 23, 73, 236, 127, 12, 111, 246, 108, 161,...
59, 82, 41, 157, 85, 170, 251, 96, 134, 177, 187, 204, 62, 90, 203,...
89, 95, 176, 156, 169, 160, 81, 11, 245, 22, 235, 122, 117, 44, 215,...
79, 174, 213, 233, 230, 231, 173, 232, 116, 214, 244, 234, 168, 80,...
88, 175 ];
if (( u == 0 )||( v == 0 ))
    val = 0;
else
    val = OCT_EXP( OCT_LOG(u) + OCT_LOG(v) + 1);
end
The second way: Vectorized way
function val = gfmult_vec( u, v )
OCT_EXP = [ 1, 2, 4, 8, 16, 32, 64, 128, 29, 58, 116, 232, 205, 135, 19, 38,...
76, 152, 45, 90, 180, 117, 234, 201, 143, 3, 6, 12, 24, 48, 96, 192, 157,...
39, 78, 156, 37, 74, 148, 53, 106, 212, 181, 119, 238, 193, 159, 35,...
70, 140, 5, 10, 20, 40, 80, 160, 93, 186, 105, 210, 185, 111, 222,...
161, 95, 190, 97, 194, 153, 47, 94, 188, 101, 202, 137, 15, 30, 60,...
120, 240, 253, 231, 211, 187, 107, 214, 177, 127, 254, 225, 223, 163,...
91, 182, 113, 226, 217, 175, 67, 134, 17, 34, 68, 136, 13, 26, 52,...
104, 208, 189, 103, 206, 129, 31, 62, 124, 248, 237, 199, 147, 59,...
118, 236, 197, 151, 51, 102, 204, 133, 23, 46, 92, 184, 109, 218,...
169, 79, 158, 33, 66, 132, 21, 42, 84, 168, 77, 154, 41, 82, 164, 85,...
170, 73, 146, 57, 114, 228, 213, 183, 115, 230, 209, 191, 99, 198,...
145, 63, 126, 252, 229, 215, 179, 123, 246, 241, 255, 227, 219, 171,...
75, 150, 49, 98, 196, 149, 55, 110, 220, 165, 87, 174, 65, 130, 25,...
50, 100, 200, 141, 7, 14, 28, 56, 112, 224, 221, 167, 83, 166, 81,...
162, 89, 178, 121, 242, 249, 239, 195, 155, 43, 86, 172, 69, 138, 9,...
18, 36, 72, 144, 61, 122, 244, 245, 247, 243, 251, 235, 203, 139, 11,...
22, 44, 88, 176, 125, 250, 233, 207, 131, 27, 54, 108, 216, 173, 71,...
142, 1, 2, 4, 8, 16, 32, 64, 128, 29, 58, 116, 232, 205, 135, 19, 38,...
76, 152, 45, 90, 180, 117, 234, 201, 143, 3, 6, 12, 24, 48, 96, 192,...
157, 39, 78, 156, 37, 74, 148, 53, 106, 212, 181, 119, 238, 193, 159,...
35, 70, 140, 5, 10, 20, 40, 80, 160, 93, 186, 105, 210, 185, 111,...
222, 161, 95, 190, 97, 194, 153, 47, 94, 188, 101, 202, 137, 15, 30,...
60, 120, 240, 253, 231, 211, 187, 107, 214, 177, 127, 254, 225, 223,...
163, 91, 182, 113, 226, 217, 175, 67, 134, 17, 34, 68, 136, 13, 26,...
52, 104, 208, 189, 103, 206, 129, 31, 62, 124, 248, 237, 199, 147,...
59, 118, 236, 197, 151, 51, 102, 204, 133, 23, 46, 92, 184, 109, 218,...
169, 79, 158, 33, 66, 132, 21, 42, 84, 168, 77, 154, 41, 82, 164, 85,...
170, 73, 146, 57, 114, 228, 213, 183, 115, 230, 209, 191, 99, 198,...
145, 63, 126, 252, 229, 215, 179, 123, 246, 241, 255, 227, 219, 171,...
75, 150, 49, 98, 196, 149, 55, 110, 220, 165, 87, 174, 65, 130, 25,...
50, 100, 200, 141, 7, 14, 28, 56, 112, 224, 221, 167, 83, 166, 81,...
162, 89, 178, 121, 242, 249, 239, 195, 155, 43, 86, 172, 69, 138, 9,...
18, 36, 72, 144, 61, 122, 244, 245, 247, 243, 251, 235, 203, 139, 11,...
22, 44, 88, 176, 125, 250, 233, 207, 131, 27, 54, 108, 216, 173, 71,...
142 ];
OCT_LOG = [ 0, 1, 25, 2, 50, 26, 198, 3, 223, 51, 238, 27, 104, 199, 75, 4,...
100, 224, 14, 52, 141, 239, 129, 28, 193, 105, 248, 200, 8, 76, 113, 5,...
138, 101, 47, 225, 36, 15, 33, 53, 147, 142, 218, 240, 18, 130, 69,...
29, 181, 194, 125, 106, 39, 249, 185, 201, 154, 9, 120, 77, 228, 114,...
166, 6, 191, 139, 98, 102, 221, 48, 253, 226, 152, 37, 179, 16, 145,...
34, 136, 54, 208, 148, 206, 143, 150, 219, 189, 241, 210, 19, 92,...
131, 56, 70, 64, 30, 66, 182, 163, 195, 72, 126, 110, 107, 58, 40,...
84, 250, 133, 186, 61, 202, 94, 155, 159, 10, 21, 121, 43, 78, 212,...
229, 172, 115, 243, 167, 87, 7, 112, 192, 247, 140, 128, 99, 13, 103,...
74, 222, 237, 49, 197, 254, 24, 227, 165, 153, 119, 38, 184, 180,...
124, 17, 68, 146, 217, 35, 32, 137, 46, 55, 63, 209, 91, 149, 188,...
207, 205, 144, 135, 151, 178, 220, 252, 190, 97, 242, 86, 211, 171,...
20, 42, 93, 158, 132, 60, 57, 83, 71, 109, 65, 162, 31, 45, 67, 216,...
183, 123, 164, 118, 196, 23, 73, 236, 127, 12, 111, 246, 108, 161,...
59, 82, 41, 157, 85, 170, 251, 96, 134, 177, 187, 204, 62, 90, 203,...
89, 95, 176, 156, 169, 160, 81, 11, 245, 22, 235, 122, 117, 44, 215,...
79, 174, 213, 233, 230, 231, 173, 232, 116, 214, 244, 234, 168, 80,...
88, 175 ];
uv0 = (~(( u == 0 )|( v == 0 )));
val = zeros(size(u));
val(uv0) = OCT_EXP( OCT_LOG(u(uv0)) + OCT_LOG(v(uv0)) + 1);
end
The last way: mex code
#include "mex.h"
double look_up(double u, double v)
{
double OCT_EXP [510] = { 1, 2, 4, 8, 16, 32, 64, 128, 29, 58, 116, 232, 205, 135, 19, 38,
76, 152, 45, 90, 180, 117, 234, 201, 143, 3, 6, 12, 24, 48, 96, 192, 157,
39, 78, 156, 37, 74, 148, 53, 106, 212, 181, 119, 238, 193, 159, 35,
70, 140, 5, 10, 20, 40, 80, 160, 93, 186, 105, 210, 185, 111, 222,
161, 95, 190, 97, 194, 153, 47, 94, 188, 101, 202, 137, 15, 30, 60,
120, 240, 253, 231, 211, 187, 107, 214, 177, 127, 254, 225, 223, 163,
91, 182, 113, 226, 217, 175, 67, 134, 17, 34, 68, 136, 13, 26, 52,
104, 208, 189, 103, 206, 129, 31, 62, 124, 248, 237, 199, 147, 59,
118, 236, 197, 151, 51, 102, 204, 133, 23, 46, 92, 184, 109, 218,
169, 79, 158, 33, 66, 132, 21, 42, 84, 168, 77, 154, 41, 82, 164, 85,
170, 73, 146, 57, 114, 228, 213, 183, 115, 230, 209, 191, 99, 198,
145, 63, 126, 252, 229, 215, 179, 123, 246, 241, 255, 227, 219, 171,
75, 150, 49, 98, 196, 149, 55, 110, 220, 165, 87, 174, 65, 130, 25,
50, 100, 200, 141, 7, 14, 28, 56, 112, 224, 221, 167, 83, 166, 81,
162, 89, 178, 121, 242, 249, 239, 195, 155, 43, 86, 172, 69, 138, 9,
18, 36, 72, 144, 61, 122, 244, 245, 247, 243, 251, 235, 203, 139, 11,
22, 44, 88, 176, 125, 250, 233, 207, 131, 27, 54, 108, 216, 173, 71,
142, 1, 2, 4, 8, 16, 32, 64, 128, 29, 58, 116, 232, 205, 135, 19, 38,
76, 152, 45, 90, 180, 117, 234, 201, 143, 3, 6, 12, 24, 48, 96, 192,
157, 39, 78, 156, 37, 74, 148, 53, 106, 212, 181, 119, 238, 193, 159,
35, 70, 140, 5, 10, 20, 40, 80, 160, 93, 186, 105, 210, 185, 111,
222, 161, 95, 190, 97, 194, 153, 47, 94, 188, 101, 202, 137, 15, 30,
60, 120, 240, 253, 231, 211, 187, 107, 214, 177, 127, 254, 225, 223,
163, 91, 182, 113, 226, 217, 175, 67, 134, 17, 34, 68, 136, 13, 26,
52, 104, 208, 189, 103, 206, 129, 31, 62, 124, 248, 237, 199, 147,
59, 118, 236, 197, 151, 51, 102, 204, 133, 23, 46, 92, 184, 109, 218,
169, 79, 158, 33, 66, 132, 21, 42, 84, 168, 77, 154, 41, 82, 164, 85,
170, 73, 146, 57, 114, 228, 213, 183, 115, 230, 209, 191, 99, 198,
145, 63, 126, 252, 229, 215, 179, 123, 246, 241, 255, 227, 219, 171,
75, 150, 49, 98, 196, 149, 55, 110, 220, 165, 87, 174, 65, 130, 25,
50, 100, 200, 141, 7, 14, 28, 56, 112, 224, 221, 167, 83, 166, 81,
162, 89, 178, 121, 242, 249, 239, 195, 155, 43, 86, 172, 69, 138, 9,
18, 36, 72, 144, 61, 122, 244, 245, 247, 243, 251, 235, 203, 139, 11,
22, 44, 88, 176, 125, 250, 233, 207, 131, 27, 54, 108, 216, 173, 71,
142 };
double OCT_LOG[255] = { 0, 1, 25, 2, 50, 26, 198, 3, 223, 51, 238, 27, 104, 199, 75, 4,
100, 224, 14, 52, 141, 239, 129, 28, 193, 105, 248, 200, 8, 76, 113, 5,
138, 101, 47, 225, 36, 15, 33, 53, 147, 142, 218, 240, 18, 130, 69,
29, 181, 194, 125, 106, 39, 249, 185, 201, 154, 9, 120, 77, 228, 114,
166, 6, 191, 139, 98, 102, 221, 48, 253, 226, 152, 37, 179, 16, 145,
34, 136, 54, 208, 148, 206, 143, 150, 219, 189, 241, 210, 19, 92,
131, 56, 70, 64, 30, 66, 182, 163, 195, 72, 126, 110, 107, 58, 40,
84, 250, 133, 186, 61, 202, 94, 155, 159, 10, 21, 121, 43, 78, 212,
229, 172, 115, 243, 167, 87, 7, 112, 192, 247, 140, 128, 99, 13, 103,
74, 222, 237, 49, 197, 254, 24, 227, 165, 153, 119, 38, 184, 180,
124, 17, 68, 146, 217, 35, 32, 137, 46, 55, 63, 209, 91, 149, 188,
207, 205, 144, 135, 151, 178, 220, 252, 190, 97, 242, 86, 211, 171,
20, 42, 93, 158, 132, 60, 57, 83, 71, 109, 65, 162, 31, 45, 67, 216,
183, 123, 164, 118, 196, 23, 73, 236, 127, 12, 111, 246, 108, 161,
59, 82, 41, 157, 85, 170, 251, 96, 134, 177, 187, 204, 62, 90, 203,
89, 95, 176, 156, 169, 160, 81, 11, 245, 22, 235, 122, 117, 44, 215,
79, 174, 213, 233, 230, 231, 173, 232, 116, 214, 244, 234, 168, 80,
88, 175 };
if (( u == 0 )||( v == 0 ))
    return 0;
else
{
    /* MATLAB indexing is 1-based; convert to C's 0-based arrays. */
    int index = (int)(OCT_LOG[(int)(u - 1)] + OCT_LOG[(int)(v - 1)] + 1);
    return OCT_EXP[index - 1];
}
}
void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
{
double *u, *v, *uv;
int mrows, ncols;
plhs[0] = mxCreateDoubleMatrix(1,1, mxREAL);
/* Assign pointers to each input and output. */
u = mxGetPr(prhs[0]);
v = mxGetPr(prhs[1]);
uv = mxGetPr(plhs[0]);
*uv = look_up(*u, *v);
}
The measurement code:
function main
    function test1()
        for i=1:200
            for j=1:200
                gfmult_no_vec(i,j);
            end
        end
    end
    function test2()
        for i=1:200
            for j=1:200
                gfmult_vec(i,j);
            end
        end
    end
    function test3()
        for i=1:200
            for j=1:200
                gfmult_mex(i,j);
            end
        end
    end
    f1=@()test1();
    t1=timeit(f1)
    f2=@()test2();
    t2=timeit(f2)
    f3=@()test3();
    t3=timeit(f3)
end
Reported times:
t1 = 0.1934,
t2 = 1.1739,
t3 = 0.3584
This solution takes 0.0006 seconds on my computer (and it is vectorized):
u = randi(5,200,1)-1; % some arbitrary data including zeros
v = randi(5,200,1)-1; % some arbitrary data including zeros
[U,V] = ndgrid(u(u~=0),v(v~=0)); % make all possible combinations of u and v
val = zeros(length(u),length(v)); % initialize the output size, in case the last value in u or v is zero.
f = @(u,v) OCT_EXP(OCT_LOG(u)+OCT_LOG(v)+1);
val((u~=0),(v~=0)) = f(U,V);
Now val(u,v) = OCT_EXP(OCT_LOG(u)+OCT_LOG(v)+1) if u and v are both nonzero, and otherwise val(u,v) = 0.
If you want gfmult to have scalar inputs, then your first method seems to be the fastest way. However, I would define OCT_EXP and OCT_LOG outside the function and pass them to it, instead of assigning these values over and over:
function val = gfmult(OCT_EXP,OCT_LOG,u,v)
if (u==0)||(v==0)
val = 0;
else
val = OCT_EXP(OCT_LOG(u)+OCT_LOG(v)+1);
end
end
On my computer it reduces the running time from 0.21444 s (with your version) to 0.158 s for 100K iterations, which is not such a big improvement (0.05644 s), but if you have millions of calls, it may be significant.
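Another option along the same lines, if you would rather not change the call sites: cache the tables inside the function with persistent, so the big literal assignments run only on the first call. A minimal sketch; the generator loop reproduces the same GF(256) tables as above (the values follow the reduction polynomial 285, e.g. 2*128 = 256 reduces to 29, matching the table):
function val = gfmult_persist( u, v )
persistent OCT_EXP OCT_LOG
if isempty(OCT_EXP)
    % Build the antilog/log tables once; later calls reuse them.
    OCT_EXP = zeros(1, 510);
    OCT_LOG = zeros(1, 255);
    x = 1;
    for k = 1:255
        OCT_EXP(k) = x;
        OCT_EXP(k + 255) = x;   % second copy avoids a mod-255 in the lookup
        OCT_LOG(x) = k - 1;
        x = bitshift(x, 1);
        if x > 255
            x = bitxor(x, 285); % GF(256) reduction polynomial
        end
    end
end
if (( u == 0 )||( v == 0 ))
    val = 0;
else
    val = OCT_EXP( OCT_LOG(u) + OCT_LOG(v) + 1);
end
end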

Elasticsearch + Oracle JDBC River

Maybe this is a simple question, but I'm new to Elasticsearch. I installed ES 1.4, Oracle 10g is up and running, and the JDBC plugin loaded in ES without issue. Marvel is also working. I tried to create a river with this statement in Marvel/Sense:
PUT _river/mydata/_meta
{
  "type": "jdbc",
  "jdbc": {
    "driver": "oracle.jdbc.OracleDriver",
    "url": “jdbc:oracle:thin:@host:1521:SID",
    "user": “oracleusr",
    "password": “ oraclepass",
    "index": “myindex",
    "type": “mytype",
    "sql": "select * from aTable"
  }
}
And I get this error all the time:
{
"error": "MapperParsingException[failed to parse]; nested: ElasticsearchParseException[Failed to derive xcontent from (offset=0, length=323): [80, 85, 84, 32, 95, 114, 105, 118, 101, 114, 47, 109, 52, 99, 47, 95, 109, 101, 116, 97, 32, 10, 123, 10, 32, 32, 34, 116, 121, 112, 101, 34, 58, 32, 34, 106, 100, 98, 99, 34, 44, 10, 32, 32, 34, 106, 100, 98, 99, 34, 58, 32, 123, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 100, 114, 105, 118, 101, 114, 34, 58, 32, 34, 111, 114, 97, 99, 108, 101, 46, 106, 100, 98, 99, 46, 79, 114, 97, 99, 108, 101, 68, 114, 105, 118, 101, 114, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 117, 114, 108, 34, 58, 32, -30, -128, -100, 106, 100, 98, 99, 58, 111, 114, 97, 99, 108, 101, 58, 116, 104, 105, 110, 58, 64, 49, 48, 46, 49, 57, 52, 46, 49, 55, 46, 49, 55, 51, 58, 49, 53, 50, 49, 58, 109, 52, 99, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 117, 115, 101, 114, 34, 58, 32, -30, -128, -100, 109, 52, 99, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 112, 97, 115, 115, 119, 111, 114, 100, 34, 58, 32, -30, -128, -100, 32, 109, 52, 99, 49, 50, 48, 57, 48, 53, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 105, 110, 100, 101, 120, 34, 58, 32, -30, -128, -100, 109, 52, 99, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 116, 121, 112, 101, 34, 58, 32, -30, -128, -100, 112, 97, 105, 115, 34, 44, 10, 32, 32, 32, 32, 32, 32, 32, 32, 34, 115, 113, 108, 34, 58, 32, 34, 115, 101, 108, 101, 99, 116, 32, 42, 32, 102, 114, 111, 109, 32, 77, 52, 67, 80, 65, 73, 83, 34, 10, 32, 32, 32, 32, 32, 32, 125, 10, 125, 10]]; ",
"status": 400
}
On the CLI I get:
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:562)
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:490)
at org.elasticsearch.index.shard.service.InternalIndexShard.prepareIndex(InternalIndexShard.java:413)
at org.elasticsearch.action.index.TransportIndexAction.shardOperationOnPrimary(TransportIndexAction.java:189)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:511)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:419)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.ElasticsearchParseException: Failed to derive xcontent from (offset=0, length=323)
I first thought it was a connection problem, but checking around it seems it's a problem with the mapping between ES and Oracle. Has anyone done ES + Oracle river integration? Any help would be really appreciated.
I found a solution that worked out really straightforward. Instead of putting the command inline in Sense, what I did was use a JSON config file like this:
{
  "type" : "jdbc",
  "jdbc" : {
    "strategy" : "oneshot",
    "driver" : "oracle.jdbc.OracleDriver",
    "url" : "jdbc:oracle:thin:@host:1521:DNS",
    "user" : "user",
    "password" : "password",
    "sql" : "select * from aTable",
    "poll" : "1h",
    "scale" : 0,
    "autocommit" : false,
    "fetchsize" : 100,
    "max_rows" : 0,
    "max_retries" : 3,
    "max_retries_wait" : "10s"
  },
  "index" : {
    "index" : "aIndex",
    "type" : "aType",
    "bulk_size" : 100
  }
}
and then call it with curl like this:
curl -XPUT 'http://127.0.0.1:9200/_river/jdbcriver/_meta' -d @config.json
I don't know if it's a parser issue or what, but this way worked like a charm.
Thanks
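A note on the likely root cause (my reading of the error bytes, not part of the original answer): the -30, -128, -100 sequences in the MapperParsingException are the UTF-8 encoding of a curly quote (“), so the Sense request body contained smart quotes and was not valid JSON. Replacing them with plain ASCII quotes should let the original PUT parse:
PUT _river/mydata/_meta
{
  "type": "jdbc",
  "jdbc": {
    "driver": "oracle.jdbc.OracleDriver",
    "url": "jdbc:oracle:thin:@host:1521:SID",
    "user": "oracleusr",
    "password": "oraclepass",
    "index": "myindex",
    "type": "mytype",
    "sql": "select * from aTable"
  }
}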

Getting error mapper parsing exception while indexing

I am new to Elasticsearch. I was trying to index an attachment but am getting an error.
I executed the following steps.
Installed the mapper-attachments plugin.
I converted a text file to Base64 with the openssl command:
openssl enc -base64 -in test3.txt -out t3.file
After that I created the mapping:
[root@n1 testcase]# curl -XPUT 'http://localhost:9200/indextryes/?pretty=1' -d '
{ "mappings" : { "doc" : { "properties" : { "file" : { "type" : "attachment" } } } } }'
{
  "ok" : true,
  "acknowledged" : true
}
When I try to index it, I get the following error message:
[root@n1 testcase]# curl -X POST "localhost:9200/indextryes/text" -d @t3.file
{"error":"MapperParsingException[failed to parse]; nested: ElasticSearchParseException[Failed to derive xcontent from (offset=0, length=64): [98, 71, 86, 48, 99, 121, 66, 107, 98, 121, 66, 104, 98, 109, 57, 48, 97, 71, 86, 121, 73, 72, 82, 108, 99, 51, 81, 103, 89, 87, 53, 107, 73, 72, 90, 108, 99, 109, 108, 109, 101, 83, 66, 108, 98, 71, 70, 122, 100, 71, 108, 106, 99, 50, 86, 104, 99, 109, 78, 111, 76, 103, 61, 61]]; ","status":400}
Thanks for the help...
