Mule 4.1.4: failing to upload compressed XML file content through an HTTP POST request

I am migrating Mule 3.9 file upload logic to Mule 4.1.4. For simplicity, in Mule 4.1.4 I am trying basic logic that uses the HTTP connector to POST compressed XML file content, but it keeps failing with BAD_REQUEST and I cannot see what is wrong with my input.
Please suggest what I am missing.
Existing, working Mule 3.9 code:
<flow name="Post_XML_To_ExtSystem" processingStrategy="synchronous">
<timer-interceptor/>
<object-to-byte-array-transformer doc:name="Object to Byte Array"/>
<gzip-compress-transformer doc:name="Gzip Compress"/>
<logger message="gZip compression completed for Part: #[flowVars.partId]" level="INFO" doc:name="gZip completed"/>
<flow-ref name="WriteToFile_Flow" doc:name="Write to File Optionally"/>
<set-variable variableName="fileContentgzip" value="#[payload]" doc:name="fileContentgzip"/>
<flow-ref name="SetAttachments_PostPayload_Flow" doc:name="SetAttachments_PostPayload_Flow - FlowRef"/>
<exception-strategy ref="Global_Errorflow_Choice_Exception_Strategy" doc:name="Reference Exception Strategy"/>
</flow>
<sub-flow name="SetAttachments_PostPayload_Flow">
<logger message="Post Payload Flow with vars: #[flowVars]" level="DEBUG" doc:name="Logger"/>
<set-attachment attachmentName="TenantID" value="#['${http.ext.system.tenant}']" contentType="text/plain" doc:name="Tenant ID"/>
<set-attachment attachmentName="Category" value="#[flowVars.Category]" contentType="text/plain" doc:name="Category"/>
<set-attachment attachmentName="Data" value="#[flowVars.fileContentgzip]" contentType="application/xml" doc:name="Data"/>
<scripting:component doc:name="filename attachment">
<scripting:script engine="Groovy"><![CDATA[import org.mule.message.ds.ByteArrayDataSource;
import javax.activation.DataHandler;
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;
String category = message.getInvocationProperty("Category")
String fileName=category + '.xml'
String attachmentName='Data'
byte[] compressed = flowVars.fileContentgzip
ByteArrayDataSource attachment = new ByteArrayDataSource(compressed, "application/xml",fileName);
message.addOutboundAttachment(attachmentName, new DataHandler(attachment))
return payload;]]></scripting:script>
</scripting:component>
<copy-attachments attachmentName="*" doc:name="All attachments together"/>
<set-payload value="#[null]" doc:name="Nullify Payload"/>
<logger message="before ingestion call: ${http.by.ingestion.basepath}, ${http.by.ingestion.host}, ${http.by.ingestion.port}" level="DEBUG" doc:name="Log Ingestion basepath, host, port"/>
<logger message="Begin Posting #[flowVars.Category] for Part: #[flowVars.partId]" level="INFO" doc:name="Begin Posting data"/>
<flow-ref name="Ingestion_with_retries_Flow" doc:name="Flow Ref Ingestion_with_retries" doc:description="retry injestion api call"/>
</sub-flow>
<flow name="Ingestion_with_retries_Flow" >
<until-successful objectStore-ref="objectStore" maxRetries="${max.retries}" deadLetterQueue-ref="Failed_Payload_To_ErrorDir_And_Notify"
failureExpression="#[(exception != null) and (exception.causedBy(java.net.ConnectException) || exception.causedBy(java.net.SocketTimeoutException) || exception.causedBy(java.net.SocketException) || exception.causedBy(java.io.IOException))]"
doc:name="Until Successful" millisBetweenRetries="${millis.between.retries}">
<processor-chain doc:name="Processor Chain">
<logger message="Posting data to Server" level="INFO" doc:name="Logger"/>
<http:request config-ref="HTTPS_Ingestion_Service_ExtSystem" path="/delivery" method="POST" doc:name="ExtSystem Data Delivery Post">
<http:request-builder>
<http:header headerName="Accept" value="${http.by.interface.version}"/>
<http:header headerName="Content-Encoding" value="gzip"/>
</http:request-builder>
<http:success-status-code-validator values="200"/>
</http:request>
<json:xml-to-json-transformer doc:name="XML to JSON"/>
<flow-ref name="Subflow_Extract_Ingestion_Response" doc:name="Extract Ingestion Response"/>
</processor-chain>
</until-successful>
</flow>
<sub-flow name="Subflow_Extract_Ingestion_Response">
<object-to-string-transformer returnClass="java.lang.String" mimeType="application/json" doc:name="Response_to_String"/>
<dw:transform-message doc:name="Extract DeliveryId">
<dw:set-payload resource="classpath:ingestion\ingestion-delivery.dwl"/>
</dw:transform-message>
<json:json-to-object-transformer returnClass="java.lang.Object" doc:name="JSON to Object"/>
<set-variable variableName="ExtSystemDeliveryID" value="#[payload.DeliveryID]" doc:name="ExtSystemDeliveryID"/>
<logger message="Delivery ID: #[payload.DeliveryID]" level="INFO" doc:name="Log Delivery Id"/>
<set-variable variableName="ExtSystemStatus" value="#[payload.Status]" doc:name="ExtSystemStatusStatus"/>
<flow-ref name="Update_DeliveryID_Category_in_Part_Flow" doc:name="Update Part with DeliveryID and Category"/>
<set-payload value="#[payload + '\n']" doc:name="Set Payload"/>
<file:outbound-endpoint path="${write.folderpath}#[flowVars.correlationId]" outputPattern="HttpResponse_IngestionIDs.txt" connector-ref="File" responseTimeout="10000" doc:name="Write Ingestion Response"/>
<logger message="Ingestion response stored at ${write.folderpath}#[flowVars.batchJobInstanceId]/#[flowVars.Category]_#[flowVars.partId].gz" level="INFO" doc:name="Log response path"/>
</sub-flow>
Mule 4.1.4 compressed XML file upload logic:
<flow name="storeStocksFlow" doc:id="2d611c4c-edec-4b75-aa94-25474d145040" >
<http:listener doc:name="POST/payloadtest" doc:id="a5ed0fce-aa12-4e00-a68c-fe99008f1559" allowedMethods="POST" config-ref="HTTP_Listener_config" path="/payloadtest" outputMimeType="application/json">
</http:listener>
<logger level="INFO" doc:name="Logger" doc:id="61545cb4-8d94-4af8-a08e-9bfd0667b77f" message="Input json request: #[payload]"/>
<set-variable value="#[payload]" doc:name="Set Variable" doc:id="3923534b-7482-4a8c-ad46-948fda597550" variableName="origJsonInPayload"/>
<set-variable value="#[uuid()]" doc:name="Set Variable correlationId" doc:id="49798fd3-3175-44f8-9443-368b9a018207" variableName="correlationId"/>
<logger level="INFO" doc:name="Logger before transformation" doc:id="c72a768f-58c4-4947-b29c-93aa955b18a5" message="Before transformation: #[payload]"/>
<logger level="INFO" doc:name="Logger after transformation" message="Logger after transformation: #[payload]" doc:id="287ee190-47f6-4af6-982e-6f93a66cc052"/>
<!-- Tried both compressed and plain XML formats; both give a BAD_REQUEST error
<compression:compress doc:name="Gzip Compress" doc:id="bf8e4d8e-dbce-43f8-982a-ff68b87839c0" >
<compression:compressor >
<compression:gzip-compressor />
</compression:compressor>
</compression:compress> -->
<logger message="gzip compression completed - payload:#[payload]" level="INFO" doc:name="gZip completed" />
<set-variable variableName="fileContentgzip" value="#[payload]" doc:name="fileContentgzip" />
<set-variable variableName="TenantID" value="#['${http.ext.system.tenant}']" mimeType="application/json" doc:name="Tenant ID"/>
<set-variable variableName="Category" value="Stocks" mimeType="application/json" doc:name="Category"/>
<set-variable variableName="Data" value="#[vars.fileContentgzip]" mimeType="application/xml" doc:name="Data"/>
<logger level="INFO" doc:name="Logger" doc:id="29a22776-354e-42b0-b486-36bedcf8d6f0" message="JOB entry created in JOB table."/>
<flow-ref name="BY_API_Call_SubFlow1" doc:name="ExtSystem Ingestion API Test"/>
</flow>
<sub-flow name="API_Call_SubFlow1">
<logger message="Posting data to Server" level="INFO" doc:name="Logger" />
<http:request config-ref="HTTPS_Ingestion_Service_ExtSystem" path="/delivery" method="POST" doc:name="Ext System Data Delivery Post" outputMimeType="application/xml">
<http:body><![CDATA[#[%dw 2.0
output application/xml
input payload multipart/form-data
---
{
parts : {
Data : {
headers : {
"Content-Disposition" : {
"name" : "Data",
"filename": "Stocks.xml"
},
"Accept" : 'application/xml',
"Content-Encoding": 'gzip',
"TenantID": "xxxx-yyyy-aaaa-bbbb-ccccccc",
"Category": "Stocks"
},
content : payload
}
}
}]]]></http:body>
<http:headers ><![CDATA[#[output application/java
---
{
"Content-Type" : "application/com.ext-system.xxx_and_yyy-v1.14.17+xml"
}]]]></http:headers>
<http:response-validator >
<http:success-status-code-validator values="200" />
</http:response-validator>
</http:request>
<logger level="INFO" doc:name="Ingestion API Response" doc:id="a9ece74e-4b86-486f-9f3c-16272d1d00d1" message="Ingestion API Response: #[payload]"/>
</sub-flow>
Error logs:
0-6ffbd441-5963-11e9-8d2b-0a0027000005] org.mule.runtime.core.internal.processor.LoggerMessageProcessor: Posting data to Server
ERROR 2019-04-08 00:02:04,461 [[MuleRuntime].cpuLight.16: [adapter].storePersonFlow.CPU_LITE #427b75e6] [event: ] org.mule.runtime.core.internal.exception.OnErrorContinueHandler:
********************************************************************************
Message : HTTP POST on resource 'https://api.ext-system.com:443/xxxx/delivery' failed: bad request (400).
Error type : HTTP:BAD_REQUEST
Element : API_Call_SubFlow1/processors/1 # adapter:exposing-a-restful-resource-using-the-http-connector.xml:142 (Ext System Data Delivery Post)
Element XML : <http:request config-ref="HTTPS_Ingestion_Service_ExtSystem" path="/delivery" method="POST" doc:name="Ext System Data Delivery Post" outputMimeType="application/xml">
<http:body>#[%dw 2.0
output application/xml
input payload multipart/form-data
---
{
parts : {
Data : {
headers : {
"Content-Disposition" : {
"name" : "Data",
"filename": "Stocks.xml"
},
"Accept" : 'application/com.ext-system.xxx_and_yyy-v1.14.17+xml',
"Content-Encoding": 'gzip',
"TenantID": "xxxx-yyyy-aaaa-bbbb-ccccccc",
"Category": "Stocks"
},
content : payload
}
}
}]</http:body>
<http:headers>#[output application/xml
---
{
"Content-Type" : "application/com.ext-system.xxx_and_yyy-v1.14.17+xml"
}]</http:headers>
<http:response-validator>
<http:success-status-code-validator values="200"></http:success-status-code-validator>
</http:response-validator>
</http:request>
(set debug level logging or '-Dmule.verbose.exceptions=true' for everything)
********************************************************************************
INFO 2019-04-08 00:02:04,466 [[MuleRuntime].cpuLight.16: [adapter].storePersonFlow.CPU_LITE #427b75e6] [event: 0-6ffbd441-5963-11e9-8d2b-0a0027000005] org.mule.runtime.core.internal.processor.LoggerMessageProcessor: In HTTP:BAD_REQUEST

The error is returned by the host because it didn't like something about the request. It is very difficult to understand what the problem could be just by reviewing the code snippets; a complete picture requires knowing the data and the details of the server-side validation that failed.
A way to resolve this, since you have a working case (the 3.9 version), is to enable HTTP wire logging (https://support.mulesoft.com/s/article/How-to-Enable-HTTP-Wire-Logging) in both versions, execute both, and compare the two HTTP requests. You can then see what differs between them and adjust the Mule 4 version to match the working request.
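For reference, a sketch of what that typically looks like in log4j2.xml (the logger names below are the ones documented for each runtime's HTTP service; verify them against your exact version):
<!-- Mule 4.x HTTP wire logging -->
<AsyncLogger name="org.mule.service.http.impl.service.HttpMessageLogger" level="DEBUG"/>
<!-- Mule 3.x HTTP wire logging -->
<AsyncLogger name="org.mule.module.http.internal.HttpMessageLogger" level="DEBUG"/>
With both sides logging at DEBUG, the request line, headers and multipart boundaries of each outgoing request appear in the logs, which makes the comparison straightforward.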

I found the root cause: headers were missing in the request. I could trace this after enabling DEBUG logging in log4j2.xml, as alejandro-dobniewski suggested. In my use case TenantID, Category (both String values) and Data are the keys that make up the multipart/form-data body; the Data part carries the gzipped file content. The DataWeave below is self-explanatory.
Solution:
The correct format of the headers and body (importantly, the parts structure) for the HTTP Request is:
<http:body ><![CDATA[#[%dw 2.0
output multipart/form-data
---
{
parts: {
TenantID : {
headers : {
"Content-Type": "text/plain"
},
content : "xxxxxxxxx"
},
Category : {
headers : {
"Content-Type": "text/plain"
},
content : "MyCategory"
},
Data: {
headers: {
"Content-Disposition": {
"name": "Data",
"filename": "MyCategory_gzip"
},
"Content-Type": payload.^mimeType,
},
content: payload
}
}
}]]]></http:body>
<http:headers ><![CDATA[#[output application/java
---
{
"Accept" : "application/com.xxxxx-v1.1+xml",
"Content-Encoding" : "gzip"
}]]]></http:headers>
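For completeness, the gzip step that produces the payload consumed by the Data part is the compression module element from the question (re-enabled rather than commented out), placed just before the HTTP request; a sketch:
<compression:compress doc:name="Gzip Compress">
<compression:compressor>
<compression:gzip-compressor/>
</compression:compressor>
</compression:compress>
With the payload compressed first, the Data part carries the gzipped bytes and the Content-Encoding: gzip header tells the server how to interpret them.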

Related

Blazorise RTE validations

I have a rich text editor from Blazorise and I am trying to validate the input a user is writing, specifically that it is not empty. I have created an invisible TextEdit component in order to get that message, but my validation on the TextEdit does not seem to be triggered. I have a FluentValidation check, and this is not inside a form.
Here is what I am doing:
<FieldLabel>Viber message:</FieldLabel>
<RichTextEdit #ref="rteViberBody"
ContentChanged="#MyCheckRTEMethod"
PlaceHolder="Type your message here..."
ReadOnly="#readOnly"
SubmitOnEnter="false"
Style="height:80px">
<Editor>
</Editor>
<Toolbar>
<RichTextEditToolbarGroup Float="Float.End">
<RichTextEditToolbarButton Action="RichTextEditAction.Bold" />
<RichTextEditToolbarButton Action="RichTextEditAction.Italic" />
<RichTextEditToolbarButton Action="RichTextEditAction.Strike" />
<RichTextEditToolbarButton Action="RichTextEditAction.Image" />
<RichTextEditToolbarButton Action="RichTextEditAction.Clean" />
</RichTextEditToolbarGroup>
</Toolbar>
</RichTextEdit>
</Field>
</Fields>
<Fields>
<Field>
<Validation AsyncValidator="@ValidateRTEViberInputAsync">
<TextEdit @bind-Text="viberMessageHiddenForValidation" @bind-Text:event="oninput" Visibility="Visibility.Invisible">
<Feedback>
<ValidationError>Please type a message.</ValidationError>
</Feedback>
</TextEdit>
</Validation>
</Field>
Does someone have a clue about what needs to be done or changed in order to make it work?
Thanks in advance
I've just found that this is proposed: https://github.com/Megabit/Blazorise/issues/1792
For now there's a workaround, though it's not very nice:
<RichTextEdit #ref="richTextEditRef"
ConfigureQuillJsMethod="blazoriseDemo.configureQuillJs"
ContentChanged="#OnContentChanged"
Border="#( string.IsNullOrWhiteSpace( contentAsText ) ? Border.Danger : null)">
...
</RichTextEdit>
@if ( string.IsNullOrWhiteSpace( contentAsText ) )
{
<Paragraph TextColor="TextColor.Danger">Text is empty</Paragraph>
}
else
{
<Paragraph TextColor="TextColor.Success">Text is valid</Paragraph>
}
@code {
private string contentAsText;
public async Task OnContentChanged()
{
contentAsText = await richTextEditRef.GetTextAsync();
}
}

WSO2 Payload Factory not working as expected

I am having trouble configuring the Payload Factory mediator to make it work as expected.
I am trying to process the response of a backend service, which is:
<Documents xmlns="http://ws.wso2.org/dataservice">
<Document>
<Data>
{ "_id" : { "$oid" : "5bbce6ec9e0aae7e5c3a150a"} , "Plan" : "XXXX"}
</Data>
</Document>
<Document>
<Data>
{ "_id" : { "$oid" : "5bbce7279e0aae7e5c3a150b"} , "Plan" : "YYYY"}
</Data>
</Document>
</Documents>
I need to extract the json in each Data tag and construct a JSON that looks something like:
{
Data:
{
_id: {...},
Plan: ...
}
}
Just for testing purposes, I was trying to use the Payload Factory Mediator to get all Data tags using XPath. This is the outSequence of my API (again, the response that comes from the backend is just like the one above):
<outSequence>
<payloadFactory media-type="xml">
<format>
<newTestData>$1</newTestData>
</format>
<args>
<arg evaluator="xml" expression="//Data" />
</args>
</payloadFactory>
<log level="full" />
</outSequence>
The problem is that the log shows the newTestData tag is empty after the Payload Factory processes the response message.
The XPath was tested in an online XPath tester and it is correct, so: what am I doing wrong?
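One thing worth checking: the backend response declares a default namespace (http://ws.wso2.org/dataservice), and in XPath 1.0 an unprefixed step such as //Data only matches elements in no namespace, so it selects nothing here even if the expression looks correct in an online tester that ignores namespaces. A sketch of a namespace-qualified argument (the ds prefix is mine):
<arg evaluator="xml" expression="//ds:Data" xmlns:ds="http://ws.wso2.org/dataservice"/>
With the prefix bound, the expression matches the Data elements instead of coming back empty.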

Issues with https and Service Fabric

While trying to implement https on our Service Fabric backend, following this guide, I found my local cluster throwing the following error:
There was an error during CodePackage activation. The service host terminated with exit code:1
and sometimes the following would pop up:
fabric:/Dev.Project/Api is not ready, 1
partitions remaining. => Something is taking too long, the application
is still not ready.
In the Diagnostic Events I find this
{
"Timestamp": "2018-08-20T12:01:21.6423999+02:00",
"ProviderName": "Microsoft-ServiceFabric",
"Id": 23083,
"Message": "ApplicationHostTerminated: ApplicationId=fabric:/Dev.Project, ServiceName=fabric:/Dev.Project/Api, ServicePackageName=ApiPkg, ServicePackageActivationId=1d4426dc-3d5e-41ea-aa44-6e9794ff7c69, IsExclusive=True, CodePackageName=Code, EntryPointType=Exe, ExeName=Setup.bat, ProcessId=29024, HostId=77c22be7-b993-46c6-92dd-7c7ed0c8af1c, ExitCode=1, UnexpectedTermination=True, StartTime=12:01:21.568939 (151,099.982 MSec). ",
"ProcessId": 25768,
"Level": "Informational",
"Keywords": "0x4000000000000001",
"EventName": "Hosting",
"ActivityID": null,
"RelatedActivityID": null,
"Payload": {
"eventInstanceId": "6dbdc967-85ed-429a-ae6a-939aeeea1e3f",
"applicationName": "fabric:/Dev.Project",
"ServiceName": "fabric:/Dev.Project/Api",
"ServicePackageName": "ApiPkg",
"ServicePackageActivationId": "1d4426dc-3d5e-41ea-aa44-6e9794ff7c69",
"IsExclusive": true,
"CodePackageName": "Code",
"EntryPointType": 1,
"ExeName": "Setup.bat",
"ProcessId": 29024,
"HostId": "77c22be7-b993-46c6-92dd-7c7ed0c8af1c",
"ExitCode": 1,
"UnexpectedTermination": true,
"StartTime": "2018-08-20T12:01:21.5689395+02:00"
}
}
The Id is the Id of my Api service. It probably has something to do with the Setup.bat not being executed correctly, or errors being thrown while running it, but I can't figure out what to do about it. As stated in the guide, I added the
<Principals>
<Users>
<User Name="SetupAdminUser">
<MemberOf>
<SystemGroup Name="Administrators" />
</MemberOf>
</User>
</Users>
</Principals>
part, and added the RunAsPolicy to the service as well
<ServiceManifestImport>
<ServiceManifestRef ServiceManifestName="ApiPkg" ServiceManifestVersion="1.0.0" />
<ConfigOverrides>
<ConfigOverride Name="Config">
...
</ConfigOverride>
</ConfigOverrides>
<EnvironmentOverrides CodePackageRef="Code">
...
</EnvironmentOverrides>
<Policies>
<RunAsPolicy CodePackageRef="Code" UserRef="SetupAdminUser" EntryPointType="Setup" />
</Policies>
</ServiceManifestImport>
I searched all over, but I can't find what the problem is. Any input is welcome!
Thanks in advance!
This works for me: I added LocalSystem instead of the admin user.
<Policies>
<RunAsPolicy CodePackageRef="Code" UserRef="SetupLocalSystem" EntryPointType="Setup" />
</Policies>
<Principals>
<Users>
<User Name="SetupLocalSystem" AccountType="LocalSystem" />
</Users>
</Principals>
Also take care of the DNS name in the SetCertAccess.ps1 file; it should be the same as the cluster name.
$subject="your cluster DNS name"

FetchXML Localization

I have a FetchXML query that returns the correct entities for my portal.
How do I get the translated values stored in my CRM?
<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="true">
<entity name="testentity">
<attribute name="xyz_testclassification" />
<attribute name="xyz_schemaname" />
</entity>
</fetch>
Working with XML, and assuming the attribute "xyz_testclassification" is an option set type, your FetchXML query could return a result set like this:
<resultset morerecords="0">
<result>
<xyz_testclassification name="Option One" formattedvalue="10003">10003</xyz_testclassification><xyz_schemaname>One</xyz_schemaname>
</result>
<result />
<result>
<xyz_testclassification name="Option Two" formattedvalue="10004">10004</xyz_testclassification><xyz_schemaname>Two</xyz_schemaname>
</result>
<result>
<xyz_testclassification name="Option Three" formattedvalue="10001">10001</xyz_testclassification><xyz_schemaname>Three</xyz_schemaname>
</result>
</resultset>
Here the XML attribute "name" contains the display name of the option value. The attribute "formattedvalue" is only useful for numeric attributes (int, decimal, double, money).
When you are using FetchXML in C#, method IOrganizationService.RetrieveMultiple will return Entity objects. The Entity class has a FormattedValues collection containing the display values.
All values are returned according to the language and formatting settings of the user on behalf of whom the system is queried.
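A minimal sketch of that in C# (my illustration; it assumes an IOrganizationService instance named service and reuses the attribute name from the question):
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Run the FetchXML from the question and read the localized display labels.
string fetchXml = "<fetch version=\"1.0\" ...>...</fetch>"; // the query shown above
EntityCollection results = service.RetrieveMultiple(new FetchExpression(fetchXml));
foreach (Entity entity in results.Entities)
{
    // FormattedValues holds the display labels, localized to the calling user's settings.
    if (entity.FormattedValues.Contains("xyz_testclassification"))
    {
        string label = entity.FormattedValues["xyz_testclassification"];
    }
}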
I agree with Henk van Boeijen. I'd like to add that if you are using the Web API endpoint, it is also possible by adding Prefer: odata.include-annotations="OData.Community.Display.V1.FormattedValue" to the headers of the request.
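For example, a request against the accounts entity (a sketch based on the response shown below) carries the header like this:
GET [Organization URI]/api/data/v8.2/accounts?$select=name,donotpostalmail,accountratingcode,numberofemployees,revenue HTTP/1.1
Accept: application/json
OData-MaxVersion: 4.0
OData-Version: 4.0
Prefer: odata.include-annotations="OData.Community.Display.V1.FormattedValue"
The server echoes the preference and annotates each selected property with its formatted value: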
HTTP/1.1 200 OK
Content-Type: application/json; odata.metadata=minimal
OData-Version: 4.0
Preference-Applied: odata.include-annotations="OData.Community.Display.V1.FormattedValue"
{
"#odata.context": "[Organization URI]/api/data/v8.2/$metadata#accounts(name,donotpostalmail,accountratingcode,numberofemployees,revenue)",
"value": [
{
"#odata.etag": "W/"502170"",
"name": "Fourth Coffee (sample)",
"donotpostalmail#OData.Community.Display.V1.FormattedValue": "Allow",
"donotpostalmail": false,
"accountratingcode#OData.Community.Display.V1.FormattedValue": "Default Value",
"accountratingcode": 1,
"numberofemployees#OData.Community.Display.V1.FormattedValue": "9,500",
"numberofemployees": 9500,
"revenue#OData.Community.Display.V1.FormattedValue": "$100,000.00",
"revenue": 100000,
"accountid": "89390c24-9c72-e511-80d4-00155d2a68d1",
"transactioncurrencyid_value": "50b6dd7b-f16d-e511-80d0-00155db07cb1" } ]
}
For more details: https://msdn.microsoft.com/en-us/library/gg334767.aspx

How could I properly configure Solr to index my Oracle database?

I've been trying to configure Solr to work with my Oracle 11.2 database as a data source, but nothing works. I have thoroughly explored the documentation and it seems to lack a good, working guide.
For a simple scenario, I want to index my single table [topic]
The structure of my table topic is shown below:
ID (autonumber)
Topic (varchar 50) I want to index this
Info (varchar 255) I want to index this
My Solr configuration (so far)
I have added a new collection for this Oracle database and named it "oracle_test". I configured the folder structure for this collection as guided by the official documentation, as follows:
~/solr/server/solr/
oracle_test
conf
data-config.xml
elevate.xml
schema.xml
solrconfig.xml
data-config.xml
I have configured a working datasource connection string to my Oracle database, specified the query for my topic table, and listed the fields I want Solr to look up.
<dataConfig>
<dataSource name="jdbc" driver="oracle.jdbc.driver.OracleDriver" url="jdbc:oracle:system#//127.0.0.1:1521/orcl/" user="system" password="*****"/>
<document>
<entity name="help" query="select \"topic\",\"info\" from \"topic\"" dataSource="jdbc">
<field column="topic" name="topic"/>
<field column="info" name="info"/>
</entity>
</document>
</dataConfig>
schema.xml
I put the field definitions here.
<schema name="oracle_help" version="1.1">
<fieldType name="string" class="solr.StrField"/>
<field name="topic" type="string" indexed="true" stored="true" multiValued="false"/>
<defaultSearchField>info</defaultSearchField>
<field name="topic" type="string" indexed="true" stored="true"/>
<field name="info" type="string" indexed="true" stored="true"/>
</schema>
solrconfig.xml
Since the configuration file is big and includes everything, I will show only the excerpts related to the Oracle configuration, as follows.
I specify which field (topic) I want it to index:
<initParams path="/update/**,/query,/select,/tvrh,/elevate,/spell,/browse">
<lst name="defaults">
<str name="df">topic</str>
</lst>
</initParams>
In the processor section, I have only one default field type defined, as strings:
<processor class="solr.AddSchemaFieldsUpdateProcessorFactory">
<str name="defaultFieldType">strings</str>
...
</processor>
Then I tried importing the data source via Solr Admin.
Using "DataImport" on the Solr Admin dashboard, once I executed the command I got this response back, and I'm not sure whether it correctly indexed my Oracle table:
{
"responseHeader": {
"status": 0,
"QTime": 1
},
"initArgs": [
"defaults",
[
"config",
"data-config.xml"
]
],
"command": "status",
"status": "idle",
"importResponse": "",
"statusMessages": {}
}
The weird thing is that the status is indicated as "idle".
I tried to execute a search query, but it returns an error.
Using the search query "test" as follows:
$> curl http://localhost:8983/solr/oracle_test/select?q=test&wt=json&indent=true
Solr returns "undefined field topic":
{
"responseHeader": {
"status": 400,
"QTime": 1,
"params": {
"q": "called",
"indent": "true",
"wt": "json",
"_": "1434341618019"
}
},
"error": {
"msg": "undefined field topic",
"code": 400
}
}
But, as shown earlier, I have obviously already defined the field "topic" in my schema.xml. There seems to be a lack of documentation or guides on the official Solr site, and I tried doing some research on the Internet but got nothing at all.
Can anybody who is familiar with Solr - Oracle integration please help me figure this out? Any suggestions?
I think your Solr server should generate an error on start-up, or when you try to access the index that uses the schema.xml you have defined. Please have a look at the logs of your Solr server.
It has formal errors; these prevent the index from starting and, in turn, the DIH you have defined from running:
- <types /> is missing around your field types
- <fields /> is missing around your fields
- <defaultSearchField /> is misplaced inside your fields
- you have defined the field named topic twice
The structure of a schema.xml is documented in Solr's wiki. A valid version of your schema.xml would look like the sample below.
<schema name="oracle_help" version="1.1">
<types>
<fieldType name="string" class="solr.StrField"/>
</types>
<defaultSearchField>info</defaultSearchField>
<fields>
<field name="topic" type="string" indexed="true" stored="true" multiValued="false"/>
<field name="info" type="string" indexed="true" stored="true"/>
</fields>
</schema>
The approach you are using seems fine. In solrconfig.xml, try using the line below
<str name="config">/path/to/my/DIHconfigfile.xml</str>
instead of <str name="df">topic</str>.
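For context, a sketch of where that line normally lives: the config parameter belongs to the DataImportHandler's request handler definition in solrconfig.xml, e.g.:
<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
<lst name="defaults">
<str name="config">data-config.xml</str>
</lst>
</requestHandler>
The df (default field) parameter from the question's initParams is unrelated to the data import configuration; it only sets the default search field for query handlers.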
