I'm using Web API with .NET Core, and when I return an XML result I get the following:
<Test xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Response>
    <Status>0</Status>
  </Response>
</Test>
How do I remove the namespaces?
Here's part of my Startup.cs file:
services
    .AddMvc(config =>
    {
        // Add XML Content Negotiation
        config.RespectBrowserAcceptHeader = true;
        config.InputFormatters.Add(new XmlSerializerInputFormatter());
        config.OutputFormatters.Add(new XmlSerializerOutputFormatter(new XmlWriterSettings
        {
            OmitXmlDeclaration = true,
        }));
    })
    .AddXmlDataContractSerializerFormatters()
    .AddXmlSerializerFormatters()
    .AddMvcOptions(opt => opt.FormatterMappings.SetMediaTypeMappingForFormat("xml", new MediaTypeHeaderValue("application/xml")));
Both the XmlSerializer and DataContractSerializer formatters look for Content-Type or Accept header values of application/xml and text/xml, so if you register both, it's possible that only one of them takes effect.
If you want to support XmlSerializer, register only that formatter. If you want to support both for some reason, make sure to change the content types they look for, for example: application/xml-dcs, text/xml-dcs, application/xml-xmlser, text/xml-xmlser.
In the case of the issue above, the DataContractSerializer output formatter is probably writing the response, which is why your XmlSerializer settings don't appear to take effect.
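If you only need the XmlSerializer behavior, a minimal sketch of the registration might look like the following. This mirrors the snippet from the question, so it assumes the same ASP.NET Core MVC version and the same XmlWriterSettings overload used there:
// Sketch: keep only the XmlSerializer-based formatters so the
// DataContractSerializer formatter cannot write the response.
services
    .AddMvc(config =>
    {
        config.RespectBrowserAcceptHeader = true;
        config.InputFormatters.Add(new XmlSerializerInputFormatter());
        config.OutputFormatters.Add(new XmlSerializerOutputFormatter(
            new XmlWriterSettings { OmitXmlDeclaration = true }));
    })
    // Note: no AddXmlDataContractSerializerFormatters() here.
    .AddXmlSerializerFormatters()
    .AddMvcOptions(opt => opt.FormatterMappings.SetMediaTypeMappingForFormat("xml", new MediaTypeHeaderValue("application/xml")));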
I have implemented content negotiation so that a specific serializer will be used based on the accept header:
XmlMediaTypeFormatter fmtXml = new XmlMediaTypeFormatter();
fmtXml.SupportedMediaTypes.Add(new System.Net.Http.Headers.MediaTypeHeaderValue("text/xml"));

JsonMediaTypeFormatter fmtJson = new JsonMediaTypeFormatter();
fmtJson.SupportedMediaTypes.Add(new System.Net.Http.Headers.MediaTypeHeaderValue("application/json"));

config.Formatters.Insert(0, fmtJson);
config.Formatters.Insert(0, fmtXml);
I need to allow a client to specify the desired format using a url parameter, which would take precedence over the accept header.
To do this, I've started subclassing the DefaultContentNegotiator (although I don't know that it's the best idea):
public class CustomContentNegotiator : DefaultContentNegotiator
{
    public override ContentNegotiationResult Negotiate(Type type, HttpRequestMessage request, IEnumerable<MediaTypeFormatter> formatters)
    {
        string sMimeType = HttpUtility.ParseQueryString(request.RequestUri.Query).Get("_format");
        if (!string.IsNullOrEmpty(sMimeType))
        {
            ...
        }
        else
        {
            return base.Negotiate(type, request, formatters);
        }
    }
}
Then I replace the default content negotiator with mine:
GlobalConfiguration.Configuration.Services.Replace(typeof(IContentNegotiator), new CustomContentNegotiator());
The idea with the custom content negotiator is that if a content format has been specified as a URL parameter, I would locate the formatter that matches; otherwise I would just fall back to the behavior of the DefaultContentNegotiator.
I'm just not sure how to match correctly on the supported media types, or if there is a better, simpler solution to this...
I determined that using a custom content negotiator was a red herring. Instead I was able to use a MediaTypeMapping which matches against a specific url parameter instead of the accept request header:
fmtJson.MediaTypeMappings.Add(new System.Net.Http.Formatting.QueryStringMapping("_format", "json", "application/json"));
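Putting it together, here is a sketch of how both formatters could get a query-string mapping (config is the same HttpConfiguration used earlier; the "xml"/"json" values and the "_format" parameter name are just examples):
// Sketch: let a "_format" query parameter select the formatter,
// taking precedence over the Accept header during negotiation.
var fmtXml = new XmlMediaTypeFormatter();
fmtXml.SupportedMediaTypes.Add(new System.Net.Http.Headers.MediaTypeHeaderValue("text/xml"));
fmtXml.MediaTypeMappings.Add(new System.Net.Http.Formatting.QueryStringMapping("_format", "xml", "text/xml"));

var fmtJson = new JsonMediaTypeFormatter();
fmtJson.SupportedMediaTypes.Add(new System.Net.Http.Headers.MediaTypeHeaderValue("application/json"));
fmtJson.MediaTypeMappings.Add(new System.Net.Http.Formatting.QueryStringMapping("_format", "json", "application/json"));

config.Formatters.Insert(0, fmtJson);
config.Formatters.Insert(0, fmtXml);

// e.g. GET /api/values?_format=xml now returns XML regardless of the Accept header.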
We are busy developing an interface between Acumatica and our application via its web services. We are developing it in Ruby using the Savon gem.
We've got some of the exports working for the information we need, like this one for Vendor data:
We post the following SOAP call (after logging in):
<?xml version="1.0"?>
<env:Envelope xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:tns="http://www.acumatica.com/typed/" xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
  <env:Body>
    <tns:Export>
      <tns:commands>
        <tns:Command>
          <tns:FieldName>AcctCD</tns:FieldName>
          <tns:ObjectName>BAccount</tns:ObjectName>
          <tns:Value>Account code</tns:Value>
        </tns:Command>
        <tns:Command>
          <tns:FieldName>AcctName</tns:FieldName>
          <tns:ObjectName>BAccount</tns:ObjectName>
          <tns:Value>Account name</tns:Value>
        </tns:Command>
      </tns:commands>
      <tns:filters/>
      <tns:startRow>0</tns:startRow>
      <tns:topCount>0</tns:topCount>
      <tns:includeHeaders>false</tns:includeHeaders>
      <tns:breakOnError>false</tns:breakOnError>
    </tns:Export>
  </env:Body>
</env:Envelope>
to the testing endpoint:
http://p3.tryacumatica.com/(W(10003))/Soap/AP303000.asmx?WSDL
We are also able to do the same kind of thing for Inventory and Sites. However, we're struggling to get it working for Purchase Orders.
We post the following:
<?xml version="1.0"?>
<env:Envelope xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:tns="http://www.acumatica.com/typed/" xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
  <env:Body>
    <tns:Export>
      <tns:commands>
        <tns:Command>
          <tns:FieldName>Type</tns:FieldName>
          <tns:ObjectName>POOrder</tns:ObjectName>
          <tns:Value>Type</tns:Value>
        </tns:Command>
        <tns:Command>
          <tns:FieldName>OrderNbr</tns:FieldName>
          <tns:ObjectName>POOrder</tns:ObjectName>
          <tns:Value>Order number</tns:Value>
        </tns:Command>
      </tns:commands>
      <tns:filters/>
      <tns:startRow>0</tns:startRow>
      <tns:topCount>0</tns:topCount>
      <tns:includeHeaders>false</tns:includeHeaders>
      <tns:breakOnError>false</tns:breakOnError>
    </tns:Export>
  </env:Body>
</env:Envelope>
to the testing endpoint:
http://p3.tryacumatica.com/(W(3))/Soap/PO301000.asmx?WSDL
We always just get an empty response. Any ideas?
Check to see if the PO module is turned on. Also check whether the user has rights to query the PO objects. Does the database receive a query? One option is to debug from the database side and see if the database picks up any queries submitted from the web service input.
I know these are simple ideas, but it's worth a look.
Well, my first question is why you need a FULL list of POs; it could be a huge amount of data.
PO has a composite key (Type, OrderNbr), so when using the Export command you basically have to specify a value like EveryOrderNbr.
See the link below:
link
Plus, here's an old C# example for SO (sales orders):
Content SO301000 = context.GetSchema();
context.Clear();
string[][] data = context.Export(new Command[]
{
    SO301000.OrderSummary.ServiceCommands.EveryOrderType,
    SO301000.OrderSummary.ServiceCommands.EveryOrderNbr,
    SO301000.OrderSummary.OrderType,
    SO301000.OrderSummary.OrderNbr,
    SO301000.OrderSummary.Description,
    SO301000.OrderSummary.Hold,
}, null,
new Filter[]
{
    new Filter { Field = SO301000.OrderSummary.OrderType, Condition = FilterCondition.Equals, Value = "IN", Operator = FilterOperator.And },
    new Filter { Field = SO301000.OrderSummary.OrderNbr, Condition = FilterCondition.Equals, Value = "23630843", Operator = FilterOperator.And }
},
0, false, true);
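For reference, adapting that pattern to the purchase order screen might look roughly like the sketch below. This is only a sketch: the view and field names (OrderSummary, OrderType, OrderNbr) and the EveryOrderType/EveryOrderNbr service commands are assumed to exist in the PO301000 schema the same way they do for SO301000, and may differ in your instance.
// Sketch: iterate the composite key (Type + OrderNbr) so Export
// actually returns PO rows instead of an empty response.
Content PO301000 = context.GetSchema();
context.Clear();
string[][] poData = context.Export(new Command[]
{
    PO301000.OrderSummary.ServiceCommands.EveryOrderType,
    PO301000.OrderSummary.ServiceCommands.EveryOrderNbr,
    PO301000.OrderSummary.OrderType,
    PO301000.OrderSummary.OrderNbr,
}, null,
null,                 // no filters; note this can return a lot of data
0, false, false);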
The JAX-RS web service I'm calling returns XML content with a text/html content type. On my side, I need to read the XML and convert it to a Java object.
The problem is that the response XML isn't formatted correctly and has newline characters in the wrong places; for example, there are several newline characters before <?xml version="1.0" encoding="UTF-8"?>. This causes problems when trying to unmarshal it.
Is there a way I can unmarshal the response XML string even though it has formatting problems?
Thanks in advance.
HttpGet httpGet = new HttpGet(uri);
HttpResponse response = client.execute(httpGet);
InputStream inputStream = response.getEntity().getContent();
JAXBContext context = JAXBContext.newInstance(MyClass.class);
MyClass myObj = (MyClass) context.createUnmarshaller().unmarshal(inputStream);
How about trimming the content before unmarshalling:
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang.CharEncoding;
MyClass myObj = MyClass.class.cast(
context.createUnmarshaller().unmarshal(
IOUtils.toInputStream(
IOUtils.toString(
InputStream.class.cast(response.getEntity().getContent()),
CharEncoding.UTF_8
).trim(),
CharEncoding.UTF_8
)
)
);
I finally got the provider to send the response with an application/xml content type. However, as a workaround, an option I found is to save the content to a file and then read the content back from the file.
During my Camel routes, I query a server (an HTTP GET) and as a result I receive a 200 OK with an XML body looking similar to this:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<userProfiles xmlns="http://www.mycompany.com/AEContext/xmldata">
  <userProfile name="guest">
    <userProfileAttributes>
      <userProfileAttribute name="parameter1" value="data1" nameVisibility="ALL"/>
      <userProfileAttribute name="parameter2" value="data2" nameVisibility="ALL"/>
      <userProfileAttribute name="parameter3" value="data3" nameVisibility="ALL"/>
    </userProfileAttributes>
  </userProfile>
</userProfiles>
Any idea how I would be able to get the value of "parameter2" in the XML (in my example 'data2') and store that value in an exchange property? I guess by using an XPath expression? Or ...
Thanks for your help.
An easy way to retrieve the value would be to use the XPath language. It will allow you to extract the data you want and set it somewhere (header, body, ...). Note that the response declares a default namespace, so a prefix for it has to be declared and used in the XPath expression. Here is how to set a parameter2 header with the value:
<setHeader headerName="parameter2">
  <xpath resultType="java.lang.String" xmlns:c="http://www.mycompany.com/AEContext/xmldata">
    /c:userProfiles/c:userProfile/c:userProfileAttributes/c:userProfileAttribute[2]/@value
  </xpath>
</setHeader>
Using Java DSL
An example using the Java DSL and setting the message body:
final Namespaces ns = new Namespaces("c", "http://www.mycompany.com/AEContext/xmldata");
// existing code
from(...)
    .setBody(
        ns.xpath(
            "/c:userProfiles/c:userProfile/c:userProfileAttributes/c:userProfileAttribute[2]/@value",
            String.class)
    )
    .to(...);
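Since the goal was an exchange property rather than a header or the body, the same XPath can be stored with setProperty. Here is a sketch using the Java DSL; the route endpoints direct:start and mock:result and the class name are placeholders, and the c prefix maps to the document's default namespace:
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.builder.xml.Namespaces;

public class UserProfileRoute extends RouteBuilder {
    @Override
    public void configure() {
        final Namespaces ns = new Namespaces("c", "http://www.mycompany.com/AEContext/xmldata");

        // Store the extracted attribute value in an exchange property
        // instead of a header or the message body.
        from("direct:start")
            .setProperty("parameter2",
                ns.xpath("/c:userProfiles/c:userProfile/c:userProfileAttributes"
                    + "/c:userProfileAttribute[@name='parameter2']/@value",
                    String.class))
            .to("mock:result");
    }
}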
I have a simple implementation of a custom protocol. It's said that the newURI method takes 3 arguments (spec, charset & baseURI) and that "if the protocol has no concept of relative URIs, the third parameter is ignored".
So I open a page like tada://domain/samplepage, which has XML starting with this:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE Product SYSTEM "product.dtd">
But I don't see any request for product.dtd reaching my protocol (newURI is not even called). Am I missing something in my implementation?
BTW: the page itself opens correctly, but there's no request for the DTD file.
const
  Cc = Components.classes,
  Ci = Components.interfaces,
  Cr = Components.results,
  Cu = Components.utils,
  nsIProtocolHandler = Ci.nsIProtocolHandler;

Cu.import("resource://gre/modules/XPCOMUtils.jsm");

function TadaProtocol() {
}

TadaProtocol.prototype = {
  scheme: "tada",
  protocolFlags: nsIProtocolHandler.URI_DANGEROUS_TO_LOAD,

  newURI: function(aSpec, aOriginCharset, aBaseURI) {
    let uri = Cc["@mozilla.org/network/simple-uri;1"].createInstance(Ci.nsIURI);
    uri.spec = (aBaseURI === null)
      ? aSpec
      : aBaseURI.resolve(aSpec);
    return uri;
  },

  newChannel: function(aURI) {
    let
      ioService = Cc["@mozilla.org/network/io-service;1"].getService(Ci.nsIIOService),
      uri = ioService.newURI("chrome://my-extension/content/about/product.xml", null, null);
    return ioService.newChannelFromURI(uri);
  },

  classDescription: "Sample Protocol Handler",
  contractID: "@mozilla.org/network/protocol;1?name=tada",
  classID: Components.ID('{1BC90DA3-5450-4FAF-B6FF-F110BB73A5EB}'),
  QueryInterface: XPCOMUtils.generateQI([Ci.nsIProtocolHandler])
}

let NSGetFactory = XPCOMUtils.generateNSGetFactory([TadaProtocol]);
The channel you return from newChannel has the chrome:// URI you passed to newChannelFromURI as its URI. That is therefore the URI the page ends up with as its URI and as its base URI, so the DTD load happens directly from "chrome://my-extension/content/about/product.dtd".
What you probably want to do is to set aURI as the originalURI on the channel you return from newChannel.
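A sketch of what that could look like in the newChannel implementation from the question (same chrome:// URL and service contract IDs as above):
newChannel: function(aURI) {
  let ioService = Cc["@mozilla.org/network/io-service;1"].getService(Ci.nsIIOService);
  let uri = ioService.newURI("chrome://my-extension/content/about/product.xml", null, null);
  let channel = ioService.newChannelFromURI(uri);
  // Report the original tada:// URI so relative URLs (such as the DTD)
  // resolve against it rather than against the chrome:// URL.
  channel.originalURI = aURI;
  return channel;
},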
As Boris mentioned in his answer, your protocol implementation doesn't set the nsIChannel.originalURI property, so URLs are resolved relative to the chrome: URL rather than your tada: URL. There is a second issue with your code, however: in Firefox, loading external DTDs only works with chrome: URLs; this check is hardcoded. There is a limited number of supported DTDs that are mapped to local files (various HTML doctypes), but that's it - Gecko doesn't support arbitrary URLs in <!DOCTYPE>. You can see the current logic in the source code. The relevant bug is bug 22942, which isn't going to be fixed.
Boris and Wladimir, thank you!
After some time I have a solution. The problem was that the DTD file could not be loaded through my custom-created protocol. The idea was to use the Proxy API to override the schemeIs() method on the URI object returned from newURI() in nsIProtocolHandler.
So now I have this snippet of code in the newURI method:
let standardUrl = Cc["@mozilla.org/network/standard-url;1"].createInstance(Ci.nsIStandardURL);
standardUrl.init(standardUrl.URLTYPE_STANDARD, -1, spec, charset, baseURI);
standardUrl.QueryInterface(Ci.nsIURL);
return Proxy.create(proxyHandlerMaker(standardUrl));
proxyHandlerMaker just implements the Proxy API and overrides the needed schemeIs() method. This solved the problem, and now all the requests reach newChannel, where we can handle them.
Important notes:
The request for the DTD reaches the newURI() method but does not reach newChannel(). This is the default behavior. It happens because the schemeIs("chrome") method is called on the object returned by newURI(); that method should return true for DTD requests if you want the request to reach newChannel().
The newChannel() method is invoked with an {nsIURI} object which is not the same object as the one returned by newURI().
If you want your protocol to handle both protocol:page and protocol://domain/page URLs, you should use both {nsIURI} and {nsIStandardURL} objects.
You can pass the created {nsIStandardURL} object (standardUrl in the snippet above) as a 2nd argument to the Proxy.create() function. This will make your baseURI (3rd argument in newURI) pass the "baseURI instanceof nsIStandardURL" check. The schemeIs() method of this proxied object will also return true for DTD file requests, but unfortunately those requests still won't reach the newChannel() method. This could have been a nice solution to the DTD problem, but I can't solve this last issue.