I have installed WSO2 EI 6.1.1. I have created an HTTP proxy, and I am creating a sequence that obtains certain values from the request and processes them through a Java class mediator. I have successfully obtained the parameters from the URL in a property mediator by defining the expression as $url:token.
I am also trying to obtain the HTTP method (GET, POST, PUT...) and the body of the request, but I cannot find the correct XPath variables that define them.
This is an example of a request I want to capture (I want the PUT method and the JSON data):
PUT path?token=aaaa HTTP/1.1
Content-Length: 28
Host: xx.xx.xx.xx
Content-Type: application/json
{
  "id": 14,
  "value": "+02"
}
It seems that the values are related to the $trp and $body objects, but I have not been able to find any reference on how to obtain them.
UPDATE:
Defining $body as the expression gives me the following content:
<soapenv:Body xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"/>
Try these.
Body:
<property name="payload" expression="$body"/>
HTTP Verb:
<property name="verb" expression="$axis2:HTTP_METHOD"/>
or
<property name="verb" expression="$ctx:HTTP_METHOD"/>
Related
I'm trying to train a Form Recognizer using the browser API console (https://eastus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api/operations/TrainCustomModel/console). I've uploaded training images to a container and created a SAS. The browser API console generates the following HTTP request:
POST https://eastus.api.cognitive.microsoft.com/formrecognizer/v1.0-preview/custom/train?source=https://pythonimages.blob.core.windows.net/?sv=2019-02-02&ss=bfqt&srt=sco&sp=rl&se=2020-01-22T00:23:33Z&st=2020-01-21T16:23:33Z&spr=https&sig=••••••••••••••••••••••••••••••••&prefix=images HTTP/1.1
Host: eastus.api.cognitive.microsoft.com
Content-Type: application/json
Ocp-Apim-Subscription-Key: ••••••••••••••••••••••••••••••••
{
  "source": "string",
  "sourceFilter": {
    "prefix": "string",
    "includeSubFolders": true
  }
}
However, the response I get back is:
Transfer-Encoding: chunked
x-envoy-upstream-service-time: 4
apim-request-id: 5ad37aa2-e251-4b61-98ae-023930b47d27
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
x-content-type-options: nosniff
Date: Tue, 21 Jan 2020 16:25:03 GMT
Content-Type: application/json; charset=utf-8
{
  "error": {
    "code": "1004",
    "message": "Dataset path must be relative to local input mount path '/input' if local data is referenced."
  }
}
I don't understand why it seems to be looking for data locally. I've experimented with the SAS, e.g. including the container name (images) in the blob http address rather than as a query parameter, but no success so far.
I've also tried the Python/REST path (described here: https://learn.microsoft.com/en-gb/azure/cognitive-services/form-recognizer/quickstarts/python-train-extract-v1), which results in a different error:
Response status code: 408
Response body: {'error': {'code': '1011', 'innerError': {'requestId': 'e7f9ef9f-97bc-4b6a-86f3-0b29c9591c87'}, 'message': 'The operation exceeded allowed time limit and was canceled. The common reasons are that the data source is too large or contains unsupported content. Please check that your request conforms to service limits and retry with redacted data source.'}}
For completeness, the code I use is as follows (key/signature starred out):
########### Python Form Recognizer Train #############
from requests import post as http_post

# Endpoint URL
base_url = r"https://markusformsrecognizer.cognitiveservices.azure.com/" + "/formrecognizer/v1.0-preview/custom"
source = r"https://pythonimages.blob.core.windows.net/images?sv=2019-02-02&ss=bfqt&srt=sco&sp=rl&se=2020-01-22T15:37:26Z&st=2020-01-22T07:37:26Z&spr=https&sig=*********************************"
headers = {
    # Request headers
    'Content-Type': 'application/json',
    'Ocp-Apim-Subscription-Key': '*********************************'
}
url = base_url + "/train"
body = {"source": source}
try:
    resp = http_post(url = url, json = body, headers = headers)
    print("Response status code: %d" % resp.status_code)
    print("Response body: %s" % resp.json())
except Exception as e:
    print(str(e))
For error code 1004: follow the steps below to get the source path containing the training documents, and pass it as the value of the source key.
{
  "source": "string",
  "sourceFilter": {
    "prefix": "string",
    "includeSubFolders": true
  }
}
Replace the source value with the Azure Blob storage container's shared access signature (SAS) URL. To retrieve the SAS URL, open Microsoft Azure Storage Explorer, right-click your container, and select Get shared access signature.
Make sure the Read and List permissions are checked, and click Create.
Then copy the value in the URL section. It should have the form:
https://<storage account>.blob.core.windows.net/<container name>?<SAS value>
Please use the new Form Recognizer v2.0 release; it is an async API and enables training on large data sets and analyzing large documents. https://aka.ms/form-recognizer/api
quick start - https://learn.microsoft.com/en-us/azure/cognitive-services/form-recognizer/quickstarts/python-train-extract
To get started with Form Recognizer, please log in to the Azure Portal using this link to create a Form Recognizer resource (for v2.0 (preview), please use the West US 2 or West Europe regions).
Try removing the string value from the prefix property:
{
  "source": "string",
  "sourceFilter": {
    "prefix": "",
    "includeSubFolders": true
  }
}
The Python Quick Start code for version 2.0 seems to be working; at least I don't get any errors anymore. I'm now feeling slightly silly that I didn't try this earlier. The API (web browser) console linked from the Quick Start page of the Form Recognizer seems to automatically assume I want to use version 1.0, and there's no way to change that (or perhaps I've just overlooked something). Hence I assumed I'd been allocated a v1.0 trial, and therefore that's what I used when I tried the Python Quick Start the first time around.
Instead of using just the SAS URI in the "source" request parameter of the API POST call, use the complete string of the container followed by the SAS URI token.
For example:
https://<storage account>.blob.core.windows.net/<container name>?<SAS token>
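Applied to the asker's script, a minimal sketch of that difference looks like this; the endpoint, container name, and token values are placeholders taken from the question, not verified values.
from requests import post as http_post

# container URL (account + container name) followed by the SAS token,
# rather than the account root with the container passed as a prefix
container_url = "https://pythonimages.blob.core.windows.net/images"
sas_token = "?sv=2019-02-02&ss=bfqt&srt=sco&sp=rl&se=...&sig=..."  # placeholder SAS token
body = {"source": container_url + sas_token}

train_url = "https://markusformsrecognizer.cognitiveservices.azure.com/formrecognizer/v1.0-preview/custom/train"
headers = {
    'Content-Type': 'application/json',
    'Ocp-Apim-Subscription-Key': '<subscription key>'  # placeholder
}
resp = http_post(url=train_url, json=body, headers=headers)
print(resp.status_code, resp.json())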
The application I am creating takes a gzipped file sent to a RESTful PUT, unzips the file and then does further processing like so:
public class Service {
    @PUT
    @Path("/{filename}")
    public Response doPut(@Context HttpServletRequest request,
                          @PathParam("filename") String filename,
                          InputStream inputStream) {
        try {
            GZIPInputStream gzipInputStream = new GZIPInputStream(inputStream);
            // Do Stuff with GZIPInputStream
        } catch (IOException e) {
            e.printStackTrace();
        }
        return null;
    }
}
I am able to successfully send a gzipped file in a unit test like so:
InputStream inputStream = new FileInputStream("src/main/resources/testFile.gz");
Service service = new Service();
service.doPut(mockHttpServletRequest, "testFile.gz", inputStream);
// Verify processing stuff happens
But when I build the application and attempt to curl the same file from the src/main/resources dir with the following, I get a ZipException:
curl -v -k -X PUT --user USER:Password -H "Content-Type: application/gzip" --data-binary @testFile.gz https://myapp.dev.com/testFile.gz
The exception is:
java.util.zip.ZipException: Not in GZIP format
at java.util.zip.GZIPInputStream.readHeader(GZIPInputStream.java:165)
at java.util.zip.GZIPInputStream.<init>(GZIPInputStream.java:79)
at java.util.zip.GZIPInputStream.<init>(GZIPInputStream.java:91)
at Service.doPut(Service.java:23)
// etc.
So does anyone have any idea why sending the file via curl causes the ZipException?
Update:
I ended up taking a look at the actual bytes being sent via the InputStream and figured out where the ZipException: Not in GZIP format error was coming from. The first two bytes of a GZIP file are required to be 1F and 8B respectively in order for GZIPInputStream to recognize the data as being in GZIP format. Instead, the 8B byte, along with every other byte in the stream that doesn't correspond to a valid UTF-8 character, was transformed into the bytes EF, BF, BD, which are the UTF-8 unknown-character replacement bytes. Thus the server is reading the GZIP data as UTF-8 characters rather than as binary and is corrupting the data.
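To make that concrete, here is a small self-contained Java sketch of what decoding raw GZIP bytes as UTF-8 does to them; it only illustrates the mechanism described above and is not part of the application code.
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class ReplacementDemo {
    public static void main(String[] args) {
        byte[] gzipMagic = {0x1F, (byte) 0x8B};  // the two magic bytes of a GZIP stream
        String decoded = new String(gzipMagic, StandardCharsets.UTF_8);
        byte[] reencoded = decoded.getBytes(StandardCharsets.UTF_8);
        // Prints [31, -17, -65, -67]: 1F survives, but 8B is not valid UTF-8 and becomes
        // EF BF BD (U+FFFD), so GZIPInputStream no longer sees a valid GZIP header.
        System.out.println(Arrays.toString(reencoded));
    }
}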
The issue I am having now is that I can't figure out where I need to change the configuration in order to get the server to treat the compressed data as binary rather than UTF-8. The application uses JAX-RS on a Jersey server with Spring Boot, deployed in a Kubernetes pod and run as a service, so something in the setup of one of those technologies needs to be tweaked to prevent the improper encoding from being applied to the data.
I have tried adding -H "Content-Encoding: gzip" to the curl command, registering the EncodingFilter.class and GZipEncoder.class in the Jersey ResourceConfig class, adding application/gzip to server.compression.mime-types in application.properties, adding the @Consumes("application/gzip") annotation to the doPut method above, and several other things I can't remember off the top of my head, but nothing seems to have any effect.
I am seeing the following in the verbose CURL logs:
> PUT /src/main/resources/testFile.gz
> HOST: my.host.com
> Authorization: Basic <authorization stuff>
> User-Agent: curl/7.54.1
> Accept: */*
> Content-Encoding: gzip
> Content-Type: application/gzip
> Content-Length: 31
>
} [31 bytes data]
* upload completely sent off: 31 out of 31 bytes
< HTTP/1.1 500
< X-Application-Context: application
< Content-Type: application/json;charset=UTF-8
< Transfer-Encoding: chunked
< Date: <date stuff>
...etc
Nothing I have done has affected the receiving side
Content-Type: application/json;charset=UTF-8
portion, which I suspect is the issue.
I ran into the same problem and finally solved it by adding -H 'Content-Type: application/json;charset=UTF-8'.
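Applied to the asker's command, that would look something like the line below (the credentials, file, and host are the placeholders from the question):
curl -v -k -X PUT --user USER:Password -H "Content-Type: application/json;charset=UTF-8" --data-binary @testFile.gz https://myapp.dev.com/testFile.gz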
Use Charles to find the difference
I can successfully send the gzipped file using Postman, so I used Charles to capture the two requests sent by curl and Postman respectively. After comparing the two, I found that Postman used application/json as the content type while curl used text/plain.
Spring docs: Content Type and Transformation
According to the Spring docs, if the content type is text/plain and the source payload is a byte[], Spring will convert the payload to a String using the charset specified in the content-type header. That's why the ZipException occurred: the original byte data had already been decoded and was no longer in gzip format.
Spring source code
@Override
protected Object convertFromInternal(Message<?> message, Class<?> targetClass, @Nullable Object conversionHint) {
    Charset charset = getContentTypeCharset(getMimeType(message.getHeaders()));
    Object payload = message.getPayload();
    return (payload instanceof String ? payload : new String((byte[]) payload, charset));
}
I currently do my API automation testing and performance testing with JMeter against a REST API server.
Now development has changed to a GraphQL API, and I have two questions about it:
What is the best way to perform API automation and performance testing?
Does JMeter support GraphQL APIs?
I use Apollo to build the GraphQL server, and use JMeter to query the GraphQL API as below.
1. Set up HTTP Request (a sketch of the request body follows this list)
2. Set up HTTP Headers
Depending on your application, you might also need to set the Authorization HTTP header for JWTs, such as:
Authorization: Bearer xxxxxxxxxxxxxxxxxxxxxxxxxxxx
3. Set up HTTP Cookie if needed for your app
4. Run the test
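For the HTTP Request sampler in step 1, the request is typically a POST whose JSON body wraps the GraphQL query string; a minimal sketch, with a placeholder query, would be:
{
  "query": "{ products { id name } }",
  "variables": {}
}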
Disclaimer: I work for LoadImpact; the company behind k6.
If you are willing to consider an alternative, I've recently written a blog post about this topic: Load testing GraphQL with k6.
This is what a k6 example looks like:
import http from "k6/http";

let accessToken = "YOUR_GITHUB_ACCESS_TOKEN";
let query = `
  query FindFirstIssue {
    repository(owner:"loadimpact", name:"k6") {
      issues(first:1) {
        edges {
          node {
            id
            number
            title
          }
        }
      }
    }
  }`;
let headers = {
  'Authorization': `Bearer ${accessToken}`,
  "Content-Type": "application/json"
};

export default function () {
  let res = http.post("https://api.github.com/graphql",
    JSON.stringify({ query: query }),
    { headers: headers }
  );
}
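Assuming the script above is saved as script.js, it can then be run with:
k6 run script.js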
Looking at the Serving over HTTP section of the GraphQL documentation:
When receiving an HTTP GET request, the GraphQL query should be specified in the "query" query string.
So you can just append your GraphQL query to your request URL.
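For example, a GET request along those lines (the host and field names are placeholders) would look like:
http://myserver/graphql?query={me{name}}
Remember to URL-encode the query if it contains spaces or other special characters.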
With regards to "best practices" - you should follow "normal" recommendations for web applications and HTTP APIs testing, for example check out REST API Testing - How to Do it Right article.
You can try using easygraphql-load-tester
How it works:
easygraphql-load-tester is a node library created to do load testing on GraphQL based on the schema; it creates a set of queries, which are the ones used to test your server.
Examples:
Artillery.io
K6
Result:
Using this package, I was able to identify a bad implementation of dataloaders on the server.
Results without dataloaders
All virtual users finished
Summary report @ 10:07:55(-0500) 2018-11-23
Scenarios launched: 5
Scenarios completed: 5
Requests completed: 295
RPS sent: 36.88
Request latency:
min: 1.6
max: 470.9
median: 32.9
p95: 233.2
p99: 410.8
Scenario counts:
GraphQL Query load test: 5 (100%)
Codes:
200: 295
Results with dataloaders
All virtual users finished
Summary report @ 10:09:09(-0500) 2018-11-23
Scenarios launched: 5
Scenarios completed: 5
Requests completed: 295
RPS sent: 65.85
Request latency:
min: 1.5
max: 71.9
median: 3.3
p95: 19.4
p99: 36.2
Scenario counts:
GraphQL Query load test: 5 (100%)
Codes:
200: 295
I am testing our GraphQL implementation; you will need:
Thread Group
HTTP Header Manager: you need to add Content-Type: application/json
https://i.stack.imgur.com/syXqK.png
HTTP Request: use GET and add your query in the Body Data
https://i.stack.imgur.com/MpxAb.png
Response Assertion: You want to count as correct requests only responses without errors
https://i.stack.imgur.com/eXWGs.png
A Listener:
https://i.stack.imgur.com/VOVLo.png
I have recently tried API testing GraphQL with both GET and POST requests in JMeter.
Make sure it is a POST request for both query and mutation.
Example: your GraphQL query
{
  storeConfig {
    default_title
    copyright
  }
}
For JMeter it would be like this:
{
  "query": "{ storeConfig { default_title copyright } }"
}
Set up the HTTP Request
Your domain name goes in place of localhost. Make sure you don't include the https:// scheme.
Example: https://mydomainname.com
In JMeter: mydomainname.com
Set up the HTTP Header Manager
For requesting a mutation in JMeter
Example mutation in GraphQL
mutation {
  generateCustomerToken(
    email: "rd@mailinator.com"
    password: "1234567"
  ) {
    token
  }
}
In JMeter the mutation will look like this:
{
  "query": "mutation { generateCustomerToken( email: \"rd@mailinator.com\" password: \"1234567\" ) { token } }"
}
Escape double quotes with \" as shown in the query above.
The easiest way will be to use the GraphQL queries directly in JMeter without the need to convert them to JSON.
All you need to do is to pass "Content-Type" as "application/graphql" in the header.
Image Link for: HTTP Request with GraphQL Query as input
Image Link for: Header details
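As a sketch of what that looks like on the wire (the path is a placeholder, and the query is the one from the earlier example), the sampler then sends the query text as-is instead of wrapping it in JSON:
POST /graphql HTTP/1.1
Content-Type: application/graphql

{ storeConfig { default_title copyright } }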
I've spent some time trying to fix the Elasticsearch bulk upload warning:
Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header
My request is below:
POST http://elasticserver/_bulk HTTP/1.1
Authorization: xxx
Content-Type: application/x-ndjson; charset=utf-8
Host: elasticserver
Content-Length: 8559
... new line delimited json content ...
And my valid response with 200 status is below:
HTTP/1.1 200 OK
Warning: 299 Elasticsearch-5.5.1-19c13d0 "Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header." "Mon, 14 Aug 2017 00:46:21 GMT"
content-type: application/json; charset=UTF-8
content-length: 4183
{"took":5538,"errors":false,...}
By experimenting I discovered that the issue is the charset in the content type definition, Content-Type: application/x-ndjson; charset=utf-8; if I change it to Content-Type: application/x-ndjson, I get no warning.
Is it an Elasticsearch issue, or am I forming the request incorrectly?
The official documentation explicitly states that
When sending requests to this endpoint the Content-Type header should be set to application/x-ndjson.
The RestController source code also shows that the header is compared against the exact media type string, with no charset parameter:
final String lowercaseMediaType = restRequest.header("Content-Type").toLowerCase(Locale.ROOT);
// we also support newline delimited JSON: http://specs.okfnlabs.org/ndjson/
if (lowercaseMediaType.equals("application/x-ndjson")) {
    restRequest.setXContentType(XContentType.JSON);
    return true;
}
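So the fix is simply to drop the charset parameter from the header. As a minimal curl sketch of a bulk request with the accepted content type (the file name is a placeholder):
curl -X POST http://elasticserver/_bulk -H "Content-Type: application/x-ndjson" --data-binary @requests.ndjson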
We're attempting to integrate with the QuickBooks Online v2 API using Ruby 1.9.3 (not RoR).
Using the API Explorer and the Employee endpoint documentation we were able to get a simple list of test employees by using the Google Signet OAuth Gem.
require 'signet'
require 'signet/oauth_1/client'

# initialize oauth1 client
@client = Signet::OAuth1::Client.new(
  :temporary_credential_uri => "https://oauth.intuit.com/oauth/v1/get_request_token",
  :authorization_uri => "https://appcenter.intuit.com/Connect/Begin",
  :token_credential_uri => "https://oauth.intuit.com/oauth/v1/get_access_token",
  :client_credential_key => 'qyprdPEfJqU7eOze0Fby9iYhrUS5DQ',
  :client_credential_secret => 'fuXsasJo4TrTEd3Yhv4TeMUizmtguh0JioIB5r2I',
  :callback => "http://localhost:3000/callback/general"
)
@client.token_credential_key = 'qyprdJUtDSk7owxVfZlq7JeWO1mtpHBkSMD5GhB02PwIC6N0'
@client.token_credential_secret = 'Rq2ekgQWWL9frZAKpcgWef291mR0J5HBE354u5F3'

# setup request
original_request = [
  'POST',
  'https://qbo.sbfinance.intuit.com/resource/employees/v2/791630875',
  # we also tried this url 'https://qbo.intuit.com/qbo28/resource/employees/v2/791630875',
  [
    ['Content-Type', 'application/x-www-form-urlencoded'],
  ],
  []
]

# execute request
response = @client.fetch_protected_resource(:request => original_request)
puts response.body
As you can see, the request is pretty straightforward.
However, once we create a request with a Filter in the body, we get an HTML page with the following error: HTTP Status 401 - message=Exception authenticating OAuth; errorCode=003200; statusCode=401
# setup request
original_request = [
  'POST',
  'https://qbo.intuit.com/qbo28/resource/employees/v2/791630875',
  # 'https://qbo.sbfinance.intuit.com/resource/employees/v2/791630875',
  [
    ['Content-Type', 'application/x-www-form-urlencoded'],
  ],
  ["Filter=Name :EQUALS: Doe"]
]
We're using the Google OAuth gem, and I've verified the signature generation to be correct using these tools: the LinkedIn OAuth Test Console and the Beginner's Guide to OAuth signing requests. They both verify that the signature Signet is generating is correct for the body I provide.
I've looked at a few SO Questions:
QuickBooks Online querying with filter returns 401 everytime
Unable to create(POST) objects (Account, customer...) on QB Windows using IDS and Sync Manager
But nothing has worked. Any help would be appreciated; we're willing to use a third-party gem such as quickeebooks, but we would rather not. I assume I'm just missing something simple here.
Please provide me with the following items so I can verify a working answer:
Your request parameters, including URI, header, body, and exact client and access tokens (developer app tokens only please; I'll need to verify that I can generate the exact same request, including the signature)
Your base string used for generating the HMAC-SHA1 signature. It will look something like:
POST&https%3A%2F%2Fqbo.intuit.com%2Fqbo28%2Fresource%2Femployees%2Fv2%2F791630875&Filter%3DName%2520%253AEQUALS%253A%2520David%2520Test%26oauth_consumer_key%3DqyprdPEfJqU7eOze0Fby9iYhrUS5DQ%26oauth_nonce%3D-1787433535548338293%26oauth_signature_method%3DHMAC-SHA1%26oauth_timestamp%3D1380089100%26oauth_token%3DqyprdJUtDSk7owxVfZlq7JeWO1mtpHBkSMD5GhB02PwIC6N0%26oauth_version%3D1.0
Your response, including header and body data
I have tried using a filter query with the employee endpoint. It works fine.
EDIT - Sharing the endpoint, filter, and result set related to the Employee API endpoint
https://qbo.intuit.com/qbo28/resource/employees/v2/791926875
Filter= Name :EQUALS: Manas Mukherjee
header - "Authorization: OAuth oauth_token="2eRrd7LhEtHrM1CrqWvy1kmSgeukEgFxW99E1xwhSsLCp1JB", oauth_consumer_key="qyprdXsaKh0a132eNs7NTJLufjfrzm", oauth_version="1.0", oauth_signature_method="HMAC-SHA1", oauth_timestamp="1380084612", oauth_nonce="1556081845430558974", oauth_signature="IMjh%2FTx%2F7GMFDE6WQqZK8b6apjI%3D"[\r][\n]"
Content-Type: application/x-www-form-urlencoded
Data Set
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<qbo:SearchResults xmlns="http://www.intuit.com/sb/cdm/v2" xmlns:qbp="http://www.intuit.com/sb/cdm/qbopayroll/v1" xmlns:qbo="http://www.intuit.com/sb/cdm/qbo">
<qbo:CdmCollections xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:type="Employees">
<Employee>
<Id idDomain="QBO">20</Id>
<SyncToken>0</SyncToken>
<MetaData>
<CreateTime>2013-09-24T21:37:22-07:00</CreateTime>
<LastUpdatedTime>2013-09-24T21:37:22-07:00</LastUpdatedTime>
</MetaData>
<Name>Manas Mukherjee</Name>
<Address>
<Line1>ABC Str</Line1>
<City>London</City>
<PostalCode>4353543</PostalCode>
<GeoCode>LAT=51.5148382,LNG=-0.1264144</GeoCode>
</Address>
<GivenName>Manas</GivenName>
<MiddleName>Kr</MiddleName>
<FamilyName>Mukherjee</FamilyName>
<ShowAs>Manas Kr Mukherjee</ShowAs>
<BillableTime>false</BillableTime>
</Employee>
</qbo:CdmCollections>
<qbo:Count>1</qbo:Count>
<qbo:CurrentPage>1</qbo:CurrentPage>
</qbo:SearchResults>
OAuth header using your tokens
"Authorization: OAuth oauth_token="qyprdJUtDSk7owxVfZlq7JeWO1mtpHBkSMD5GhB02PwIC6N0", oauth_consumer_key="qyprdPEfJqU7eOze0Fby9iYhrUS5DQ", oauth_version="1.0", oauth_signature_method="HMAC-SHA1", oauth_timestamp="1380089100", oauth_nonce="-1787433535548338293", oauth_signature="Vj67xMVhSKGjVSmGyOxt7SVv0i8%3D"[\r][\n]"
Endpoint - https://qbo.intuit.com/qbo28/resource/employees/v2/791630875
Post data to end point: Filter= Name :EQUALS: David Test
Content-Type: application/x-www-form-urlencoded
It works fine
Thanks
See this sample Fiddler request with a Filter for items in QBO; I cannot paste the full Fiddler log here. You can do the same for Employee. The filters should go into the body, and the OAuth values are percent-encoded in the Authorization header:
Request-
POST https://qbo.intuit.com/qbo1/resource/items/v2/723488155 HTTP/1.1
Content-Type: application/x-www-form-urlencoded
Authorization: OAuth oauth_token="lvprdgF9q4mSQx5A6lKNm3NISXvwIpF16z",oauth_nonce="3740352e-20a4-4d45-af4f-2b783ee20e60",oauth_consumer_key="qyprd7I5WvVnPoiBh1ejZn",oauth_signature_method="HMAC-SHA1",oauth_timestamp="1377106651",oauth_version="1.0",oauth_signature="1OAJXk5uH0sEpYpdhh%2BDMzjQFEs%3D"
Host: qbo.intuit.com
Content-Length: 28
Expect: 100-continue
PageNum=1&ResultsPerPage=100
Response Header-
HTTP/1.1 200 OK
Date: Wed, 21 Aug 2013 17:37:31 GMT
Server: Apache
Set-Cookie: qboeuid=10.129.32.5.1377106651774076; path=/; expires=Thu, 21-Aug-14 17:37:31 GMT; domain=.intuit.com
Set-Cookie: JSESSIONID=82DE11473B5246497B9FDCD8A6DA4C45.c1-pprdqboas30j; Path=/; Secure; HttpOnly
Vary: Accept-Encoding
Content-Type: application/xml;charset=UTF-8
Content-Length: 32525