I have to send two fields, Id and Ownerid, from a flat file to an HTTP transformation that hits a web service (REST API). They need the data to be sent in the JSON format below:
[
  {
    "Id": "000xxxvvbnh",
    "Ownerid": "xxxvvv1b5dmk"
  }
]
How do I pass these two fields in JSON format as one request to the web service?
I also need to create multiple sessions doing the same operation in parallel, each hitting the web service. Is the web service itself the target, or do we need to create a target flat file to capture the success or failure response?
Use an HTTP transformation to send the JSON data to the API.
First, read the data using the Source Qualifier (SQ). Using an expression transformation, remove the [ and ] brackets.
Then create an HTTP transformation with one input port, inp_id_ownerid (string), and leave the content type at its default. Attach the expression output to this input.
The HTTP transformation creates a default output port, HTTPOUT, which you can use to capture the API's response data.
Specify the API URL correctly. Test the URL with Swagger first if needed.
The mapping should look like this:
SQ --> EXP --> HTTP --> EXP --> TGT
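For the expression transformation, a rough sketch of an output port that concatenates the two source fields into the JSON body (the port and field names are illustrative, and whether the enclosing [ ] are kept depends on what the API accepts):

-- output port o_json_payload (string), connected to inp_id_ownerid on the HTTP transformation
'{"Id":"' || ID || '","Ownerid":"' || OWNERID || '"}'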
Please suggest how to maintain JSON POST data (for API testing) for multiple test cases in Robot Framework, based on your experience. Should it be maintained in Excel or in separate JSON files?
If you have a "template" JSON and need to parameterize individual values, you can use the CSV Data Set Config to supply those values; your request would be something like:
{
  "parameter1": "${value1}",
  "parameter2": "${value2}"
}
where ${value1} and ${value2} will be populated from the CSV Data Set Config.
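For example, assuming the CSV Data Set Config has its Variable Names field set to value1,value2, the backing CSV file (values are illustrative) could simply be:

first-value-a,second-value-a
first-value-b,second-value-b

Each line supplies one pair of values, so every iteration sends the template with a different payload.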
If you need to send a completely different JSON payload with each request, it makes sense to put all the payload files into a folder and then:
Use the Directory Listing Config to read the file paths into JMeter variables
Use the __FileToString() function to read the file content into the HTTP Request body data (see the sketch below)
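A minimal sketch, assuming the Directory Listing Config stores each file path in a variable named filePath; the Body Data of the HTTP Request would then be just:

${__FileToString(${filePath},,)}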
It's better to store the POST data as separate JSON files. Here is how I would store them:
- testdata
  - json
    - sample_file1.json
    - sample_file2.json
There is an advanced option if you prefer not to store a pre-defined JSON file for each test case but rather to generate one: you can use JSON Schema to generate the JSON files. All you have to do is create a schema and generate JSON files based on it.
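As a sketch, a minimal schema for a payload like the template above (property names are illustrative) could be:

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "parameter1": { "type": "string" },
    "parameter2": { "type": "string" }
  },
  "required": ["parameter1", "parameter2"]
}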
I want to send a dataflow to a web service which will then respond based on the content of the dataflow. It might respond with a JSON that says {'Errors':'None'} or {'Errors':'5'}. I would like to use this information to continue processing the original dataflow if no errors are found, or log the information if any errors are found. My question is: how can I route based on the values in the JSON response?
You can use EvaluateJsonPath to get the value of Errors into an attribute, then RouteOnAttribute to send the different values down different paths.
Alternatively, you could use QueryRecord with queries like SELECT * FROM FLOWFILE WHERE Errors = 'None'. Each query gets its own downstream relationship, so it has the effect of routing in this case.
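A rough sketch of the first approach, assuming the JSON response is the flowfile content; the dynamic property names are illustrative:

EvaluateJsonPath (Destination: flowfile-attribute)
  errors = $.Errors

RouteOnAttribute
  no-errors  = ${errors:equals('None')}
  has-errors = ${errors:equals('None'):not()}

Flowfiles routed to no-errors continue down the original processing path, while has-errors can feed a logging branch.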
I'm using JMeter to test my web server. Instead of JMeter's default response time, I'd rather use a specific header value that I extract from the response header with a Regular Expression Extractor. How can JMeter generate a table or graph based on that value (treating it as the real response time)?
Use the sample_variables property to include the extracted value in the CSV produced by the Simple Data Writer. Then replace the response time column in that file with your custom variable's data, and load the file into the listeners as per your requirement.
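A minimal sketch, assuming the Regular Expression Extractor stores the header value in a variable named headerTime; adding this line to user.properties (or passing it with -J on the command line) makes the Simple Data Writer write that variable as an extra column in the CSV:

sample_variables=headerTime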
I have created a scenario where I log in and extract all the device IDs added in my application. After extracting all the device IDs with the JSON Path Extractor, I want to apply one configuration to all devices, using those device IDs.
In the JSON Path Extractor I am using the JSON path:
$..deviceResponseList[*].id[0]
and Variable name: device_id
In the next HTTP request I am calling the variable as:
${device_id}
Here, even if it extracts, say, 10 device IDs, the next HTTP request takes only the first ID.
How should I pass each extracted device_id, one after another, from the JSON Path Extractor into the next HTTP Request sampler?
NOTE: I am using a REST API for the implementation.
This is not possible OOTB with jmeter-plugins.
But you can have a look at:
http://www.ubik-ingenierie.com/blog/easy-scripting-of-json-applications-with-apache-jmeter/
Use our ULP_JSON PostProcessor to extract from the JSON data the elements we want using the variable “data” extracted by previous Post-Processor...
Disclaimer: We are the providers of this commercial plugin for Apache JMeter.
I would like to read the exact value of a variable that I pass through an HTTP Request. I first read in many variable values using the CSV Data Set Config. The username is in the form of an email address, so I have a variable called "email" in the Data Set Config. In the actual HTTP Request, for "Name", I call it "username". To the "Value" field for this same "username" I appended a __time() function like this, so I would end up creating unique users in my tests:
${email}${__time()}
When I view the "Request" in a View Results Tree, I can see my parameter is listed correctly:
username=email1%40email.com1390854377360
I do not care if this is correct in real world terms. I already know that is not a valid email. That is ok for now.
What I want to know is: how can I log the email that I just created on the fly? I would prefer not to pull in the whole request every time and then use some kind of Regular Expression Extractor. It seems like there should be an easy way to do this.
I think there are two ways:
Beanshell Pre/Post-Processors: you can write custom code that logs all your variables to a custom log file (see the sketch below).
Simple Data Writer: you can configure it and check the Save URL, Save Field Names, and Save Response Data checkboxes to capture the complete data, but post-processing of the result file is then required to get all the usernames (emails in your case).
The first approach is easier and lets you create your own logging format (easy to retrieve and reuse elsewhere).
The second approach is somewhat tedious and requires post-processing.
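For the first approach, a rough Beanshell PostProcessor sketch. It assumes the timestamp was captured into a variable named now by writing the request value as ${email}${__time(,now)}; the variable name and log file path are illustrative:

import java.io.FileWriter;

// rebuild the value that was sent in the request
String username = vars.get("email") + vars.get("now");

// append it to a custom log file
FileWriter fw = new FileWriter("generated_users.log", true);
fw.write(username + System.getProperty("line.separator"));
fw.close();

// also record it in jmeter.log for quick inspection
log.info("Generated username: " + username);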