JMeter is putting all the response data into InfluxDB, and in Chronograf I can now see the performance over time; that's working fine.
What I want is not to compare performance data over time, but to compare performance data between software versions, so that I can easily see that today's deployed version is slower than last week's. So it's more a kind of regression test.
What is the best way to do this? Can I add some extra field to the measurements in InfluxDB with the version number / branch name? Can I do something with tags? Is InfluxDB the right solution for my "problem"?
You can consider using the GraphiteBackendListenerClient, which has a rootMetricsPrefix field.
You can set a different rootMetricsPrefix value for each test run against a different software version, and this way distinguish metrics coming from different tests in a single dashboard.
References:
How to Use Grafana to Monitor JMeter Non-GUI Results
JMeter – Real Time Results – InfluxDB & Grafana – Part 2 – Adding Custom Fields
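As an alternative to prefixes, the version comparison can also be expressed as an InfluxDB tag, which dashboards can then group or filter on. Below is a minimal sketch of the idea in InfluxDB line protocol; the measurement, tag, and field names (`jmeter_response`, `version`, `avg_ms`) are illustrative assumptions, not JMeter's actual output schema:

```python
# Sketch: distinguishing software versions via an InfluxDB tag.
# All names below are illustrative, not JMeter's real output format.

def make_point(measurement, tags, fields, timestamp_ns):
    """Build one simplified InfluxDB line-protocol point."""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {timestamp_ns}"

# Two runs of the same transaction, distinguished by a "version" tag;
# a dashboard can then compare the series side by side.
p1 = make_point("jmeter_response", {"transaction": "login", "version": "1.4.0"},
                {"avg_ms": 210}, 1700000000000000000)
p2 = make_point("jmeter_response", {"transaction": "login", "version": "1.5.0"},
                {"avg_ms": 260}, 1700600000000000000)
print(p1)
print(p2)
```

Because tags are indexed in InfluxDB, a query such as `GROUP BY "version"` then gives the per-version comparison directly.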
Related
I am trying to performance-test an application developed with OJET. Which tool/protocol should I use for scripting? I tried the HTTP/Web protocol with JMeter and LoadRunner, but that doesn't capture all the requests and responses at the JavaScript/browser level, so I am facing issues correlating the dynamic values during test design, and the scripts fail during replay. I am currently trying the TruClient Web protocol as an alternative, but I need to know which tool/protocol I should use for scripting.
From the OJET documentation, it looks like this is a web app generator.
If you choose to start with JMeter, use a post-processor such as the Regular Expression Extractor to capture and save every value that is needed as an argument in the next request.
Don't be afraid of these dynamic values; try to follow the articles below to get the idea.
No tool will give you automatic correlation without issues, neither LoadRunner nor JMeter. It is always tricky.
Ask more specific questions when you start facing issues.
Jmeter catch correlations
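To illustrate what such correlation does, here is a minimal Python sketch: capture a dynamic value from one response and substitute it into the next request. In JMeter this is exactly the job of a Regular Expression Extractor; the response body, parameter name, and token value below are made up for illustration:

```python
import re

# Sketch of correlation: extract a dynamic value from one response body
# and reuse it in the next request. The token name and value here are
# illustrative; in JMeter a Regular Expression Extractor does this step.

first_response = '<input type="hidden" name="csrf_token" value="a1b2c3d4"/>'

match = re.search(r'name="csrf_token" value="([^"]+)"', first_response)
token = match.group(1) if match else None

# The captured value is substituted into the next request's payload,
# the same way ${token} would be referenced in a JMeter sampler.
next_request_body = f"username=demo&csrf_token={token}"
print(next_request_body)
```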
You need to simulate a real user using your application with 100% accuracy in terms of network footprint.
Neither JMeter nor LoadRunner is capable of executing client-side JavaScript, so the options are:
Implement these JavaScript-driven network calls using scripting (in JMeter it will be JSR223 Test Elements)
Use a real browser: LoadRunner's TruClient protocol is basically a headless web browser, and JMeter can be integrated with the Selenium browser automation framework via the WebDriver Sampler
With regards to "which protocol/tool" to use:
Implementing the JavaScript calls manually will take extra effort, but your test will consume fewer resources (CPU, RAM, etc.)
Using real browsers takes less effort, but the test will consume much more resources (something like 1 CPU core and 2 GB of RAM per user/browser instance) and you won't have metrics like Connect Time, Latency, etc.
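To make the first option concrete, here is a minimal Python sketch of re-implementing a JavaScript-driven network call at the protocol level. The endpoint, payload, and headers are hypothetical; in JMeter this logic would live in a JSR223 element (typically written in Groovy):

```python
import json

# Sketch: reproducing at the HTTP level the request a page's JavaScript
# would have sent, instead of running a browser. Endpoint, payload and
# headers are hypothetical placeholders.

def build_search_call(query, page):
    """Build the same call the client-side JavaScript would make."""
    url = "https://example.com/api/search"          # hypothetical endpoint
    body = json.dumps({"q": query, "page": page})   # same JSON the JS builds
    headers = {"Content-Type": "application/json",
               "X-Requested-With": "XMLHttpRequest"}
    return url, headers, body

url, headers, body = build_search_call("ojet", 1)
print(url)
print(body)
```

The effort is in reverse-engineering each such call from the browser's network tab; once done, the load test runs without a browser at all.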
LoadRunner TruClient. This will handle all of the JavaScript execution and the dynamic elements related to session, state, date/time, object identifiers, etc. You will still need to handle user input items appropriately.
After the performance tests of a REST web service, what is the best way to report the test results?
Tests must be run in non-GUI mode for high loads, so the GUI will be closed while the tests execute.
The reports must be readable by customers. Do the customers have to have the JMeter tool to analyze the results?
In non-GUI mode the listener output can be saved, but when we or the customers want to examine the results, JMeter must be opened. Is there a more useful way?
Grafana may be used with InfluxDB in non-GUI mode, but the same issue still applies.
It depends on what the "customers" want to see as the "results" and what you mean by "the best":
It's possible to generate the HTML Reporting Dashboard
It's possible to generate tables and charts using either the JMeterPluginsCMD Command Line Tool or the Graphs Generator Listener
It's possible to use a 3rd-party tool like Taurus, which provides interactive reports in the console and on the web.
Other, less popular options are listed in this answer
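As a concrete illustration of the first option: the HTML Reporting Dashboard is produced directly from a non-GUI run. The sketch below only assembles the command line (the `-n`, `-t`, `-l`, `-e`, `-o` options are standard JMeter flags; the file names are placeholders):

```python
# Sketch: the JMeter non-GUI invocation that generates the HTML
# Reporting Dashboard. -n = non-GUI, -t = test plan, -l = results file,
# -e = generate report after the run, -o = report output directory.
# File names are placeholders.

def dashboard_command(test_plan, results_file, report_dir):
    return ["jmeter", "-n",
            "-t", test_plan,
            "-l", results_file,
            "-e", "-o", report_dir]

cmd = dashboard_command("plan.jmx", "results.jtl", "report")
print(" ".join(cmd))
```

The resulting `report` directory contains a static HTML site that customers can open in any browser, with no JMeter installation required.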
I have been given the task of performance-testing Solr. I am completely new to Solr and have no idea how to approach the testing.
The Solr instance we are using consumes a lot of RAM and CPU; because of that, our application hangs and returns server error messages.
What would be the right way to test Solr, and is any tool required to create multiple concurrent threads?
According to the Solr Quick Start guide
Searching
Solr can be queried via REST clients, cURL, wget, Chrome POSTMAN, etc., as well as via the native clients available for many programming languages.
so you can use "usual" HTTP Request samplers to mimic multiple users concurrently using Solr.
References:
Building a Web Test Plan
Testing SOAP/REST Web Services Using JMeter
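To show what such a sampler would send, here is a sketch of a plain HTTP query against Solr's `/select` handler. The host, port, and core name are placeholders (`techproducts` is the example core from the Solr Quick Start):

```python
from urllib.parse import urlencode

# Sketch: the kind of plain HTTP GET an HTTP Request sampler would send
# to Solr's /select handler. Host, port and core name are placeholders.

def solr_query_url(base, core, q, rows=10):
    params = urlencode({"q": q, "rows": rows, "wt": "json"})
    return f"{base}/solr/{core}/select?{params}"

url = solr_query_url("http://localhost:8983", "techproducts", "memory", rows=5)
print(url)
```

In JMeter you would put the same path and parameters into an HTTP Request sampler and scale it across a thread group to simulate concurrent users.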
For search applications, the number of requests by itself usually isn't as important as the query profile. There's a lot of internal caching going on, and the only useful way to do decent performance testing is to use your actual query logs to replicate the query profile that represents your users. You'll also have to use the actual data you have in your Solr server, so you get (at least) close to the same cardinality for fields and values.
This means using the same filters, the same kinds of queries, and the same kind of simultaneous load. Since you probably want to go above the load you see in production, you can replay the logs of several days as a single day (be sure to include weekends vs. weekdays, and if you have a particularly bad day, such as Black Friday for e-commerce, keep those logs available so you're able to replicate that profile).
There are (many) tools to make the HTTP requests to Solr, but be sure to use a query profile and sets of queries that actually represent how you're using Solr; otherwise you're just hitting the query cache every single time, or you have data that doesn't represent the actual data in your dataset, which will give you completely irrelevant response times (i.e. random data performs a lot worse than actual English text where tokens are repeated across documents).
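A minimal sketch of the log-replay idea: take real logged queries (repeats included, so cache hit rates stay realistic) and fold several days into one day's plan. The log format below, one query string per line, is made up for illustration:

```python
import random

# Sketch: replaying production query logs to preserve the real query
# profile and realistic cache behaviour. The "one query string per line"
# log format here is an illustrative assumption.

log_lines = [
    "q=laptop&fq=category:electronics",
    "q=laptop&fq=category:electronics",   # repeats keep cache hit rates realistic
    "q=running+shoes&fq=category:sport",
]

def replay_plan(lines, multiplier=2, seed=42):
    """Compress several days of logs into one day's plan, shuffled."""
    plan = lines * multiplier
    random.Random(seed).shuffle(plan)
    return plan

plan = replay_plan(log_lines)
print(len(plan), "queries to replay")
```

Each entry in the plan would then be issued as an HTTP request against the Solr instance by whatever load tool you choose.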
You can use SolrMeter to do the performance testing; see the SolrMeter wiki.
The most important thing is not how to run the queries, but putting together the scenarios that you want to test and that mimic the real application usage you see.
Initially you need to decide what you want to find out with your test. Do you want to find bottlenecks? Do you want to find out whether your current setup matches business requirements? Do you need to find the breaking points of your current architecture and setup?
Solr utilizing a lot of CPU is very often related to indexing, and memory usage might be related to segment merging, so it sounds like you need to define your scenarios:
how much content should you push to Solr for it to perform indexing?
how many queries do you need to send?
what are the features of the queries (facets, highlighting, spellcheck etc)?
You could use JMeter to test the throughput of your search application, and you could also check I/O, load, CPU usage, and RAM usage on each Solr instance.
I have developed an HTTP web service which is queried by smartphones. I want to test the performance of the whole service, consisting of:
A Java server (Java 6, Play framework)
A database (MySQL 5.1.41)
A Linux (Ubuntu) server (kernel 2.6.32)
I have tried leading test campaigns using Python scripts with many threads, as well as sequential tests, but it's hard to draw conclusions...
I want to determine the maximum number of requests per second for my service, the average time for each request, and to have complete dashboards displaying a lot of information.
I could write many scripts to test that, but I am sure that well-known software exists to conduct these tests. Ideally, this software could also show where I lose time...
Do you have hints?
Thanks for your help
Some tools I've used for HTTP benchmarking
Apache Bench
Siege
JMeter
Of these, JMeter is probably best for the situation you describe. All of these display a lot of information, but won't explain where you lose time.
For that, I'd suggest a profiler such as JVisualVM (comes with the JDK) or YourKit. From a profile you can observe where you spend the most time and focus on optimizing that.
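For orientation, the two headline numbers all of these tools report are throughput (requests per second) and average response time. A minimal sketch of how they are derived under concurrent load; the "request" here is a stub that sleeps, standing in for a real HTTP call:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Sketch: computing throughput and average response time under
# concurrent load. fake_request() is a stand-in for a real HTTP call.

def fake_request():
    start = time.perf_counter()
    time.sleep(0.01)                      # stand-in for network + server time
    return time.perf_counter() - start

def run_load(n_requests=40, concurrency=8):
    wall_start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(lambda _: fake_request(), range(n_requests)))
    wall = time.perf_counter() - wall_start
    return n_requests / wall, sum(latencies) / len(latencies)

throughput, avg_latency = run_load()
print(f"{throughput:.0f} req/s, avg {avg_latency * 1000:.1f} ms")
```

JMeter and the other tools above compute exactly these aggregates (plus percentiles, error rates, etc.) for you, which is why they beat hand-rolled scripts for drawing conclusions.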
I am looking for a performance tool that has record-and-playback functionality. I have looked into JMeter; is there any other tool that has a record option too?
I understand that manual editing will still be needed even if we record and play back, but at least I don't want to enter every URL manually. I have used JMeter and ManageEngine QEngine too.
I also tried converting my Selenium scripts to JMeter performance tests, but I was not able to capture every URL's request/response.
Requirements:
http/https
record and playback - I can write scripts
option to substitute parameters from responses, etc. (which are dynamic)
You may try Silk Performer or HP LoadRunner, but both tools are expensive to buy. Other open-source tools include the Grinder.