I have to conduct performance testing of an application which is not a web server. This application picks data from a database and pushes it into Kafka. What could be the strategy? In particular, when I look at JMeter, for example, it talks about making a REST request and measuring the response to gauge performance. But in my case, this application does not serve any requests. I was wondering how to proceed.
This application picks the data from a database - what is the trigger for the application to pick the data from the database? If it is something which can be invoked externally, you need to identify which network protocols are being used and whether JMeter supports these protocols with its Samplers or via JMeter Plugins, or whether there are client libraries you can use from the JSR223 Test Elements.
If you can trigger this "read from the database and push into Kafka" event - you're good to go; if not - you need to identify the scope, to wit what you're testing and what you're trying to achieve:
If you need to load test the application itself - it makes sense to use profiling tools to check what the slowest functions, largest objects, most resource-consuming routines, etc. are
If you need to load test the database, to wit simulate the application reading data from the database at a high rate - this can be done using JMeter's JDBC Request sampler; check out the Building a Database Test Plan article for more details
If you need to load test your Kafka instance - it can be done using Pepper-Box - Kafka Load Generator; check out the Apache Kafka - How to Load Test with JMeter article for comprehensive information. A minimal producer sketch follows this list.
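If you prefer scripting the Kafka load yourself (for example from a JSR223 Sampler or a small standalone harness), a plain Kafka producer is enough to generate test messages. This is only a sketch: the broker address localhost:9092, the topic name test-topic and the message count are assumptions, so adjust them (and the serializers/payloads) to match your setup.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class KafkaLoadSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address - replace with your Kafka bootstrap servers
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Push a fixed number of sample messages to approximate the application's write pattern
            for (int i = 0; i < 10_000; i++) {
                producer.send(new ProducerRecord<>("test-topic", "key-" + i, "payload-" + i));
            }
            producer.flush();
        }
    }
}
```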
I had a few questions on technical details of JMeter, mostly pertaining to a distributed setup vs. independent JMeter engines (since the JMeter controller can become a bottleneck in case of several JMeter load generators). Would be great if anybody can help with the understanding here -
How is a JMeter distributed setup orchestrated by the JMeter controller (i.e. called master or client)? Can we use the same logic to synchronize a test among independent JMeter engines (independent mode)?
Is there a way to pool connections across vUsers?
Function of ASYNC_QUEUE in the backend listener and its expected side effects in independent mode (mentioned above) - what happens when the queue is full?
Does JMeter have a way to execute JavaScript / act as a headless browser?
How does DNS resolution happen for JMeter? Does it resolve for each vuser?
Your "question" looks like a compilation of interview questions rather than something connected with your single current concern and I don't think it's a proper place/way to ask it, I believe it should be: one post - one question.
Whatever
How is JMeter distributed setup orchestrated by JMeter controller - the JMeter master sends the .jmx script to the slaves and collects results from them. Theoretically you can implement your own mechanism for delivering the test plan and its eventual dependencies to the individual JMeter engines and running the test at the same time. Then you will need to collect the .jtl results files from the engines and combine them into a single one.
Is there a way to pool connections across vUsers? - JMeter does it internally
When the queue is full, no more new sample results will be taken for processing by the backend listener, so the results won't be "realtime" anymore; you will see new results as free slots appear in the queue
For JMeter per se - no. AJAX calls can be simulated using the Parallel Controller; for client-side performance testing, JavaScript execution profiling and rendering speed measurement you will need to use a real browser, whether normal or headless. There is the WebDriver Sampler plugin providing JMeter integration with Selenium (a headless-browser sketch follows these answers)
DNS resolution is dependent on the underlying OS and/or JVM DNS resolution implementation; there is the DNS Cache Manager which enables overriding hosts entries and using a custom DNS resolver so that each thread looks up the IP address on its own
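To illustrate the headless-browser point above, here is a minimal sketch of driving headless Chrome via Selenium and timing a page load. The URL and the crude timing are placeholders only; inside JMeter's WebDriver Sampler you would use the pre-configured WDS.browser object rather than creating the driver yourself.

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

public class HeadlessTimingSketch {
    public static void main(String[] args) {
        ChromeOptions options = new ChromeOptions();
        options.addArguments("--headless=new");   // run without a visible browser window

        WebDriver driver = new ChromeDriver(options);
        try {
            long start = System.currentTimeMillis();
            driver.get("https://example.com");    // placeholder URL
            long elapsed = System.currentTimeMillis() - start;
            System.out.println("Page loaded in " + elapsed + " ms, title: " + driver.getTitle());
        } finally {
            driver.quit();
        }
    }
}
```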
I have a Spring Boot app that listens to a queue, so it is not RESTful. Is there any easy way to get a breakdown of the amount of time spent in each service?
The options are:
Use a Profiler tool when running your application, this way you will be able to trace down the time even to a single function call
If you have an APM tool in place you can collect the same information plus metrics from the operating system, database, message broker, etc.
Just in case you need to generate messages and "feed" them to your application, it could be done using e.g. the Apache JMeter tool; see the Building a JMS Testing Plan article for more information if needed. A standalone producer sketch is shown below.
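Outside of JMeter, a few lines of plain JMS code can also feed test messages into the queue. This is just a sketch, assuming an ActiveMQ broker on tcp://localhost:61616 and a queue named test.queue; swap in whatever broker, connection factory and destination your application actually listens to.

```java
import javax.jms.Connection;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import org.apache.activemq.ActiveMQConnectionFactory;

public class JmsFeedSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical broker URL and queue name - adjust to your environment
        ActiveMQConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection connection = factory.createConnection();
        connection.start();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("test.queue");
            MessageProducer producer = session.createProducer(queue);
            // Send a batch of simple text messages for the listener to process
            for (int i = 0; i < 1000; i++) {
                producer.send(session.createTextMessage("test message " + i));
            }
        } finally {
            connection.close();
        }
    }
}
```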
I have a bunch of server logs with API requests I'd like to replicate for testing. Is there an easy way to "export" those logs {uri: $path, query: $queryparams} as input for BlazeMeter to test?
JMeter is open source. Just modify the source to replay the logs and to handle the user input and dynamic components. Every time you see a new source IP, fork a new thread. Use the delays between requests to define the think time. And you will have created "Web Replay," the equivalent of Oracle's DB Replay.
There is a chance that JMeter's Access Log Sampler will work for you if your server is one of: Tomcat, Resin, WebLogic, or SunOne.
If it's not, you might need to convert the log to the supported format (a conversion sketch is shown below) or implement your own versions of LogParser and Generator.
More information: The JMeter Access Log Sampler - A Guide
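As a rough illustration of such a conversion, the sketch below turns lines shaped like {uri: "/path", query: "a=1&b=2"} into common-log-format entries that the Access Log Sampler's default parser can read. The input file name, field layout and output format are assumptions about your logs; adapt the parsing to whatever your real entries look like.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class LogConvertSketch {
    // Assumed input shape: {uri: "/some/path", query: "a=1&b=2"}
    private static final Pattern ENTRY =
            Pattern.compile("\\{uri: \"([^\"]*)\", query: \"([^\"]*)\"\\}");

    public static void main(String[] args) throws IOException {
        List<String> converted = Files.lines(Paths.get("requests.log"))
                .map(ENTRY::matcher)
                .filter(Matcher::find)
                .map(m -> {
                    String path = m.group(1);
                    String query = m.group(2);
                    String request = query.isEmpty() ? path : path + "?" + query;
                    // Minimal common-log-format line; host, user and timestamp are dummies
                    return "127.0.0.1 - - [01/Jan/2024:00:00:00 +0000] \"GET " + request + " HTTP/1.1\" 200 0";
                })
                .collect(Collectors.toList());
        Files.write(Paths.get("access.log"), converted);
    }
}
```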
I am automating the functionality of an API using JMeter. I just passed input parameters using JSON and asserted against the expected result, like 'Registered successfully'. My doubt is whether I need to check the values saved in the DB. If yes, how can I do it in JMeter?
JMeter provides the JDBC Request sampler which allows executing arbitrary SQL queries. You need to:
Download the relevant JDBC driver for your database management system and put it somewhere in the JMeter classpath (normally the lib folder of your JMeter installation). A JMeter restart will be required to pick the library up
Add a JDBC Connection Configuration test element and specify the database URL, credentials and other parameters if needed
Using the JDBC Request sampler, execute an SQL query to validate that the database contains the expected value(s)
See The Real Secret to Building a Database Test Plan With JMeter article for comprehensive instructions and configuration examples.
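If you would rather run the same check from a small standalone harness (or adapt it inside a JSR223 element), plain JDBC is enough. The sketch below is illustrative only: the MySQL URL, the appdb database, the users table and its email column are assumptions, so substitute your real driver, URL, credentials and query.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class DbValidationSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details - replace with your own
        String url = "jdbc:mysql://localhost:3306/appdb";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT COUNT(*) FROM users WHERE email = ?")) {
            ps.setString(1, "registered.user@example.com");
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                if (rs.getInt(1) == 1) {
                    System.out.println("Registration persisted as expected");
                } else {
                    System.out.println("Expected row not found - registration may not have been saved");
                }
            }
        }
    }
}
```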
You could use the JDBC Sampler & configurations for DB validation. However, I would suggest you make use of other APIs (if any) to verify that the records are present in the system/DB. Some GET request might bring back the registered info.
I do a lot of API testing. Sometimes we run these automated tests in higher environments like staging / PROD as part of a sanity test after a prod push. If you think that you might do something similar in the future, then you would not have the prod DB config details - your test would be limited to running only in the lower environments & would not work in PROD. So, try to avoid DB validation.
Once you have successfully executed the API, add a JDBC sampler after that and write a query to count the number of rows in the DB. If this count is growing, it means the API is successfully inserting the data into the DB.
Once you have the count, write a Beanshell script to print the count and compare it with the older count, and based on the comparison raise an assertion. This way you can be sure that the data is being inserted. A minimal assertion sketch is shown below.
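A minimal sketch of such a check, written the way a Beanshell Assertion would look in JMeter (Java syntax): it assumes the JDBC sampler stored the current count in a JMeter variable named rowCount and that a previous count was saved into previousRowCount earlier in the thread; both variable names are made up for illustration.

```java
// Beanshell Assertion sketch - variable names (rowCount, previousRowCount) are hypothetical
int currentCount = Integer.parseInt(vars.get("rowCount"));
int previousCount = Integer.parseInt(vars.get("previousRowCount"));

log.info("Row count before: " + previousCount + ", after: " + currentCount);

if (currentCount <= previousCount) {
    // Mark the sample as failed if no new row appeared
    Failure = true;
    FailureMessage = "Expected the row count to grow, but it did not";
}

// Remember the current count for the next iteration
vars.put("previousRowCount", String.valueOf(currentCount));
```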
I would also recommend not using this approach, or any additional load, when you are running your actual tests, as the numbers you get and the system monitoring data you collect will include the additional query, which in turn will not reflect a real-life scenario or your actual test plan.
The best practices for API testing involve checking the result of the API call against an SQL query result.
Use the above details for setting up your DB connection.
I second Vins. Using database validation to assert the response from the API limits the test capabilities and cannot be scaled to higher environments where you have limited or no access.
Also, you cannot reuse the functional tests to run load tests: as the number of users increases, more data gets inserted into the database, which slows down test execution as more data is pumped in.
It might also be the case that some SELECT queries get stuck because the data set is large, or due to network bandwidth or too little memory allocated to JMeter.
You might also face out-of-memory errors in Java as it keeps triggering garbage collection to accommodate the large data set.
The recommended approach is to use front-end validations, if available and wherever applicable, or to make use of other APIs to query and validate the data.
I am looking to monitor live user responses through JMeter. Can the Backend Listener in JMeter be used to record live users (end users)? I am not talking about the virtual users that we set up in JMeter, but the real end users. How can this be achieved?
Editing to add more details:
Our requirement is to monitor the real users, in 2-3 geographical locations, all throughout business hours, say from 8 to 5.
For this purpose, do you think I need to have a dedicated machine with JMeter, Grafana and InfluxDB for monitoring alone? I have other testing going on using JMeter and I don't want to use the same machine to do both monitoring and testing. Do you think this is achievable with JMeter? Any suggestions?
You can use the following tools in combination to achieve live monitoring:
JMeter Backend Listener - to send results to InfluxDB
InfluxDB - to store the results sent by the Backend Listener
Grafana - to run continuous queries for metrics and plot graphs like average response times, etc.
Follow the steps mentioned here:
http://jmeter.apache.org/usermanual/realtime-results.html - First Option
https://www.linkedin.com/pulse/jmeter-live-performance-monitoring-dashboard-grafana-influxdb-sarker
http://www.testautomationguru.com/jmeter-real-time-results-influxdb-grafana/
http://techblogsearch.com/a/live-performance-result-monitoring-with-jmeter-grafana-influxdb.html
We used to perform general production app (here: Scada-LTS) monitoring with JavaMelody, but this will give you general statistics. For per-user monitoring it seems you should use log4j + ELK or another, simpler syslog-analyzing tool.
JMeter should rather be used in a test environment, for stress tests.