Let's assume I am doing a 1000-user load test with JMeter and BlazeMeter on a form page (a survey form). Is there any way those 1000 users can be reflected on the submission page?
I was getting an entry for only one user.
This is the form I am filling in and recording with the help of BlazeMeter.
After load testing 1000 users, I am only seeing one user reflected in the backend.
This is how it looks in JMeter.
What should I do so that 1000 entries with different IDs show up in my backend?
If you recorded your request using JMeter's HTTP(S) Test Script Recorder - you will have:
1 selected option
1 login
If you want to properly simulate 1000 different users you should:
Ensure that your JMeter test is configured to behave like a real browser
Ensure that each JMeter virtual user has its own username and password (or whatever else identifies the user); the most commonly used test element for parameterization is the CSV Data Set Config (a sketch follows after this list)
It would also be a good idea to use different survey options for each user; you can use the aforementioned CSV Data Set Config for pre-defined test data or, if you prefer random values, check out the HTML Link Parser and Poll Example
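As an illustration of the CSV Data Set Config point above, here is a minimal sketch of what the test data file could look like, assuming a hypothetical users.csv stored next to the .jmx test plan (the file name, column names and values are made up for this example):

```
username,password,survey_option
user0001,Secret!0001,optionA
user0002,Secret!0002,optionB
user0003,Secret!0003,optionC
```

In the CSV Data Set Config you would point "Filename" to that file, either leave "Variable Names" empty so JMeter takes them from the header row or fill them in and set "Ignore first line" to True, and then replace the recorded hard-coded values in the HTTP Request samplers with ${username}, ${password} and ${survey_option}. With "Recycle on EOF" = False and "Stop thread on EOF" = True, each thread consumes its own row, so every virtual user submits the form with a different identity (provided the file has at least 1000 rows).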
I'm trying to build an auto-complete/search over Slack channels in a regular dropdown, where I need the list of all the Slack channels the user has access to in a workspace.
Please note that this list of channels will be displayed in a dropdown in a web application and not inside the Slack application. I will need the following information to display in the dropdown for autocomplete/search functionality:
All public channels in the workspace.
Private channels the user is a part of.
Group chats the user is a part of.
Direct messages
I've tested the conversations.list method to achieve this (https://api.slack.com/methods/conversations.list), but I feel that I might run into rate limiting issues quickly. The documentation mentions Tier 2 rate limiting for the method. Quoting from the documentation:
Limit - 20+ per minute. Most methods allow at least 20 requests per minute, while allowing for occasional bursts of more requests.
The statement is a little confusing as it doesn't specify an upper bound for the rate limit; in particular, the phrase "at least 20 requests per minute" is unclear. How many requests can I make to the method in one minute? Only 20? More than 20? 50? 100? What is the cap?
Also, in my use case, there is a high chance that there will be users from workspaces with thousands of public channels, and several users searching for channels concurrently in the web application for their workspace. This is why I feel I might run into rate limiting issues.
How can I accomplish this scenario?
FYI, the backend is in Java for the web application.
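Not an authoritative answer, just a minimal sketch of how this could look on a Java backend, assuming the official slack-api-client SDK and a token with the relevant conversations read scopes; the class name, page size and error handling are illustrative. The idea is to page through conversations.list once, keep the result in a per-workspace cache, and serve autocomplete keystrokes from that cache rather than from the API:

```java
import com.slack.api.Slack;
import com.slack.api.methods.MethodsClient;
import com.slack.api.methods.SlackApiException;
import com.slack.api.methods.response.conversations.ConversationsListResponse;
import com.slack.api.model.Conversation;
import com.slack.api.model.ConversationType;

import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ChannelDirectory {

    private final MethodsClient methods;

    public ChannelDirectory(String token) {
        // slack-api-client: one MethodsClient bound to the workspace/user token
        this.methods = Slack.getInstance().methods(token);
    }

    /**
     * Pages through conversations.list once and returns all conversations the
     * token can see (public channels, private channels, group chats, DMs).
     * The result should be cached per workspace and refreshed periodically,
     * so that autocomplete keystrokes hit the cache, not the Slack API.
     */
    public List<Conversation> fetchAllConversations() throws IOException, SlackApiException {
        List<Conversation> all = new ArrayList<>();
        String cursor = null;
        do {
            String currentCursor = cursor; // effectively final copy for the lambda
            ConversationsListResponse resp = methods.conversationsList(r -> r
                    .types(Arrays.asList(
                            ConversationType.PUBLIC_CHANNEL,
                            ConversationType.PRIVATE_CHANNEL,
                            ConversationType.MPIM,
                            ConversationType.IM))
                    .excludeArchived(true)
                    .limit(200)               // modest page size, gentler on Tier 2 limits
                    .cursor(currentCursor));
            if (!resp.isOk()) {
                throw new IllegalStateException("conversations.list failed: " + resp.getError());
            }
            all.addAll(resp.getChannels());
            cursor = resp.getResponseMetadata() == null
                    ? null
                    : resp.getResponseMetadata().getNextCursor();
        } while (cursor != null && !cursor.isEmpty());
        return all;
    }
}
```

Refreshing such a cache every few minutes per workspace keeps you well inside the documented "20+ per minute" Tier 2 budget regardless of how many users type in the dropdown; if Slack does return HTTP 429, the Retry-After header tells you how long to back off.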
For reporting purposes in an organization, someone exports the result of a query on an Oracle database every month, after changing the date parameters, and sends the Excel file via Outlook to a receiver (an analyst). There are different receivers (analysts) and different queries, with an N:N (many-to-many) relationship between them.
I'm working on making this process more "automatic", and I thought of these approaches:
1. Deploy a web application on my computer, with an authentication page; every user is taken to a list of reports they are allowed to view, chooses a max-date and a min-date value, and then downloads the Excel file with the data exported from the Oracle database.
2. A batch script that is executed at the end of every month (or on a date chosen by the analysts), runs the Oracle query, and exports the result to an Excel file, and then either:
2.1 sends the file via Outlook, or
2.2 saves the file in a folder on my computer and makes that folder accessible to the different analysts over the local network.
(A minimal sketch of the export step follows below.)
I want to get opinions on other approaches (hopefully more minimal and easier to scale), the pros and cons of the two approaches I've presented, and how I can best implement them.
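To make approach 2 concrete, here is a minimal, hedged sketch of the export step in Java with plain JDBC, assuming the Oracle JDBC driver is on the classpath; the connection string, credentials, query and output path are placeholders for this example:

```java
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.time.LocalDate;

/**
 * Sketch of approach 2: run a parameterized Oracle query for the previous
 * month and dump the result to a CSV file that Excel can open directly.
 */
public class MonthlyReportExport {

    public static void main(String[] args) throws Exception {
        LocalDate maxDate = LocalDate.now().withDayOfMonth(1); // first day of current month
        LocalDate minDate = maxDate.minusMonths(1);            // first day of previous month

        // placeholder query and output location
        String sql = "SELECT * FROM sales_report WHERE report_date >= ? AND report_date < ?";
        Path out = Path.of("\\\\shared-folder\\reports\\report-" + minDate + ".csv");

        try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1", "report_user", "secret");
             PreparedStatement ps = con.prepareStatement(sql)) {

            ps.setDate(1, java.sql.Date.valueOf(minDate));
            ps.setDate(2, java.sql.Date.valueOf(maxDate));

            try (ResultSet rs = ps.executeQuery();
                 PrintWriter w = new PrintWriter(Files.newBufferedWriter(out))) {

                ResultSetMetaData md = rs.getMetaData();
                int cols = md.getColumnCount();

                // header row
                for (int i = 1; i <= cols; i++) {
                    w.print(md.getColumnLabel(i));
                    w.print(i < cols ? "," : "\n");
                }
                // data rows (no quoting/escaping here; fine for simple numeric/date data)
                while (rs.next()) {
                    for (int i = 1; i <= cols; i++) {
                        Object v = rs.getObject(i);
                        w.print(v == null ? "" : v.toString());
                        w.print(i < cols ? "," : "\n");
                    }
                }
            }
        }
    }
}
```

A scheduler (Windows Task Scheduler, cron, or a database job) could run this at month end. Writing CSV keeps the sketch dependency-free and Excel opens it directly; a library such as Apache POI would be needed if the analysts require a native .xlsx file.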
Option 1 sounds like Oracle Application Express (Apex). Even if you aren't an experienced developer, in a matter of a few hours you should be able to create a working web application.
What should you do?
talk to DBA, ask them to install Apex
when they provide login credentials to you (presuming they'll also create a workspace for you), create a new application
you'd mostly use Interactive Reports
if all data you need is in one table, even better
if not, you'll have to write a query which joins several tables, but hey, you already have those queries, don't you?
Interactive Report lets users filter data in various ways
users can download the result in Excel format so they can analyse the data the way they are used to; or, perhaps even better, they can continue using Apex
I'm new to end-to-end testing and am planning to do load testing for a website I am currently working on. I'm looking into JMeter and studying how to use it. My question is: would it make sense to use only one credential for the test? Basically, I would use my own credentials and then throw the same HTTP requests at the server multiple times to simulate several users logging in and using the website.
Also, if there are other ways to do load testing without using more than one credential, that would be helpful!
Thanks in advance for the help!
It depends on your use cases and your site implementation, possible problems could be:
The site may not allow multiple logins under the same credentials, e.g. a subsequent login may "throw out" the previously logged-in user(s)
Depending on how the session is established/maintained, you may receive the same cookies for the same login
Most probably you will be able to implement browsing, but CRUD operations can be a big question mark
From JMeter's perspective it is not a problem to use only one account, any constraints will be on the system under test side.
Ideally you should treat each JMeter thread (virtual user) as a real user; it is worth creating as many accounts as you plan to simulate and using the CSV Data Set Config to parameterize your JMeter test so that each virtual user has its own credentials.
Is there a test environment (URL) available on the internet where I can practice my load tests?
I need to try out my JMeter test plan simulating 2000 virtual users; is there a portal online that would enable me to test my newly created load test?
No. Running load tests against public URLs is considered a Denial-of-Service attack. Don't do it.
You should instead run tests against localhost or some other server you own, as long as this is just for "practicing" your tests.
In case you need to mock a specific response, it may be easiest to use an HTTP mock server like WireMock, though simple request mocking can also easily be achieved with Node.js or similar.
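For example, a throwaway localhost target that you fully own can be started with the JDK's built-in HTTP server, so you can safely practice a 2000-user JMeter test without touching anyone else's infrastructure (the port, path and response body below are arbitrary):

```java
import com.sun.net.httpserver.HttpServer;

import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.Executors;

/**
 * Tiny local target for practicing load tests: responds 200 OK with a
 * fixed body on http://localhost:8080/survey
 */
public class LoadTestTarget {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/survey", exchange -> {
            byte[] body = "OK".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.setExecutor(Executors.newFixedThreadPool(32)); // handle concurrent virtual users
        server.start();
        System.out.println("Listening on http://localhost:8080/survey");
    }
}
```

Point the HTTP Request sampler at http://localhost:8080/survey and you can experiment with thread groups, timers and assertions freely; only the capacity of your own machine limits the test.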
There are some sites which are designed for practicing load testing, e.g.
http://blazedemo.com
According to the description
This is a sample site you can test with BlazeMeter!
There is also http://newtours.demoaut.com/, which is designed for practicing QTP and/or LoadRunner; however, I didn't find any explicit permission to test it with other tools.
In general it is better to use a web application you own for such exercises, as this way you will be able to see the impact of your load using e.g. the PerfMon plugin, and you will be able to analyse your results with much more information from the application under test's infrastructure.
I have a Visual Studio load test that runs through the pages on a website, but I have experienced big differences in performance when using a load balancer. If I run the tests going straight to Web Server 1, bypassing the load balancer, I get an average page load time of under 1 second for 100 users, as an example. If I direct the same test at the load balancer with 2 web servers behind it, I get an average page load time of about 30 seconds; it starts quickly but then deteriorates.
This is strange, as I now have 2 load-balanced web servers instead of hitting 1 directly, so I would expect to be able to handle more load. I am testing this with Azure Application Gateway and Azure VMs now. I experienced the same problem previously with an Nginx setup; I thought it was due to that setup, but now I find I have the same issue on Azure. Any thoughts would be great.
I had to completely disable the firewall to get consistent performance. I also ran into other issues with the firewall: it gave us max entity size errors from a security module, and after discussing with Azure Support, this entity size cannot be configured, so keeping the firewall would mean some large pages would no longer function and would get this error. This happened even if all rules were disabled; I spent a lot of time experimenting with different rules on/off. The SQL injection rules didn't seem to like our ASP.NET Web Forms site.
I have now simulated 1,000 concurrent users split between two test agents, and the performance was good for our site, with an average page load time well under a second.
Here is a list of things that helped me improve the same situation:
Added a non-SSL listener and used that (e.g. HTTP instead of HTTPS). Obviously this is not the advised solution, but maybe it can give you a hint (offload SSL to the backend pool servers? add more gateway instances?)
Disabled WAF rules (slight improvement)
Disabled WAF + added more gateway instances (increased from 2 to 4 in my case) - this solved the problem!