How to load test a specific flow on a website?

I have used load testing tools like Siege, Apache JMeter, and httperf, which are really useful and suit a lot of cases.
However, I now need to benchmark a product sale process that consists of several pages, including:
forms that should be filled with random data and submitted
cookies / sessions
concurrent requests
invalid form data
AJAX requests (form data validation)
In short, I would like to simulate a lot of users concurrently buying a product on a webshop (it's a so-called guest checkout, so no registration etc. is needed).
Right now I am trying to write something in PHP/cURL specific to the website, but I thought there must be some tools available that I can use. Can somebody point me in the right direction?
I do not need requests from different IP addresses, because the resource-expensive stuff all happens on the backend.
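For reference, this is the sort of skeleton I have in mind; a minimal PHP/cURL sketch of one simulated guest checkout (the URLs and form fields here are made up, and the real script would also replay the AJAX validation calls):

    <?php
    // One simulated guest checkout (hypothetical URLs and field names).
    $jar = tempnam(sys_get_temp_dir(), 'cookies'); // one cookie jar per virtual user

    $ch = curl_init();
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_COOKIEJAR      => $jar, // write session cookies here...
        CURLOPT_COOKIEFILE     => $jar, // ...and send them back on later requests
    ));

    // Step 1: load the product page to obtain a session.
    curl_setopt($ch, CURLOPT_URL, 'https://shop.example.com/product/42');
    curl_exec($ch);

    // Step 2: submit the checkout form with random data.
    curl_setopt($ch, CURLOPT_URL, 'https://shop.example.com/checkout');
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
        'name'  => 'user' . mt_rand(1000, 9999),
        'email' => 'user' . mt_rand(1000, 9999) . '@example.com',
    )));
    curl_exec($ch);
    echo curl_getinfo($ch, CURLINFO_HTTP_CODE), "\n";
    curl_close($ch);

Concurrency would then have to come from launching many copies of the script (or from curl_multi), which is exactly the point where a dedicated tool starts to look like less work.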

Our product, Web Performance Load Tester, will do everything you mentioned. Filling out and submitting forms is easy and you can generate random sets of data to fill them with, if needed. It handles cookies automatically (unique for each user) and can simulate any number of requests concurrently - by default it will use the same number as the browser you recorded with. Submitting invalid form data is no different than valid data, you simply put invalid data into the set that feeds the form. You can add validators to check for the success/failure of any request or page. It can handle AJAX requests, though this sometimes requires a few extra configuration steps. The quickest way to get an overview of the product is to watch the first two of these videos.

JMeter allows you to script journeys along the lines you describe, and create random values for things like forms. It's got a fairly steep learning curve - but it's got to be better than writing things from scratch!

Visual Studio WebPerformanceTest and LoadTest will do what you want. You can create a single data-driven test (where you pre-create a bunch of test data), or a single test that uses a plugin to generate random data on the fly. This requires licenses for Visual Studio Ultimate or Team Server.

Check out our Fiddler add-on StresStimulus. It does what you described without scripting: it records a navigation scenario in a browser or in Fiddler, replays it with a configurable load, automatically handles cookie / session correlation for multiple users, monitors in real time, and reports errors and timeouts. To generate random data, use the free service www.generatedata.com and data-bind the CSV files to web form fields in StresStimulus. For more complex load tests, or if AJAX pages break, Fiddler .NET scripting is available. Up to 100 VUs per machine are free. For more VUs, look at the low-cost weekly / monthly subscriptions.

Related

Ways to programmatically check if a website is up and functioning as expected

I know this is an open-ended question, but hopefully it will get some good answers before the thread is locked...
I'm wondering what methods there are to programmatically check (language-agnostic) whether a website is online from a client perspective (assume you can't make changes to the site/server, but you can rely on certain behaviours of the site).
The result of each method could stack to provide a measure of certainty that the site is up/down - that is, a method does not have to provide a definite indication if the site is up/down on its own.
Some common tests just to check 'upness' may be:
Ping the site (which in the case of shared hosting isn't very indicative)
Send an HTTP HEAD/GET request and check the status
Others I can think of to check that the site is up and functioning:
Check that you received a well-formed HTML response, i.e. opening and closing html tags; if the site is experiencing trouble it may spit out an error and exit without writing the rest of the page (not all that reliable, though, because the site may handle most errors in a better way)
Check certain content is or is not on the page, i.e. perhaps there is some content that is always present on your pages, or always present in the case of an error
Can anybody think of any other methods that could be used to help determine if a site is in fact up/down and functioning/not functioning correctly from within a program?
If your GET request on a page that displays info from the database comes back with status 200 and the expected keywords are found, you can be pretty certain that your site is up and running.
And you don't really need to write your own script to do that. There are free services such as GotSiteMonitor, Pingdom, and UptimeRobot that allow you to monitor your site.
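If you do want to roll your own, that check is only a few lines; a sketch in PHP (the URL and keyword are placeholders):

    <?php
    // Fetch a database-backed page, then verify status code and an expected keyword.
    $ch = curl_init('https://www.example.com/products');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body   = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    $up = ($status === 200) && (strpos((string) $body, 'Add to cart') !== false);
    echo $up ? "site is up\n" : "site is down or degraded\n";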
Base your set of tests on the unit-testing principle. It is normally used in programming to test classes, modules or other artefacts after changes have been made. You can use any of the available frameworks, so you don't have to reinvent the wheel. You must describe (implement) the tests to be run; in your case a typical test should request a URL inside the page and then do some evaluations like:
call result (for example return code of curl execution)
http return code
http headers
response mime type
response size
response content (test against a regular expression)
This way you can add, remove and modify single tests without having to care about the framework once you are up. You can also chain tests, performing a login in one test and virtually clicking a button in a subsequent test.
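As an illustration, one such test could be written in PHP along these lines, evaluating most of the points above (the URL, minimum size and pattern are of course site-specific placeholders):

    <?php
    // One "unit test" for a page: request it once, then evaluate several expectations.
    function test_page($url, $pattern)
    {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $body = curl_exec($ch);

        $results = array(
            'call result'   => ($body !== false), // did the curl call itself succeed?
            'http code'     => curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200,
            'mime type'     => strpos((string) curl_getinfo($ch, CURLINFO_CONTENT_TYPE), 'text/html') === 0,
            'response size' => curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD) > 1024, // arbitrary minimum
            'content'       => $body !== false && preg_match($pattern, $body) === 1,
        );
        curl_close($ch);
        return $results; // each entry is pass/fail for one evaluation
    }

    var_dump(test_page('https://www.example.com/', '/<title>.+<\/title>/'));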
There are also tools to handle such test runs automatically including visualization of results, statistics and the like.
OK, it sounds like you want to test and monitor your website from a customer-experience perspective rather than purely establishing whether a server is up (using ping, for example). An effective way to replicate the customer experience is to run tests against the site using one of the headless browser testing tools (PhantomJS is a great choice), as they will render the page fully (including images, CSS, JS etc.), giving you a real page load time. These tools also allow you to make assertions on all aspects of the HTML content and HTTP response.
Pingdom recently started offering a (paid-for) service to perform these exact types of checks alongside their existing monitoring solution. The demo is worth looking at; their interface for writing the actual tests is very nice.

Implement real-time updating notification feature

I'd like to implement a visual indicator in various sections of my app for items whose status is pending, similar to Facebook's / Google+'s unread notification indicator... I have written an API to fetch the count to be displayed, but I am stuck at updating it every time an item gets added or deleted. I can think of two approaches, neither of which I am satisfied with: the first is making an API call related to the count whenever a POST or DELETE operation is performed; the second is refreshing the page after some time span...
I think there should be a much better way of doing this from the server side; any suggestion or any gem to do so?
Even in Gmail it is refreshed on client request. The server calculates the number of new items, and the client initiates a request (probably with AJAX). This requires an almost negligible amount of data and processing time, so you can probably get away with it. Various cache gems can even store the refreshed part of the page if no data has changed since the last request, which also solves the problem of recalculating only when something has changed.
UPDATE:
You can solve the problem in basically two ways: a server-side push, or a client-side query. Push is problematic for various reasons and is rarely used in a web environment, at least as far as I know. Most pages (if not all) use a timed query to refresh such information. You can check this with the right tool, like Firebug for Firefox; you can see the individual requests initiated towards the server.
When you fire a request through AJAX, the server replies. Normally it generates a page fragment to replace the old content with the new, but a cache mechanism can intervene, and if nothing has changed, you may get the previously stored cache fragment. See some tutorials here for the various gems; one of them may fit your needs.
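The question is about Rails, but to make the timed-query idea concrete, here is a hypothetical sketch of the polled endpoint in PHP (the count query is a stand-in):

    <?php
    // Hypothetical poll endpoint for the pending-items badge.
    header('Content-Type: application/json');

    function get_pending_count()
    {
        // Stand-in for the real query, e.g. SELECT COUNT(*) ... WHERE status = 'pending'
        return 3;
    }

    echo json_encode(array('count' => get_pending_count()));

The client simply replaces the badge text with the returned count on each poll.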
If you would prefer a complete solution, check out Faye (tutorial here). I haven't used it, but it may be worth a try; it seems simple enough.

All-js or initial markup by the framework

Ever since I started ASP.NET MVC development, my experience has been almost 80% jQuery and only 20% C#.
Now I am beginning to use Knockout.js to give myself better control of the view on the page.
The question I am now facing is: should I feed the browser a "skeleton markup page" and load all data via AJAX calls, which in turn populate a JS view-model object and therefore the view, or should I initially populate the data via a partial view and use JS page-data management for the subsequent client experience?
Right now I am doing the latter, but this requires me to write the data retrieval/display logic twice - once in JS, once in the MVC Razor view.
I am not planning to support browsers with JS disabled, so maybe I should do everything via JS Knockout view-model initialization?
There are many additional variables:
1. How many requests per second should your app handle in the future? If many, then with full page generation you may be able to cache the resulting web page, decreasing the load on the server.
2. What kind of clients do you have? If they are slow (like low-cost mobile phones), then generating the full HTML on the client can be slow.
3. Do your clients appreciate fast responses over a slow network? With full server-side page generation you can achieve a smaller number of requests and faster responses.
On the opposite side, if this is an internal, department-level business app with a good network, a low number of requests and good client computers, then you can surely go with a minimal initial page and populate everything with AJAX. Also, as Arbiter pointed out, JSON can be smaller in size than HTML, so if you have a big amount of data you can save on network traffic via JSON.
There is also a middle, third way: you can generate the JSON data and embed it directly in the web page (like <script>CallMyJSGenerateMethod({generatedJSON: "goes here"})</script>). This way you have only one (JavaScript) procedure for HTML generation, a small number of requests (with an even lower amount of data), and the ability to cache the web page. Still, you'd have to have good clients, so point 2 still stands.
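To make the shape of that middle way concrete, here is a sketch (in PHP for illustration; in Razor you would serialize the model the same way, and CallMyJSGenerateMethod is the invented client-side entry point from above):

    <?php
    // Server-side: embed the bootstrap JSON directly in the page.
    $viewModelData = array('products' => array(), 'cartCount' => 0); // placeholder data
    ?>
    <script>
        // The client-side generator gets its data without an extra AJAX round trip.
        CallMyJSGenerateMethod({generatedJSON: <?php echo json_encode($viewModelData); ?>});
    </script>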
This is more of an opinion, but in my estimation it's the question I'm asked most often about building web apps: do I build the pages with HTML/MVC on the server, or do I use all JS? There is no clear right answer here that fits all scenarios. Both are great choices. Dmitry's points are all valid, too.
Other things to consider are whether you need to stick with ASP.NET on the server or if other server tech will be used (PHP?). What skills does your dev team have? Will the pages you are creating change a lot on the client, or are they relatively static?
I personally lean towards the client side instead of server-side generation, but it's mostly a preference.

Designing an application around HMVC and AJAX [Kohana 3.2]

I am currently designing an application that will have a few different pages, and each page will have components that update through AJAX. The layout is similar to the new Twitter design where 'Home', 'Discover', and 'Connect' are separate pages, but interacting within the page (such as clicking 'Followers' or 'Following') uses AJAX.
Since the design requires an initial page load with several components (in the context of Twitter: tweets, followers, following), each of which can be updated individually through AJAX, I thought it'd be best to have a default controller for serving pages, and other controllers with actions that, rather than serving full pages, strictly handle querying the database and returning JSON objects. This way, on initial page load several HMVC requests can be made to gather the data for each component, and AJAX calls can also be made to update each component individually.
My idea is to have a Controller_Default that handles serving pages. In the context of Twitter, Controller_Default would contain:
action_home()
action_connect()
action_discover()
I would then have other Controllers that don't deal with serving full pages, but rather components of pages. For instance, in the context of Twitter Controller_Tweet may have:
action_get()
which returns a JSON object containing tweets for a specific user. action_home() could then make several HMVC requests to get the data for the several different components of the page (i.e. make requests to 'tweet/get', 'followers/get', 'following/get'). While on the page, however, AJAX calls could be made to the function-specific controllers (i.e. 'tweet/get') to update the content.
My question: is this a good design? Does it make sense to have the pages served through a default controller, with page components served (in JSON format) through other function-specific controllers?
If there is any confusion regarding the question please feel free to ask for clarification!
One of the strengths of the HMVC pattern is that employing this type of layered application doesn't lock you into a workflow that might be difficult to change later on.
From what you've indicated above, this would be perfectly acceptable as a way of serving content to a client; the default controller wraps sub-requests, which avoids multiple AJAX calls from the client to achieve the same goal.
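For example, the wrapping might look roughly like this in Kohana 3.2 (the controller and route names just follow the question's Twitter analogy):

    <?php defined('SYSPATH') or die('No direct script access.');

    // classes/controller/default.php: assembles full pages from HMVC sub-requests.
    class Controller_Default extends Controller {

        public function action_home()
        {
            $view = View::factory('home');

            // Each component controller returns JSON; decode it for the initial render.
            $view->tweets    = json_decode(Request::factory('tweet/get')->execute()->body());
            $view->followers = json_decode(Request::factory('followers/get')->execute()->body());

            $this->response->body($view);
        }
    }

    // classes/controller/tweet.php: serves one component as JSON,
    // for both internal sub-requests and client AJAX calls.
    class Controller_Tweet extends Controller {

        public function action_get()
        {
            $tweets = array(); // stand-in for the real database query
            $this->response->body(json_encode($tweets));
        }
    }

(The two classes are shown together for brevity; in Kohana they would live in separate files.)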
Two suggestions I might make:
Ensure that your Twitter back-end requests are abstracted out and managed in a library to make the application DRY'er and easier to maintain.
Consider whether the default controller is making only the absolutely necessary calls on each request. Employ caching to avoid pulling infrequently changed data on every request (e.g., followers might only be updated every 30 seconds; see the sketch below). This of course depends entirely on your application requirements, but if you get heavily loaded you could quickly find your Twitter API request limit being reached.
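A minimal sketch of that caching idea, assuming the Kohana Cache module is enabled (the 30-second lifetime mirrors the example above):

    <?php
    // Serve followers from cache while fresh; refresh via sub-request otherwise.
    $cache     = Cache::instance();
    $followers = $cache->get('followers');

    if ($followers === NULL)
    {
        $followers = Request::factory('followers/get')->execute()->body();
        $cache->set('followers', $followers, 30); // cache for 30 seconds
    }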
One final observation: if you do find the server is experiencing high load and Twitter API requests are taking a long time to return, consider provisioning another server and installing a copy of your application. You can then "point" sub-requests from the default gateway application to your second application server, which should help improve response times if the two servers are connected by a high-speed link.

Fetching data sent by lightstreamer

I'm using (not programming) an application that sends my browser, using a technology called "Lightstreamer" (which I have no clue about), data every second or so (I guess using AJAX?). These are constantly changing stock values.
Now... is there any program/tool I can use to automatically fetch/sniff/whatever the raw data that my browser gets, so that, for example, I could later paste it into Excel and create charts?
Why not just copy the data from the browser window, you might ask; the reason is that the application always shows me only the last 20 values for a given stock, and I wish to automatically get, let's say, the last 1,000 values and throw them into Excel.
Thanks :)
P.S. I see that the app is written in ASP, if it matters.
There is no way that I know of to (easily) reconstruct sniffed Lightstreamer communications into tabular data. Lightstreamer pushes updates to the client using a hidden IFRAME; those updates are efficient, but they are intended for consumption only by the Lightstreamer client code.
Developers using the Lightstreamer JavaScript API can easily hook into update events if they wish to.
However as an application user, you are best off raising a change request with the application owner to add some form of Excel export functionality.
