Configuring MiniProfiler to work on multiple domains - mvc-mini-profiler

I want to add MiniProfiler to a project with multiple domains servicing requests:
application.domain.com (Serves all HTML, JavaScript etc)
api.domain.com (REST/JSON API)
One of the best features of MiniProfiler is how AJAX calls show up, but out of the box the setup above doesn't work. Does anyone have any suggestions on an approach / configuration I could use to make MiniProfiler events from api.domain.com show up on pages in application.domain.com?

It can be done, but it requires some extra setup steps on your part:
Set MiniProfiler.Settings.Storage to a storage medium (like a Redis cache or SQL Server) that is accessible from all domains. That allows all the domains being profiled to save their results together.
Be sure that MiniProfiler.Current.User has the same value on all domains. The default behavior when retrieving results is to show all results for a specific user, so you want profiles for the same user to be stored consistently across the domains.
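A minimal sketch of both steps, assuming the older MiniProfiler static Settings API wired up in Global.asax (the connection string and the SharedCookieUserProvider class are illustrative, not part of MiniProfiler):

using System.Web;
using StackExchange.Profiling;
using StackExchange.Profiling.Storage;

protected void Application_Start()
{
    // 1) Storage every domain can reach, so all results land in one place
    MiniProfiler.Settings.Storage = new SqlServerStorage(
        "Data Source=shared-db;Initial Catalog=MiniProfiler;Integrated Security=True");

    // 2) Resolve the same user id on every domain, e.g. from an auth cookie
    //    scoped to .domain.com so both subdomains can read it
    MiniProfiler.Settings.UserProvider = new SharedCookieUserProvider();
}

// Hypothetical provider: any logic works as long as every domain agrees
public class SharedCookieUserProvider : IUserProvider
{
    public string GetUser(HttpRequest request)
    {
        var cookie = request.Cookies["auth"];
        return cookie != null ? cookie.Value : request.UserHostAddress;
    }
}

With that in place, results saved from api.domain.com and application.domain.com are keyed to the same user in the same store, which is what lets the AJAX timings surface in the page's profiler UI.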

Related

Having multiple AngularJS apps for one site?

I am developing a site that can be broken down into a handful of main pages. These pages can be thought of as isolated from each other, except that they share session data (i.e. session id and logged-in username).
Initially, I was going to build the site as a SPA using ng-view (i.e. make the pages into AngularJS views). But I don't see any benefit for my site in that approach, and it would require extra time and effort to support SEO (Making AJAX Applications Crawlable).
Going with an approach that provides no benefit and even creates extra work doesn't seem too smart. So I thought to myself: why don't I make the main pages of my site into individual AngularJS apps? The parts of the site that need to be indexed by search engines are simply the initial screens of some of those apps, so I wouldn't need to do extra SEO work. (Note: the initial screens are rendered by the Django server with data for search engines to crawl, so they are non-blank.)
Each app may or may not have its own set of partials, depending on its requirements.
Example:
mydomain.com/item_page/1234 (load "item" app)
mydomain.com/dashboard (load "dashboard" app)
mydomain.com/account (load "account" app and default to "tab_1" view)
mydomain.com/account#tab_1 (load "tab_1" view of "account" app)
mydomain.com/account#tab_2 (load "tab_2" view of "account" app)
mydomain.com/post_item (load "post" app)
This is just my own idea, and I haven't seen any AngularJS examples comprised of multiple AngularJS apps. I would like to know:
Is the multiple-AngularJS-apps-for-one-site approach feasible? What caveats should I be aware of? Are there any example sites out in the wild taking this approach?
If feasible, how do I share the session data between the apps?
Note this post is about multiple AngularJS apps for one site, not multiple AngularJS apps on the same page.
There is nothing wrong with such an approach, as long as you keep the size of the downloaded JS small enough and ensure good caching. One example of such an application is GitHub. When you go to the Issues page on GitHub, it loads an HTML page, the common GitHub JS libraries, and page-specific JS code. Navigation and actions inside the page are handled by that single page-specific script. If you go to another section (like Code), a new page with new page-specific JS code is loaded. Another example is the Amazon AWS console, where they even use different frameworks for different pages. (Neither GitHub nor Amazon uses Angular, but the approach works for any JS-based framework, even GWT.)
As for sharing session data between pages, you can embed it directly in the page itself using inline scripts or hidden elements; when your server generates the page, it should also render some session information into it. Another approach is to download the session data once and store it in local storage / session storage.
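To illustrate the inline-script option (the question's backend is Django, but here is the same idea as a C#/ASP.NET MVC helper to match the other answers on this page; the helper name and session keys are made up):

using System.Web.Mvc;
using Newtonsoft.Json;

public static class SessionScriptHelper
{
    // Renders <script>window.sessionData = {...};</script> so each
    // page-specific app can read the shared session at bootstrap.
    public static MvcHtmlString SessionScript(this HtmlHelper html)
    {
        var session = html.ViewContext.HttpContext.Session;
        var data = new
        {
            sessionId = session.SessionID,
            username = session["username"] // whatever the apps share
        };
        return MvcHtmlString.Create(
            "<script>window.sessionData = "
            + JsonConvert.SerializeObject(data) + ";</script>");
    }
}

Each app then reads window.sessionData at startup instead of making an extra round-trip for session info.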

Possible to use one codebase for a subdomain for multiple sites?

I don't even know if this is even possible, but I thought I'd ask.
I am creating a small CRUD application, but I have multiple sites and each site would use it. The application would share common CRUD methods and styling, but each individual site would apply different forms. I want to avoid creating multiple CRUD applications that vary only in specific content (just different forms).
I want to have something like this:
mycrud.website1.com
mycrud.website2.com
mycrud.website3.com
I can create a subdomain for each individual site, no problem. But is it feasible to point all the subdomains at one MVC application directory? And if it is possible, any suggestions for how I might go about restricting users of website1 from seeing website2 or website3 content? Is that something "roles" could take care of (after authenticating the user)?
Thanks.
There are a lot of websites that do this, not just with MVC. Some content farms point *.mydomain.com to a single IP and have a wildcard mapping in IIS.
From there, your application should look at the URL to determine what it should be doing. Some CMS systems operate in this manner, using the domain as the key to deciding which pages to load.
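A minimal sketch of that host check in ASP.NET MVC (the tenant parsing is naive and illustrative; validate the host in real code):

using System.Web.Mvc;

public class HomeController : Controller
{
    public ActionResult Index()
    {
        // e.g. "mycrud.website1.com" -> "website1"
        string host = Request.Url.Host;
        string tenant = host.Split('.')[1];

        ViewBag.Tenant = tenant;
        return View(tenant); // or branch on tenant to load site-specific forms
    }
}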
I've built a private-labelable SaaS application (Software as a Service) that lets us host all of our clients in a single application. Some clients have customizations to pages or features; we handle that by creating custom plugins for each client that override the Controllers or Views when needed.
All clients share a common code base, and aside from each client's custom theme/template they are the same. Only when a client had us customize a feature did we need to build out their plugin DLL. This is advanced stuff, so it would require heavy modifications to your code base, but if it's what your application needs, it is 100% possible.
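One way to approximate that override behavior in ASP.NET MVC is a view engine that probes a client-specific folder before the shared views (a sketch; the folder layout is made up, and the tenant is fixed at startup for simplicity, where a real multi-host app would resolve it per request):

using System.Web.Mvc;

public class TenantViewEngine : RazorViewEngine
{
    public TenantViewEngine(string tenant)
    {
        ViewLocationFormats = new[]
        {
            "~/Tenants/" + tenant + "/Views/{1}/{0}.cshtml", // client override
            "~/Views/{1}/{0}.cshtml"                         // shared default
        };
    }
}

// In Application_Start:
// ViewEngines.Engines.Insert(0, new TenantViewEngine("website1"));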
First - the easy part is having one web site serve all three domains. You can do that simply with DNS entries; all three domains should point at the same IP.
As far as the content, you could handle that in a number of ways. I think your idea of roles is pretty solid. It also leaves open the possibility of a given user seeing content from both site1 and site2, if that is ever necessary.
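With roles, the restriction can be as simple as an Authorize attribute on each site's content (role names are illustrative; users would be assigned a role at login based on which site they belong to):

using System.Web.Mvc;

[Authorize(Roles = "website1")]
public class Website1Controller : Controller
{
    // Only authenticated users in the "website1" role reach this content
    public ActionResult Index()
    {
        return View();
    }
}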
If you don't want to force users to authenticate, you should look at other options. You could wrap your CRUD logic and data-access logic into separate libraries and use them across three different sites in IIS. You could have one site and display content based on the request URL. There are probably a lot of other options, too.

How to load test a specific flow on a website?

I have used load testing tools like Siege, Apache JMeter, and httperf, which are really useful and suit a lot of cases.
However, I now need to benchmark a product sale process that consists of several pages, including:
forms that should be filled with random data and submitted
cookies / sessions
concurrent requests
invalid form data
ajax requests (form data validation)
In short, I would like to simulate a lot of users concurrently buying a product on a webshop (it's a so-called guest checkout, so no registration etc. is needed).
Right now I am trying to write something in PHP/cURL specific to the website, but I thought there must be tools available for this. Can somebody point me in the right direction?
I do not need requests from different IP addresses, because the resource-expensive work all happens on the backend.
Our product, Web Performance Load Tester, will do everything you mentioned. Filling out and submitting forms is easy, and you can generate random sets of data to fill them with if needed. It handles cookies automatically (unique for each user) and can simulate any number of concurrent requests - by default it uses the same number as the browser you recorded with. Submitting invalid form data is no different from valid data: you simply put invalid data into the set that feeds the form. You can add validators to check the success or failure of any request or page. It can handle AJAX requests, though this sometimes requires a few extra configuration steps. The quickest way to get an overview of the product is to watch the first two of these videos.
JMeter allows you to script journeys along the lines you describe and create random values for things like forms. It has a fairly steep learning curve - but it's got to be better than writing things from scratch!
Visual Studio WebPerformanceTest and LoadTest will do what you want. You can create a single data-driven test (where you pre-create a bunch of test data), or a single test that uses a plugin to generate random data on the fly. This requires a license for Visual Studio Ultimate or Team Server.
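For a sense of what such a plugin looks like, here is a sketch of a WebTestPlugin that feeds a random value into the test context (the context parameter name is made up; a form post parameter in the recorded test would bind to it as {{RandomEmail}}):

using System;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class RandomDataPlugin : WebTestPlugin
{
    private static readonly Random Rng = new Random();

    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        // A fresh pseudo-random value before each request
        e.WebTest.Context["RandomEmail"] =
            "user" + Rng.Next(100000) + "@example.com";
    }
}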
Check out our Fiddler add-on StresStimulus. It does what you described without scripting: it records a navigation scenario in the browser or Fiddler, replays it with configurable load, automatically supports cookie/session correlation for multiple users, monitors in real time, and reports errors and timeouts. To generate random data, use the free service www.generatedata.com and data-bind the CSV files to web form fields in StresStimulus. For more complex load tests, or if AJAX pages break, Fiddler .NET scripting is available. Up to 100 VUs per machine are free. For more VUs, look at the low-cost weekly / monthly subscriptions.

How to maintain state in a web app - as HTTP is stateless

I am new to building web apps and have just begun learning and setting up Grails. I am planning to build an app with a flow of 4 to 5 pages. Since HTTP is a stateless protocol, how is state between pages usually maintained? I am curious what the accepted standard is here: should I create session-scoped objects and use them between pages, or keep passing values from page to page (I'm not sure that is effective if a page has a large number of items)? Or, instead of 4 to 5 pages, should I just use one page with multiple divs and show/hide them based on the user's clicks?
I think using domain objects in Grails would help here, but I don't have a DB backing the UI, only some web services that perform the UI actions, so I can't use domain objects.
A Grails-specific solution would be good, but I also wanted to know how this is handled in web development in general.
Without using a DB, there are a few options you could use:
Use POST/GET variables to pass info from page to page.
Use the session to store information.
Use cookies to store information.
Using POST/GET is usually best if you just have one page "talking" to one other page (e.g. submission of a form). If you have a bunch of data that will be shared by several pages, the best way is probably to put it in the session. If you need the values to stick around after the user leaves your site and comes back later, use cookies.
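As a general-web illustration of the session option (a C#/ASP.NET MVC sketch, though the idea is identical with a Grails controller's session object; the controller and keys are made up):

using System.Web.Mvc;

public class CheckoutController : Controller
{
    [HttpPost]
    public ActionResult StepOne(string shippingAddress)
    {
        // Stash the data; it survives across the 4-5 pages of the flow
        Session["shippingAddress"] = shippingAddress;
        return RedirectToAction("StepTwo");
    }

    public ActionResult StepTwo()
    {
        ViewBag.ShippingAddress = Session["shippingAddress"];
        return View();
    }
}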
You may want to look into WebFlow (Spring Web Flow) in Grails. I find it helpful in wizard-like or shopping-cart-like applications where you want to hold on to data across a group of pages (i.e. Page 1, Page 2 ... Page 4) and then submit it somewhere at the end.

Question about using subdomains to force caching

I haven't had a huge opportunity to research the subject, but I figure I'll just ask the question and see if we can create a knowledge base on it here.
1) Using subdomains to force a client-side cache: is this the default behavior, or is there an easy way for a client to disable it? I'm mostly curious what percentage of users I should expect this to affect.
2) What all will be cached? Images? Stylesheets? Flash SWFs? JavaScript? Everything?
3) I remember reading that you must use a subdomain or www in your URL for this to work - is this correct? (And does this mean SO won't allow it?)
I plan on integrating this into all of my websites eventually, but first I am going to try it on a network of Flash game websites. I'm thinking www.example.com for the site itself will remain the same, but instead of using www.example.com/images, www.example.com/stylesheets, www.example.com/javascript, and www.example.com/swfs, I will create subdomains that point to them (img.example.com, css.example.com, js.example.com, and swf.example.com respectively) - is this the best course of action?
Using subdomains for content elements isn't so much to force caching as to trick a browser into opening more connections than it otherwise would. This can speed up page load time.
Caching of those elements is entirely down to the HTTP headers delivered with that content.
For static files like CSS and JS, a server will typically tell the client when the file was last modified, which allows the browser to re-request the file "If-Modified-Since" that timestamp. The specifics of improving on this with extra caching headers depend on which web server you use. For example, with Apache you can use the mod_expires module to set the Expires header, or the Header directive to output other kinds of cache-control headers.
As an example, if you had a subdirectory containing your CSS files and wanted to ensure they were cached for at least an hour, you could place a .htaccess file in that directory with these contents:
# Requires the mod_expires module; caches everything served from this
# directory for an hour after each access
ExpiresActive On
ExpiresDefault "access plus 1 hour"
Check out YSlow's documentation. YSlow is a plugin for Firebug, the amazing Firefox web development plugin. There is lots of good info on a number of ways to speed up your page loads, one of which is using one or more subdomains to encourage the browser to do more parallel object loads.
One thing I've done on two Django sites is to use a custom template tag to create pseudo-paths to images, CSS, etc. The path contains the time-last-modified as a pseudo directory. This path component is stripped out by an Apache .htaccess mod_rewrite rule. The object is then given a ten-year time-to-live (ExpiresDefault "now plus 10 years"), so the browser will only load it once. If the object changes, the pseudo-path changes and the browser fetches the updated object.
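A sketch of the same trick outside Django (C# here, since the technique is server-agnostic; the /cache/ prefix and the rewrite rule are illustrative):

using System;
using System.IO;
using System.Web;

public static class StaticUrlHelper
{
    // "/css/site.css" -> "/cache/634.../css/site.css"; a rewrite rule
    // such as  RewriteRule ^cache/\d+/(.*)$ /$1 [L]  strips the stamp
    // before the file is served.
    public static string Versioned(string virtualPath)
    {
        string physical = HttpContext.Current.Server.MapPath(virtualPath);
        long stamp = File.GetLastWriteTimeUtc(physical).Ticks;
        return "/cache/" + stamp + virtualPath;
    }
}

Because the URL changes whenever the file does, the ten-year expiry is safe: the browser caches aggressively, yet always fetches a changed file under its new pseudo-path.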
