If there are no visits to a DotNetNuke portal for some time then, depending on the host, the application is unloaded from the server's memory. The next visitor is then forced to wait 15+ seconds (depending on the server's speed) before the page appears. DotNetNuke is a big framework, but it often ends up serving low-traffic sites too, and that causes this unpleasant situation.
My idea is to serve a static index.html page that holds the HTML rendered by the DotNetNuke Default.aspx page, and on its onload event fire an XMLHttpRequest via JavaScript that GETs Default.aspx, so the application starts loading in the background.
My first page is a long sales letter, and I would like DotNetNuke to load in the background while people read it, so that the site is hopefully warmed up by the time they click a link or a menu item that points to a DotNetNuke page. If they click a link before it loads and are the first visitor after an idle period, they will still have to wait; but the brand-new visitors who are still reading the letter are exactly the ones I don't want to lose to a long wait.
So every time I modify the content of Default.aspx, I would need to generate a new index.html page and overwrite the old one.
I can do that with the same XMLHttpRequest in just a few lines of code.
So, what do you think?
Since I'm not that experienced in web development or DotNetNuke, I'd like your opinion and/or your advice on how people deal with this.
P.S. Since my host doesn't let me install DotNetNuke in the root folder, I already have an index.html page that redirects to /dnn/Default.aspx.
This way I could actually keep the index.html page with all the content in my root folder, let search engines index that page instead of /dnn/Default.aspx, and avoid performing any automatic redirects.
P.P.S. I am aware of the services that check your site for free every 15 or 30 minutes, but that's not the answer for me, since it skews my visit statistics.
I think you might be overthinking this.
Sign up for a free web site monitoring service (there are tons out there) that will check your site every 10 or 15 minutes. This will keep the application in memory and give you a bonus of having your uptime monitored.
Ivan,
There are many cheap "ping" solutions that can keep your site alive by making periodic requests to the web server. I would consider this option first, as it won't require any special coding on your behalf.
Just google "dnn keep-alive": http://www.google.com/search?q=best+dotnetnuke+keep+alive+solutions
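If you would rather roll your own than depend on a third-party service, here is a minimal sketch of such a pinger (the URL and the 10-minute interval are placeholders; run it as a scheduled task or small console app on any always-on machine). Sending a distinctive user agent also makes it easy to exclude these hits from your visit statistics, which addresses the P.P.S. above.

```csharp
// Hedged sketch: a tiny keep-alive pinger that requests the DNN site on a
// schedule so IIS never unloads the application. URL and interval are placeholders.
using System;
using System.Net;
using System.Threading;

internal static class KeepAlive
{
    private const string SiteUrl = "http://www.example.com/dnn/Default.aspx"; // placeholder
    private static readonly TimeSpan Interval = TimeSpan.FromMinutes(10);

    private static void Main()
    {
        using (var client = new WebClient())
        {
            // A distinctive user agent makes it easy to filter these hits out of statistics.
            client.Headers[HttpRequestHeader.UserAgent] = "dnn-keepalive-bot";

            while (true)
            {
                try
                {
                    client.DownloadString(SiteUrl);   // warms up / keeps the app pool alive
                    Console.WriteLine("{0:u} ping OK", DateTime.UtcNow);
                }
                catch (WebException ex)
                {
                    Console.WriteLine("{0:u} ping failed: {1}", DateTime.UtcNow, ex.Message);
                }

                Thread.Sleep(Interval);
            }
        }
    }
}
```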
I have only been developing for a few weeks and I bought a domain, but when I upload my files to the live site, the website looks different from what I uploaded. This gets fixed when I clear my cache. The problem is that my visitors see the page one way, and after I update it they still see the previous version!
Is there any solution for this? I don't want my visitors to have to clear their cache every time I make a change to my website!
This is most likely due to CSS caching: the browser is loading a cached version of your stylesheet. You can control the cache lifetime in a few ways; ETags and .htaccess rules (on Apache) are the most common.
A very simple trick is to append a query-string parameter to the stylesheet URL where you load your main style in the head of the document, like this:
<link rel="stylesheet" href="main.css?v=2">
Bump the number whenever the file changes and browsers will fetch it as a new resource.
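The question doesn't say what the server runs; if it happens to be ASP.NET like the other sites in these threads (an assumption), the ?v= trick can be automated by deriving the version from the stylesheet's last-write time. The helper class and paths below are made up for illustration:

```csharp
// Hypothetical ASP.NET helper: appends the stylesheet's last-write time as a
// cache-busting version parameter, so the URL changes whenever the file does.
using System;
using System.IO;
using System.Web;

public static class CacheBuster
{
    public static string Versioned(string virtualPath)
    {
        // Map "~/Content/main.css" to a physical path and read its timestamp.
        string physicalPath = HttpContext.Current.Server.MapPath(virtualPath);
        long version = File.GetLastWriteTimeUtc(physicalPath).Ticks;
        return VirtualPathUtility.ToAbsolute(virtualPath) + "?v=" + version;
    }
}

// Usage in a view or page (illustrative):
//   <link rel="stylesheet" href="<%= CacheBuster.Versioned("~/Content/main.css") %>" />
```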
I have a website forum where users exchange photos and text with one another on the home page. The home page shows the 20 latest objects, be they photos or text; the 21st object is pushed out of view. A new photo is uploaded every 5 seconds and a new text string is posted every second, so in around 20 seconds a photo that appeared at the top has disappeared off the bottom.
My question is: would I get a performance improvement if I introduced a CDN in the mix?
Since the content is changing, it seems I shouldn't be doing it. However, when I think about it logically, it does seem I'll get a performance improvement from introducing a CDN for my photos. Here's how: imagine a photo is posted, appearing on the page at t=1 and remaining there till t=20. The first person to access the page (close to t=1) will cause the photo to be pulled to an edge server. Thereafter, anyone accessing the photo will receive it from the CDN; this lasts until t=20, after which the photo disappears. That is a real performance boost.
Can anyone comment on the flaws in my reasoning, and/or on what I'm failing to consider? It would also be good to know what other performance optimizations I can make for a website like mine. Thanks in advance.
You've got it right. As long as someone accesses the photo within the 20 seconds that it is in view, it will be pulled to an edge server, and upon subsequent requests other visitors will receive a cached response from the nearest edge server.
As long as you're using the CDN for delivering just your static assets, there should be no issues with your setup.
Additionally, you may want to look into a CDN which supports HTTP/2. This will provide you with improved performance. Check out cdncomparison.com for a comparison between popular CDN providers.
You need to consider all requests hitting your server, which includes the primary dynamically generated HTML document, but also all static assets like CSS files, Javascript files and, yes, image files (both static and user uploaded content). An HTML document will reference several other assets, each of which needs to be downloaded separately and thus incurs a server hit. Assuming for the sake of argument that each visitor has an empty local cache, a single page load may incur, say, ~50 resource hits for your server.
Probably the only request which actually needs to be handled by your server is the dynamically generated HTML document, if it's specific to the user (because they're logged in). All other 49 resource requests are identical for all visitors and can easily be shunted off to a CDN. Those will just hit your server once [per region], and then be cached by the CDN and rarely bother your server again. You can even have the CDN cache public HTML documents, e.g. for non-logged in users, you can let the CDN cache HTML documents for ~5 seconds, depending on how up-to-date you want your site to appear; so the CDN can handle an entire browsing session without hitting your server at all.
If you have roughly one new upload per second, there are likely an order of magnitude more passive visitors per second. If you can let a CDN handle ~99% of requests, that's a dramatic reduction in actual hits to your server. If you are clever about what you cache and for how long, and depending on your particular mix of anonymous and authenticated users, you can easily reduce server load by an order of magnitude or two, while speeding up page load times for your visitors accordingly.
For every single HTML document and every other asset, think carefully about whether it can be cached and for how long:
For HTML documents: is the user logged in? If not, and there is no other user-specific cookie tracking or similar going on, then the document is static and public for all intents and purposes and can be cached. Decide on a maximum age for the document and let the CDN cache it. Even caching it for just one second makes a giant difference when you get 1000 hits per second.
If the user is logged in, mark the response as privately cacheable (Cache-Control: private), but still let the visitor's browser cache it for a few seconds. These headers have to be set by your forum software while it is generating the document; see the sketch after this list.
For all other assets which aren't access-restricted: let the CDN cache them for a long time, and you can practically forget about ever having to serve those particular files again. These headers can be configured statically for entire directories in the web server.
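The question doesn't name the forum's server stack; assuming ASP.NET for the sake of illustration (as in the other threads here), a minimal sketch of setting those headers while generating the document could look like the following. The 5-second lifetime and the userIsLoggedIn flag are placeholders.

```csharp
// Hedged sketch: set cache headers while generating the HTML document (ASP.NET assumed).
using System;
using System.Web;

public static class CacheHeaders
{
    public static void Apply(HttpResponse response, bool userIsLoggedIn)
    {
        if (userIsLoggedIn)
        {
            // Personalised page: only the visitor's own browser may cache it, briefly.
            response.Cache.SetCacheability(HttpCacheability.Private);
            response.Cache.SetMaxAge(TimeSpan.FromSeconds(5));
        }
        else
        {
            // Anonymous page: public, so the CDN may cache it and re-serve it to others.
            response.Cache.SetCacheability(HttpCacheability.Public);
            response.Cache.SetMaxAge(TimeSpan.FromSeconds(5));
        }
    }
}
```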
I'm working on a site where we are using the slide function from jquery-ui.
The Google-hosted minified version of jquery-ui weighs 63KB - this is for the whole library. The custom download of just the slide function weighs 14KB.
Obviously, if a user already has the Google-hosted version cached it's a no-brainer, but if they haven't, it will take longer to load than if I just lumped the custom jQuery UI slide code into my main.js file.
I guess it comes down to how many other sites use jQuery UI (if this were plain jQuery, the above would be a no-brainer, since loads of sites use jQuery, but I'm unsure how widely jQuery UI is used)...
I can't work out what the best thing to do is in this scenario.
I'd say that if the custom selective build is that small, both absolutely and relatively, there are good reasons to choose that path.
Loading a JavaScript resource has several implications, in the following order of events:
Loading: the request/response round trip or, in the case of a cache hit, fetching from the cache. Keep in mind that, CDN or not, this communication only affects the first page view. If your site is built in the traditional "full page request" style (as opposed to SPAs and the like), this quickly becomes a non-issue.
Parsing: The JS engine needs to parse the entire resource.
Executing: The JS engine executes the entire resource. That means that any initialization / loading code is executed, even if that's initialization for features that aren't used in the hosting page.
Memory usage: the memory footprint depends on the entire resource. That includes static objects as well as functions (which are also objects).
With that in mind, having a smaller resource is advantageous in ways that go beyond loading alone. Moreover, a request for such a small resource is negligible in terms of communication; you wouldn't think twice about it if it were a small version of the company logo at the bottom of the screen that nobody even notices.
As a side note and a potential optimization: if your site serves any proprietary library, or a group of less common libraries, you can bundle all of these together, including the jQuery UI subset, so your users make only a single request, which again works in favour of the custom build.
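For example, on an ASP.NET MVC site (an assumption; the question doesn't name the server stack), the System.Web.Optimization bundling that comes up later in these threads can serve the custom jQuery UI subset and your own scripts as a single request. The bundle name and file paths below are illustrative:

```csharp
// Hedged sketch: bundle a custom jQuery UI subset together with site scripts
// (ASP.NET MVC + the Web Optimization package assumed; paths are illustrative).
using System.Web.Optimization;

public static class SiteBundles
{
    public static void Register(BundleCollection bundles)
    {
        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/Scripts/jquery-ui.custom.min.js",   // the 14 KB slide-only build
            "~/Scripts/our-proprietary-lib.js",
            "~/Scripts/main.js"));
    }
}
```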
Go with the Google hosted version
It is likely that the user will recently have visited another website that loads jQuery UI from Google's servers, so it may already be cached.
It takes load off your server and lets your other assets load faster.
Browsers only open a limited number of concurrent connections per domain. Loading jQuery UI from Google's servers means it is downloaded in parallel with the resources that live on your own domain.
The Yahoo developer network recommends using a CDN. Their full reasons are posted here.
https://developer.yahoo.com/performance/rules.html
This quote from their site really seals it in my mind.
"Deploying your content across multiple, geographically dispersed servers will make your pages load faster from the user's perspective."
I am not an expert, but here are my two cents anyway. With a CDN you can count on reduced latency; plus, as mentioned, the user has most likely already picked the file up from some other website that loads it from Google. And there's the thing I always care about: it saves you bandwidth.
I am facing a very strange situation and don't know how to solve it. Please help me figure out the issue.
I am working on a web site that has a "research" page used to measure the performance of tasks done on the site. It is a kind of report page that checks various conditions against the database tables, retrieves the information, and sends an email to the administrator. The page runs every hour, i.e. 24 times per day.
Now, the issue: the web site normally works correctly, but while the research page is running, the other pages of the site stop responding. Say, for example, I am on Page1 and the research page starts running at the same time. If I now click a link to Page2, Page2 will not be displayed until the research page finishes its work. Can anyone tell me what could cause this behavior?
Here is some more information about the issue:
The web site is built with Visual Studio 2008 (C#) and SQL Server 2008.
The SQL query behind the research page is quite complex; however, I have applied every optimization that is possible.
There are two connection strings (different users for the same database) used in the web site: one for the research page and a second for all the other pages on the site.
Please help me find the cause. Thanks in advance.
This may be due to mishandling of threads within your website. Have you tried using background threads, or building asynchronous handlers in your server-side web code? A rough sketch follows below.
Check out these links; they might help:
http://msdn.microsoft.com/en-us/library/ms741870.aspx
http://www.albahari.com/threading/part3.aspx
Also take care to release the resources that might be locking the tables once the thread's work is finished.
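Here is a minimal sketch of that idea, assuming an ASP.NET WebForms page and a connection string named "ResearchDb" (both assumptions; the stored procedure name and the mail step are placeholders). The report work is pushed onto a background thread so the page request returns immediately, and the using blocks release the connection, and any locks it holds, as soon as the query completes.

```csharp
// Hedged sketch: run the hourly research/report work on a background thread
// so it does not block the request, and release DB resources promptly.
// The connection string name, query and MailAdministrator() call are placeholders.
using System.Configuration;
using System.Data.SqlClient;
using System.Threading;

public partial class ResearchPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, System.EventArgs e)
    {
        // Queue the heavy work and let the page response return right away.
        ThreadPool.QueueUserWorkItem(_ => RunReport());
    }

    private static void RunReport()
    {
        string cs = ConfigurationManager.ConnectionStrings["ResearchDb"].ConnectionString;

        // 'using' guarantees the connection (and any locks held by the command)
        // is released even if the query throws.
        using (var connection = new SqlConnection(cs))
        using (var command = new SqlCommand("EXEC dbo.ResearchReport", connection))
        {
            command.CommandTimeout = 300;   // the report query is long-running
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                // ... build the report from the reader ...
            }
        }

        // MailAdministrator();  // placeholder for the e-mail step
    }
}
```

Note that a background thread inside ASP.NET can be killed when the application pool recycles, so for a job that runs every hour a scheduled task or Windows service is usually the more robust home for this work.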
I have built a test website using the open-source nopCommerce. Everything is working fine, but I need to know why my page load time is greater than 6 seconds. The homepage works fine, but the category pages take around 6-10 seconds when clicked. How can I inspect the HTTP requests and the calls to the database so I can track down which function is taking so long?
Thanks
Things I would try, in that order:
MvcMiniProfiler (a setup sketch follows after this answer).
Analyze my code for possible performance bottlenecks using a .NET profiler.
Finally, submit a bug to nopCommerce support if the previous approaches didn't turn up anything that would implicate my own code.
Along the way I might also check with my hosting provider to make sure they are not the cause of the slowness.
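For reference, wiring up MvcMiniProfiler looks roughly like this (a hedged sketch; check the readme of the package version you install for the exact API):

```csharp
// Hedged sketch of enabling MvcMiniProfiler in Global.asax.cs so each request
// (and each profiled step) reports where its time goes.
using System.Web;
using MvcMiniProfiler;

public class MvcApplication : HttpApplication
{
    protected void Application_BeginRequest()
    {
        // Only profile local/dev requests to avoid overhead in production.
        if (Request.IsLocal)
        {
            MiniProfiler.Start();
        }
    }

    protected void Application_EndRequest()
    {
        MiniProfiler.Stop();
    }
}

// Inside a slow action or service method, wrap suspect code in steps:
//   using (MiniProfiler.Current.Step("Load category products")) { ... }
// and add @MvcMiniProfiler.MiniProfiler.RenderIncludes() to the layout
// so the timing widget shows up on the page.
```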
As a quick and dirty check, you can add the time taken to generate the response (the time-taken field) as a column in the IIS logs - that will give you some idea as to whether the server is being slow to serve the pages or whether you need to do some front-end optimisation work.
On the front-end side, the first thing you need to do is merge all the CSS files for a theme into one to save on round trips - the browser can't render the page until it has the CSS.
All the .js files you have in the head will also block rendering; can you merge them and load them later in the page?
The performance of imagegen.ashx looks on the slow side - do you need to generate the banners on the fly or could they be pre-generated?
If the back-end side of generating the page is slow, there are some scripts around the web to show which queries are using the most CPU, making the most IO ops etc.
Below is a list of things you can improve:
1. Combine your JS.
There are a few tools you can use, for example JSMin; you can read this post: http://encosia.com/automatically-minify-and-combine-javascript-in-visual-studio/. However, JSMin doesn't seem to compress the combined JS.
Another option is jMerge: http://demo.lateralcode.com/jmerge/ It does the combining after the fact, in the sense that the site needs to be up and running before jMerge can combine the files, since it only takes an HTTP link.
The best one I've found so far is the bundling and minification feature of MVC 4. It's built into MVC 4, but you can also get it as a NuGet package for an MVC 3 app. (A sketch of a bundle configuration follows at the end of this list.)
A word of advice: bundling every one of your JS files is not necessarily a good idea, and it even backfires sometimes, since you end up with one big file that the browser has to download in one go instead of fetching several smaller ones in parallel (you might want to look into head.js to parallelise JS downloads). The trick is to strike a balance. I ended up taking jQuery from the Google CDN and bundling the rest of my JS into one file.
2. Put your JS at the bottom of the page, so the browser doesn't have to load it before it starts rendering. Be careful with this one, though: you will typically have jQuery code running in document.ready() at the top of the page, and I advise moving that to the bottom of the page as well, if possible.
If you move the JS references and script blocks in your layout page to the bottom, you will most likely run into problems with JS references and script blocks nested inside your individual views. No worries: look into using @section in your views (probably a topic for another thread) and rendering it in your layout page, so that the references and script blocks inside your views end up at the bottom of the page at run time.
3. Use a CDN.
Pretty straightforward.
4. Combine your CSS.
Combine the stylesheets into one with the same tool you use for combining JS, but reference the result in the page header rather than at the bottom (the sketch at the end of this list includes a style bundle).
5. Enable static content caching, via the client cache settings for static content in your web.config file.
It won't help the first-time load, but it will definitely make things a lot faster for returning users.
6. Enable URL compression.
Time to first load
This is one of the metrics used by webpagetest.org, but don't bang your head against this one too much, as it basically says how fast your web server can serve the content, so there is probably not much you can do about it from the software end.
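As referenced in items 1, 2 and 4 above, here is a minimal sketch of an MVC 4 style bundle configuration (bundle names and file paths are made up; nopCommerce's real theme files will differ):

```csharp
// Hedged sketch of the MVC 4 / Web Optimization bundles mentioned above.
// Bundle names and file paths are illustrative, not nopCommerce's real ones.
using System.Web.Optimization;

public static class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // One combined, minified JS bundle (keep jQuery on the Google CDN separately).
        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/Scripts/plugins.js",
            "~/Scripts/site.js"));

        // One combined, minified CSS bundle, referenced in the page header.
        bundles.Add(new StyleBundle("~/bundles/css").Include(
            "~/Content/theme.css",
            "~/Content/site.css"));

        BundleTable.EnableOptimizations = true; // minify even in debug builds
    }
}

// Call BundleConfig.RegisterBundles(BundleTable.Bundles) from Application_Start,
// then in the layout render @Styles.Render("~/bundles/css") in <head> and
// @Scripts.Render("~/bundles/site") just before </body>.
```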
Hope that helps!
nopCommerce is terribly slow, and the developers don't seem to take the performance issues seriously. I have seen a lot of performance-related forum threads left unanswered. So, best of luck.