We are using ABCPdf 11.3 for one of our clients. We create HTML dynamically and then store it somewhere on the server.
We call AddImageUrl(filePath, False, 300, False) and it takes more than 2 minutes even for a single page (filePath is something like "file://E:/pankaj/{generatedpdf}.html").
I need to improve the application's performance. Since it's a single page, it shouldn't take more than 10 seconds. Even if I pass www.google.com as the parameter, it converts the Google home page to PDF in 10 seconds.
Appreciate your help friends.
This is resolved.
The page passed to ABCpdf contained a lot of external links and some JavaScript, all of which it was fetching and executing during conversion.
I removed the scripts, the external CSS/JS libraries, and all other external links, and it worked: the page now renders within 2 seconds.
Thank you!
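For anyone who can't strip the page itself: ABCPdf's HtmlOptions can skip script execution and cap how long it waits on external resources. A minimal sketch (property names as I recall them from the ABCPdf 11 docs - verify against your version; the output path is illustrative):

    Doc doc = new Doc();                          // using WebSupergoo.ABCpdf11
    doc.HtmlOptions.UseScript = false;            // don't execute JavaScript on the page
    doc.HtmlOptions.Timeout = 15000;              // stop waiting on slow external resources after 15 s
    doc.AddImageUrl(filePath, false, 300, false);
    doc.Save(@"E:\pankaj\output.pdf");            // illustrative output path
    doc.Clear();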
I have built a test website using the open-source nopCommerce. Everything is working fine, but I need to know why my website's loading time is greater than 6 seconds. The homepage works fine, but clicking a category takes 6-10 seconds. How can I inspect the HTTP requests and the calls to the DB so I can track down which function is taking so long?
Thanks
Things I would try, in that order:
MvcMiniProfiler (see the sketch after this list).
Analyze my code for possible performance bottlenecks using a .NET profiler.
Finally, submit bugs to nopCommerce support if the previous approaches didn't yield anything fruitful that would implicate my own code.
In between I might also check with my hosting provider whether it is the cause of the slowness.
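Wiring up MvcMiniProfiler is only a few lines. A sketch, assuming the NuGet package is installed (the step name is made up - wrap whatever code you suspect):

    // Global.asax.cs - start/stop the profiler around each request
    protected void Application_BeginRequest()
    {
        if (Request.IsLocal) MiniProfiler.Start();   // profile only local requests
    }
    protected void Application_EndRequest()
    {
        MiniProfiler.Stop();
    }

    // In a controller action, wrap the code you suspect:
    using (MiniProfiler.Current.Step("Load category page"))
    {
        // ... category/product queries you want timed ...
    }

Add @MvcMiniProfiler.MiniProfiler.RenderIncludes() just before the closing body tag of your layout and the timings show up as an overlay on the page.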
As a quick and dirty check, you can add the time taken to generate the response (the time-taken field) as a column in the IIS logs - that will give you some idea as to whether the server is slow to serve the pages or whether you need to do some front-end optimisation work.
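Once time-taken is being logged, Microsoft's Log Parser can rank the slow URLs straight from the log files. A sketch from memory - the log file mask is a placeholder and the exact syntax is worth checking against LogParser -h:

    LogParser.exe -i:IISW3C "SELECT TOP 20 cs-uri-stem, COUNT(*) AS Hits, AVG(time-taken) AS AvgMs FROM u_ex*.log GROUP BY cs-uri-stem ORDER BY AvgMs DESC"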
On the front-end side, the first thing you need to do is merge all the CSS files for a theme into one, to save on round trips - the browser can't render the page until it has the CSS.
All the .js files you have in the head will also block the page - can you merge them and load them later?
The performance of imagegen.ashx looks on the slow side - do you need to generate the banners on the fly or could they be pre-generated?
If the back-end side of generating the page is slow, there are scripts around the web to show which queries are using the most CPU, making the most IO ops, etc. - something along the lines of the sketch below.
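For SQL Server, the usual starting point is the query-stats DMV. A sketch - order by total_logical_reads instead if you are chasing IO rather than CPU:

    SELECT TOP 10
        qs.execution_count,
        qs.total_worker_time / qs.execution_count AS avg_cpu,  -- microseconds
        st.text AS statement_text
    FROM sys.dm_exec_query_stats qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
    ORDER BY avg_cpu DESC;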
Below is a list of things you can improve:
1. Combine your JS.
There are a few tools you can use - for example JSMin; you can read this post: http://encosia.com/automatically-minify-and-combine-javascript-in-visual-studio/. However, JSMin doesn't seem to compress the combined JS.
Another option is jMerge (http://demo.lateralcode.com/jmerge/). It kind of does it after the fact, in the sense that you need to have the site up and running before jMerge can combine the files, since it only takes an HTTP link.
The best one I've seen so far is the bundling and minification feature of MVC4. It's part of MVC4; however, you can get a NuGet package for your MVC3 app.
A word of advice: bundling every one of your JS files is not necessarily a good idea - it even backfires sometimes, since you end up with one big file the browser has to download sequentially instead of several smaller ones in parallel (you might want to look into head.js to parallelise JS downloads). So the trick is to keep a balance. I ended up loading jQuery from the Google CDN and bundling the rest of my JS into one file (see the sketch below).
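For what it's worth, the MVC4-style bundle registration looks roughly like this (the script and CSS file names are placeholders for your own):

    // App_Start/BundleConfig.cs, using System.Web.Optimization
    public static void RegisterBundles(BundleCollection bundles)
    {
        // jQuery stays on the Google CDN; everything else goes into one bundle
        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/Scripts/plugins.js",
            "~/Scripts/site.js"));
        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/site.css"));
    }

Then render with @Scripts.Render("~/bundles/site") and @Styles.Render("~/Content/css") in the layout; in release mode the bundle is served minified with a cache-busting version hash.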
2. Put JS at the bottom of the page, so the browser doesn't have to load it before it starts to render the page. Be careful with this one, though: you will normally have jQuery functions doing work in document.ready() handlers at the top of the page, and I'd advise moving those to the bottom as well, if possible.
If you move the JS references and script blocks in your layout page to the bottom, you will most likely run into problems with JS references and script blocks nested in your individual views. No worries: look into using @section in your views (probably a topic for another thread) and rendering it in your layout page, so that the references and script blocks inside your views end up at the bottom of the page at run time - roughly as sketched below.
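A sketch of that layout/view split (the section name "scripts" is just a convention):

    @* _Layout.cshtml: scripts render at the very bottom *@
    <body>
        @RenderBody()
        <script src="@Url.Content("~/Scripts/jquery.min.js")"></script>
        @RenderSection("scripts", required: false)
    </body>

    @* An individual view contributes its own script block: *@
    @section scripts {
        <script>
            $(function () { /* document.ready work, now safely below jQuery */ });
        </script>
    }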
3. Use a CDN.
Pretty straightforward.
4. Combine your CSS.
Combine them into one file with the same tool you use for combining JS, but reference the result in the page header instead of at the bottom.
5. Enable static content caching. It won't help with the first-time load, but it will definitely make things a lot faster for returning users. Something like the web.config snippet below.
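A sketch for IIS7+; the 30-day max-age is just an example value:

    <system.webServer>
      <staticContent>
        <clientCache cacheControlMode="UseMaxAge"
                     cacheControlMaxAge="30.00:00:00" />
      </staticContent>
    </system.webServer>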
6. Enable URL compression.
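On IIS 7/7.5 that's a web.config one-liner (a sketch; dynamic compression requires the module to be installed):

    <system.webServer>
      <urlCompression doStaticCompression="true" doDynamicCompression="true" />
    </system.webServer>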
Time to first load
This is one of the metrics used by webpagetest.org. But don't bang your head against this one too much: it basically measures how fast your web server can serve the content, so there's probably not much you can do about it from the software end.
Hope that helps!
NopCommerce is deadly slow, and the developers don't seem to take the performance issues seriously. I have seen a lot of performance-related forum threads left unanswered. So, best of luck.
I am trying to generate a log of page load times (time from first byte of HTML to the onload event) from my actual browsing habits. An ideal solution would produce, after performing a google search for example, a log something like this:
http://google.com/ 523
https://www.google.com/search?q=asdf 1003
Where the pages took 523 ms and 1003 ms to load.
A Firefox or Chrome extension that works on Linux or Mac would be ideal, as I'm trying to track this in the context of normal everyday browsing.
If you install the NetExport Firebug extension, it will allow you to export all the collected and computed data from the Firebug Net panel as an HTTP Archive (HAR) file (a JSON-based format).
You can then view this file, which includes load times, using HAR Viewer or other tools.
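Since HAR is plain JSON, producing exactly the per-page log you described from a NetExport file is a few lines of Node (the file name is a placeholder):

    // har-times.js - usage: node har-times.js netexport.har
    var fs = require('fs');
    var har = JSON.parse(fs.readFileSync(process.argv[2], 'utf8'));
    har.log.pages.forEach(function (page) {
        // page.title typically holds the page URL; onLoad is ms from navigation start
        console.log(page.title, Math.round(page.pageTimings.onLoad));
    });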
Gavin,
Try using the Firebug plugin in Firefox: http://getfirebug.com/
It shows you when the file started and stopped loading relative to all the other files.
Worth checking out.
Regards,
Ninad
You need Fiddler. http://www.fiddlertool.com
Another great tool that does what you requested in the comments is dynaTrace AJAX Edition: http://ajax.dynatrace.com/ajax/en/. It hooks into the browser and keeps metrics on actual render and JS times. See http://ejohn.org/blog/deep-tracing-of-internet-explorer/ for a deep dive.
We need to display ~40 images on a page and not allow users to hotlink those images. We are currently using <img src="..."> pointing to a handler that checks cgi.http_referer and serves the image using cfcontent. However, some images fail to load (~6 out of 40), and if I refresh the page, a different set of images fails to load.
The problem seems to appear when I display more than 10 images. I suppose this is because I'm using cfcontent? If so, what should I use instead?
To find out exactly why those images are failing, you'll need to do a little more work. You should use something like Firebug in FireFox, or the console in Safari or Chrome, to find out what's happening with those requests that are failing. You can also use something like Fiddler on Windows for IE or Charles on the Mac, Windows, or Linux to see the full HTTP requests that are happening in the background, along with the full return values from your ColdFusion app server. Until you know exactly why they're failing, we can't come up with any sort of solution.
The other thing to remember is that if you do this via ColdFusion, then every page load hits your CF server with 40 more requests. So one page results in 41 hits to your CF server for processing. Make sure that code is as tight as it can possibly be.
If I were going to go this route, I'd do it at the server level (IIS or Apache) using some sort of server-level filter to prevent the hotlinking - see the Apache sketch below. But just remember that there will always be a way around it.
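The classic Apache mod_rewrite referer check looks like this (replace yoursite.com with your domain; an empty Referer is let through so direct visits and proxies still work):

    RewriteEngine On
    # allow requests with no Referer at all
    RewriteCond %{HTTP_REFERER} !^$
    # block image requests whose Referer is some other site
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?yoursite\.com/ [NC]
    RewriteRule \.(gif|jpe?g|png)$ - [F,NC]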
I use Kohana 3's Profiler class and its profiler/stats template to time my website. On a very clean page (no AJAX, no jQuery, etc. - it only loads a template and shows a text message, with no database access), it shows the request time as 0.070682 s (the "Requests" item in the profiler/stats template). Then I used two microtime() calls to time the duration from the first line of index.php to the last, and that also shows a very fast result (0.12622809410095 s). Very nice.
But if I time the request from the browser's point of view, it's totally different. Using Firefox with the Tamper Data add-on, the duration of the request is 3.345 s! And I noticed that from the moment I click the link to the website (Firefox starts the animated loading icon) to when the browser finishes (the icon animation stops), it really does take 3-4 seconds!
On another website of mine, built with WikkaWiki, the time measured by Tamper Data is only 2190-2432 ms, including several accesses to the MySQL database.
I tried a clean installation of Kohana, and the default plain hello-world page also takes 3025 ms to load.
All the websites mentioned here were tested on the same localhost PC with the same settings; they are just hosted in different directories on the same machine. Only the Database module is enabled in bootstrap.php for the Kohana website.
I'm wondering why the Kohana website's overall response is so slow while the PHP execution time is just 0.126 s. Is there anything I should look into?
== Edit for additional information ==
The result for a standard phpinfo() page is 1100-1200 ms (Tamper Data).
The Profiler shows you the execution time from Kohana initialization to the Profiler render call, so it's not the full Kohana time. Some actions (Kohana::shutdown_handler(), Session::_destroy(), etc.) may take a long time.
Since your post confirms Kohana is finishing in a tenth of a second or less, it's probably something else:
Have you tested something other than Kohana? It sounds like the server is at fault, but you can't be sure unless you compare the response times with something else. Try a plain HTML page and a pure PHP page (see the sketch after this list).
The Firefox measurement could be taking external media into account, so if you have a slow connection and you load Google Analytics, that could be another problem.
Maybe it is related to this issue: Firefox and Chrome slow on localhost; known fix doesn't work on Windows 7.
Although that issue occurs on Windows 7, it may still help...
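Something as dumb as this is enough for the comparison (no framework at all). If this page also takes ~3 s in Tamper Data while reporting sub-millisecond generation, the problem is the server or browser setup, not Kohana:

    <?php
    // plain.php - framework-free page for timing comparison
    $start = microtime(true);
    echo '<html><body>Hello, world.</body></html>';
    echo "\n<!-- generated in " . round((microtime(true) - $start) * 1000, 3) . " ms -->";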
I am a complete novice at Flash (I have never created anything in Flash). I am quite familiar with web applications (J2EE-based) and have reasonable expertise in JavaScript.
Here is my requirement.
I want the user to select an image via an HTML form. Normally, on POST, this image would be sent to the server and perhaps stored there to be served later. I do not want that: I want to store this image locally on the client and then serve it back to the user over HTTP.
So, the flow is:
1. Go to the "select image" URL: mywebsite.com/selectImage
2. Browse for and select the image.
3. Control then transfers to some code running locally on the client (JavaScript or Flash), which stores the image somewhere on the client machine.
4. Go to the "show image" URL: mywebsite.com/showImage
5. This eventually results in client code running in the browser that retrieves the image and renders it (without any server round trips).
I considered the following options:
Use HTML5 local storage. Since I am a complete novice at Flash, I looked into this first. I found that it is fairly straightforward to store and retrieve images in JavaScript (only strings are allowed, but I am hoping that storing base64-encoded strings will work, at least for small images). However, how do I serve the image via an HTTP URL that points to my server without a server round trip? I saw the interesting article at http://hacks.mozilla.org/category/fileapi/, but that works only in Firefox, and I need to support all the latest browsers (at least the ones supporting HTML5 local storage).
Use Flash SharedObjects. OK, this would have been good; the only thing is I am not sure where to start. Snippets of ActionScript to do this are scattered everywhere, but I do not know how to use those scripts in an actual HTML page :) I do not need to create any movies or anything; I just need to store an image and serve it locally. If I go this route, I would also use it to store other strings locally. If you suggest this, please give me the exact steps (or pointers to other websites) on how to do it. Ideally, I would like to avoid paying for any Flash development environment software :)
Thank you!
You could use a data URI to display the file. Essentially, you use the image data (plus a prefix) as the src attribute of an image element. If you have already figured out how to read the file into memory as a base64-encoded string, using a data URI is probably the easiest way to display the image.
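A sketch of the display side, assuming you saved the base64 string under a key of your choosing ('imageData' here is made up) and that the image is a PNG:

    <img id="preview" alt="locally stored image">
    <script>
        // retrieve the string saved earlier with localStorage.setItem('imageData', b64)
        var b64 = localStorage.getItem('imageData');
        document.getElementById('preview').src = 'data:image/png;base64,' + b64;
    </script>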
OK, I was able to implement the following solution (in case anyone has comments or would like to know the answer):
1. Wrote server-side code that takes an image and returns its base64-encoded version.
2. Used the hidden-iframe trick to get the base64-encoded data into an iframe, then stored it in the image by dynamically changing the image source to the data URI (a sketch of the wiring is at the end of this answer).
For the "hidden iframe trick" - in case you are interested, there is a good article at (see www.openjs.com/articles/ajax/ajax_file_upload/response_data.php)
The only limitation is that IE does not work with images whose base64-encoded string exceeds 32 KB - see http://msdn.microsoft.com/en-us/ie/dd578309.aspx. Note that only IE8 works; IE7 does not support data URIs, I believe.
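In case it helps anyone, here is roughly what that wiring looks like (the URL and element names are illustrative; the server is assumed to return the raw base64 string as the response body):

    <form action="/encodeImage" method="post" enctype="multipart/form-data" target="uploader">
        <input type="file" name="image">
        <input type="submit" value="Upload">
    </form>
    <iframe name="uploader" style="display:none" onload="showImage(this)"></iframe>
    <img id="preview" alt="">
    <script>
        function showImage(frame) {
            // same-origin iframe, so we can read the base64 response out of its body
            var b64 = frame.contentWindow.document.body.innerHTML;
            if (!b64) return;  // ignore the iframe's initial empty load
            document.getElementById('preview').src = 'data:image/png;base64,' + b64;
        }
    </script>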