IE performance for rendering huge html data - internet-explorer-8

I have a few Perl CGIs that query almost an entire table, with more than 5000 rows in the result, and send that data to the browser. The generated HTML is around 1 MB.
Earlier I was using tables (which seemed like the ideal approach). Unfortunately most of our users are on IE, and IE does not display any data until it receives the closing table tag. Can we do something about that?
To push output as soon as it is generated, I switched to an approach using printf and <pre>, which reduced the response size by 200 KB and appears faster in display. But again IE (and no other browser) eats up CPU and hangs for a couple of seconds... :-( ..
Can we do something about that too?
FYI, I am using IE8.

Perhaps it would be wise from a UX perspective to use some sort of pagination method. Having a single page with thousands upon thousands of rows sounds entirely unfriendly for the end user. Something like a simple means of pagination ("Skip to page =dropdown=") would certainly solve your problem, as well as decrease load times and increase usability.
There are also several solutions which are pre-built and would likely integrate rather easily. One which comes to mind almost immediately is Sencha's Paging Grid:
http://dev.sencha.com/deploy/dev/examples/grid/paging.html
http://www.sencha.com/products/js/
It's pretty nifty and you'll likely get some kudos for using a hip new technology. There are other options, too:
Ingrid: http://www.reconstrukt.com/ingrid/src/example3.html
YUI Data Grid: http://developer.yahoo.com/yui/datatable/#data
MyTableGrid: http://pabloaravena.info/mytablegrid/
Hope this helps!
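If you prefer a hand-rolled fallback over a grid library, the "Skip to page" dropdown only needs a slice of the row array per page. A minimal sketch, assuming the rows are already available as a JavaScript array; `pageCount` and `pageSlice` are hypothetical helper names, not part of any library above:

```javascript
// Hypothetical helpers for simple client-side pagination.
// rows: array of row data; pageSize: rows per page; page: 1-based page number.
function pageCount(rows, pageSize) {
  return Math.ceil(rows.length / pageSize);
}

function pageSlice(rows, pageSize, page) {
  var start = (page - 1) * pageSize;
  return rows.slice(start, start + pageSize);
}

// Example: 5000 rows at 100 per page gives 50 pages;
// page 2 holds rows 100-199, and only that slice is rendered.
var rows = [];
for (var i = 0; i < 5000; i++) rows.push("row " + i);
var page2 = pageSlice(rows, 100, 2);
```

The dropdown's change handler would then re-render only the current slice, so the browser never has to lay out all 5000 rows at once.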

Do more people really use IE?
http://www.w3schools.com/browsers/browsers_stats.asp
Anyway, why do you still use tables?
We all know they are easy to get confused by, hard to maintain, and in your case slow...
Output the data as divs instead: the browser will show the data as it arrives and does not need to wait for a closing table tag. div and span also have the fewest default properties.

Related

What is the biggest performance issue in Emberjs?

For the last 6 months I have been developing Ember.js components.
I hit my first performance problems when I tried to develop a table component. Each cell in the table was an Ember.View, and each cell had a binding to an object property. With 6 columns and about 100 items listed, the browser froze for a while. So I discovered that it is better to write a function that returns a string instead of using Handlebars, and to handle bindings manually with observers.
So, is there any good practice for using a minimum of bindings? Or for writing bindings without losing a lot of performance? For example, not using bindings for a large amount of data?
How many Ember.View objects can be appended to a page?
Thanks for any responses.
We've got an Ember application that displays hundreds of complex items on a single page. Each item uses out-of-the-box Handlebars bindings to output about a dozen properties. It renders very quickly (< 100 ms), and very little of that time is spent on bindings.
If you find the browser is hanging at 6 columns and 100 items, something else is surely wrong.
So, is there any good practice for using a minimum of bindings? Or for writing bindings without losing a lot of performance?
Try setting Ember.LOG_BINDINGS = true to see what is going on. It could be that the properties you are binding to have circular dependencies or are costly to resolve. See this post for great tips on debugging:
http://www.akshay.cc/blog/2013-02-22-debugging-ember-js-and-ember-data.html
How many Ember.View objects can be appended to a page?
I don't believe there is a concrete limit; it will certainly depend on the browser. I wouldn't consider optimizing unless there were more than a few thousand views.
Check out this example to see how ember can be optimized to handle very large tables:
http://addepar.github.com/ember-table/
Erik Bryn recently gave a talk about Ember.ListView which would drastically cut down the number of views on a page by reusing list cells which have gone out of the viewport.
Couple this technique with the group helper to reduce the number of metamorph script tags if your models update multiple properties at the same time. Additionally, consider using {{unbound}} if you don't need any of the properties to live-update.
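For comparison, the string-building workaround the asker describes (bypassing per-cell views and bindings entirely) can be sketched roughly as follows; `renderTable` is a hypothetical name, not an Ember API:

```javascript
// Build the table markup in one pass instead of one Ember.View per cell.
// items: array of objects; columns: array of property names to display.
function renderTable(items, columns) {
  var html = ["<table>"];
  for (var i = 0; i < items.length; i++) {
    html.push("<tr>");
    for (var j = 0; j < columns.length; j++) {
      html.push("<td>" + items[i][columns[j]] + "</td>");
    }
    html.push("</tr>");
  }
  html.push("</table>");
  return html.join("");
}

// 100 items x 6 columns becomes a single innerHTML assignment, with
// change handling done manually via observers instead of bindings.
```

The trade-off is that you lose automatic updates, which is exactly what the binding-debugging advice above helps you avoid giving up.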

How to extract or load a large array into listview?

I have an array of 50k items. I want to load all the items into my ListView control as fast as possible. Adding them one at a time in a loop is too slow.
The underlying ListView control has a virtual mode, which means your app only passes a count to the control, and it then calls back periodically to get information on the entries that are visible. Unfortunately, this functionality is not exposed by the VB6 common controls, but you can still use the underlying control.
See this vbVision example.
There is no way to load in bulk as far as I know, but there are other tricks to make it a bit faster. One is to prevent the control from updating (repainting) during the load. This can be done as simply as hiding it while loading. Another technique is to load a chunk of records up front (say 2K) and then use a timer to load the rest in chunks in the background.
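The chunk-plus-timer idea is language-agnostic; here is a rough sketch of the same pattern in JavaScript (the function names are assumptions for illustration, not the VB6 API):

```javascript
// Split items into fixed-size chunks so the UI can load them incrementally.
function toChunks(items, size) {
  var chunks = [];
  for (var i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Load the first chunk up front, then feed the rest in on a timer so the
// control can repaint between batches instead of freezing for the whole load.
function loadInBatches(items, size, addBatch, done) {
  var chunks = toChunks(items, size);
  var next = 0;
  function step() {
    if (next >= chunks.length) { if (done) done(); return; }
    addBatch(chunks[next++]);
    setTimeout(step, 0); // yield to the UI between batches
  }
  step();
}
```

In VB6 the equivalent would be a Timer control firing the next batch; the structure is the same.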
But honestly, I doubt the usefulness of a grid with 50K items displayed. That is too much data to present to a user in one pass. Have you considered refactoring your UI to limit the amount of data a user has to sift through at one time?

Need Recommendation on impression tracking

I'm doing research for my work, which needs to track impressions of a little web app embedded in (authorized) third-party websites. I need to analyze the impressions close to real time.
I know there are at least two ways:
1) Use an image, and parse the server log for reporting.
2) Have JS send an AJAX request, and save the request in a DB (MySQL, Mongo, or another NoSQL store).
So, which way is faster and can handle tons of traffic?
I suspect that the server log is slower because it has to append to a file, but I'm not sure whether it really is.
So, what are the pros and cons of each approach? Thanks. :)
P.S. I can't use Google Analytics because there is a limit on data export... and also other limitations. :-)
Both options are valid. The image and server logs are simple and work as long as the visitor loads images. This is faster in most cases, since there is no extra processing.
If using JavaScript, I would do what the web-analytics companies do and create an image call with JS, and at the other end have either an image file with server logs, or a script that reads the data into a DB and returns a 1x1 transparent GIF.
If all you need is impressions, I would go with the simpler solution; there is less to go wrong.
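A minimal sketch of the client side of the image-call approach; `beaconUrl` and the `/pixel.gif` endpoint are assumptions for illustration:

```javascript
// Build the tracking-pixel URL with the impression data as query parameters.
function beaconUrl(base, params) {
  var parts = [];
  for (var key in params) {
    if (params.hasOwnProperty(key)) {
      parts.push(encodeURIComponent(key) + "=" + encodeURIComponent(params[key]));
    }
  }
  return base + "?" + parts.join("&");
}

var url = beaconUrl("/pixel.gif", { site: "example.com", page: "/home" });
// In the browser, firing the impression is one line; the server answers
// with a 1x1 transparent GIF (or just lets the web-server log record it):
//   new Image().src = url;
```

Parsing the access log for `/pixel.gif` hits then gives you the impression counts without any DB write on the hot path.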

can browsers [especially IE6] handle big Json object

I am returning a big JSON object (5000 records, 10 elements per record)
from the controller (ASP.NET MVC) using jQuery and an AJAX POST. Until now I have been dealing with just 20 records for testing, and it works fine. But in production there are 5000 records, so I am wondering whether the browser can handle that much data, especially IE6. I have to display all 5000 records on a single page. I am confused, and I don't have that much data to test with yet. I need your expert advice on whether or not to use a jQuery JSON AJAX POST to return a huge amount of data. Thank you.
You should really page this data. You might want to have a look at this:
PagedList
A simple way to find out, and probably good practice anyway, is to make sure you have that data to test with; it will make it much easier to choose between different options. Paging is one option, but there are others. You may be surprised how quickly modern browsers can load that amount of data.
It should be easy enough with a VB.NET page to generate all the data you need using random data. This is really essential if you want to be able to investigate what to do.
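Fabricating the test payload does not have to wait for a server-side page, either. A quick sketch (the field names are made up, not the asker's schema) that builds 5000 records of 10 elements each to exercise the browser:

```javascript
// Fabricate n records with k fields each, mimicking the production shape.
function makeRecords(n, k) {
  var records = [];
  for (var i = 0; i < n; i++) {
    var rec = { id: i };
    for (var j = 1; j < k; j++) {
      rec["field" + j] = Math.floor(Math.random() * 100000);
    }
    records.push(rec);
  }
  return records;
}

var data = makeRecords(5000, 10);      // 5000 records, 10 elements per record
var payload = JSON.stringify(data);    // roughly the JSON the controller would return
```

Feeding `payload` through the same rendering path as the real AJAX response shows quickly whether IE6 chokes on parse, on render, or on both.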

Is it possible to modify a web page as it's loading?

I have a Perl script that generates a web page. It takes a non-trivial amount of time to run. I would like to be able to render a complete HTML table to the user so they know what results to expect, but fill in the details slowly as the Perl script generates them.
What approach should I be taking here?
My initial assumption was that I would be able to assign an ID to my various table data elements and then adjust their innerHTML properties as and when I got the results in. But it doesn't seem like I can perform such manipulations whilst the page is still loading.
There's no consistently reliable way to modify a web page as it's loading.
You can create the effect by initially loading a compact loading page, and then loading the rest of the content via AJAX calls back to the server to get the individual components.
You can then load those components as your AJAX calls are completed.
EDIT
As the comments have pointed out... while this would achieve the results you want, it's a terrible idea.
Search engine indexing is the primary reason. You're also relying on JavaScript to do a lot of heavy lifting... and it might not always be enabled.
One solution would be to progressively load the data via AJAX. You would need to do something like this:
Load the webpage
Using javascript, query the webserver for table values
Populate table with received values
Loop until table filled
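The steps above can be sketched as follows; `fetchPage` stands in for whatever AJAX call returns one batch of table values (the names are assumptions, not a specific API):

```javascript
// Repeatedly ask the server for the next batch of rows until it reports
// there are none left, appending each batch to the table as it arrives.
function fillTable(fetchPage, appendRows, done) {
  function loadFrom(offset) {
    fetchPage(offset, function (rows) {
      if (rows.length === 0) { done(); return; } // table filled
      appendRows(rows);
      loadFrom(offset + rows.length);            // loop until table filled
    });
  }
  loadFrom(0);
}
```

In a real page, `fetchPage` would wrap an XMLHttpRequest (or jQuery `$.ajax`) hitting the server-side script, and `appendRows` would write the rows into the table body.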
Obviously this solution presents problems if the data is meant to be crawled, since crawlers don't take into account dynamic data loaded via JavaScript.
The other issue to consider is usability. Web users are not used to this type of progressive loading, so informing them that the data is still being loaded is very important. Some type of accurate progress bar would also provide good usability.
As suggested, AJAX is probably the way to go.
Create a basic HTML page with an empty div to hold your data, then using repeating AJAX calls, fill in the div.
This page describes how to do this:
link text
