I am deciding between using a grid plugin for jQuery and manually adding rows to an HTML table (using jQuery). All I need to do is display the data in a table, make one field editable, and then save the data to the database. I have a tight deadline and don't have the time to learn a new plugin (such as jqGrid, which is quite complex).
I would normally display around 200 rows to the user. What I am wondering is, in terms of speed, would it be really poor performance to add a row to the HTML table 200 times? Would a plugin really speed up performance (making it almost necessary for me to use one)? I know JavaScript can be slow when not optimized, which is why I ask.
Any thoughts/advice?
There is nothing that a plugin does that would necessarily be faster than what you can write yourself.
That being said, the fastest approach for your case would be to build a string of the HTML table rows (appending each row to the string) and then set the table's innerHTML to that string. Don't build the DOM nodes directly and append them one at a time; that gives the worst performance.
Source: http://www.quirksmode.org/dom/innerhtml.html
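For what it's worth, a minimal sketch of that string-building approach might look like the following, assuming the ~200 records are already available in a JavaScript array called data, that each record has hypothetical name and value fields, and that the table has a tbody with id "resultsBody":

    // build all rows as one string, then update the DOM in a single operation
    var rows = [];
    for (var i = 0; i < data.length; i++) {
        rows.push('<tr><td>' + data[i].name + '</td><td>' + data[i].value + '</td></tr>');
    }
    $('#resultsBody').html(rows.join(''));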
Look at the examples from the answer. In the example, 1000 rows are added to the grid and everything works quickly.
It would be much better if you posted the prototype of the grid which you currently use. Moreover, jqGrid supports many scenarios for local and remote data and many ways of editing local and remote data. Have you already chosen one approach, or at least a direction in which you want to go? If you plan to access a remote backend server with a database, more information is required. At a minimum, one needs to know which technology you use on the server (ASP.NET MVC, WCF, ASMX web services, PHP, Java Servlets and so on).
Related
I need some advice. I am using jqGrid as a grid tool, and I am running into complications displaying fairly big data: around 450 records with large content and 10 columns.
Is there any better grid you would suggest working with that gives better performance?
I've always used DataTables and I always recommend it because it's easy to use and configure. It is a fast way to display data (I've used it with tables of more than 100,000 rows with no problems, after configuring server-side processing correctly).
The only thing you must know is that (as far as I know) it doesn't support colspan in the body of the table, so if your layout requires that, using DataTables becomes impossible. (I usually found other ways to show things rather than using colspan, but for some this is a blocker.)
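As a rough illustration only (not a drop-in configuration), a minimal server-side setup could look like this; "/table-data" is a hypothetical endpoint that implements the DataTables server-side protocol, and the option names are from the classic 1.9-era API:

    $('#example').dataTable({
        "bProcessing": true,      // show the "Processing..." indicator during requests
        "bServerSide": true,      // let the server do the paging, sorting and filtering
        "sAjaxSource": "/table-data"
    });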
This question is likely subjective, but a lot of "grid" JavaScript plugins have come out to help paginate and sort tables. They usually work in one of two ways. The first and simplest is that the plugin takes an existing HTML <table> and converts it into sortable and searchable information. The second is that it passes info to the server and has the server select the info from the database to be displayed.
My question is this: At what point (size-wise) is it more efficient to use server-side processing rather than displaying all the data and having the "grid plugin" convert it to a sortable/searchable table client-side?
Using DataTables as an example, I have to execute at least 3 queries: one for the total rows in the table, one for the total filtered results for pagination, and one for the filtered results to be displayed for the specific selected page. Then every time I sort, I am querying again. Every time I move to another page or search in the table, more queries.
If I were to pull the data once when the client visits the page, I would execute a single query and then format and push the results to the client all at once. This increases the page size, and possibly delays loading of the page once it gets too big. The upside is that there will only be one query, and all the sorting, searching, and pagination is handled by the plugin, so there is no waiting for a response and no further queries.
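For example, the "pull once" version I have in mind would look roughly like the sketch below (the /all-rows endpoint and the columns are hypothetical); everything after the initial load is then handled entirely client-side:

    $.getJSON('/all-rows', function (rows) {
        // rows is assumed to be an array of arrays, one inner array per table row
        $('#example').dataTable({
            "aaData": rows,
            "aoColumns": [
                { "sTitle": "Name" },
                { "sTitle": "Created" }
            ]
        });
    });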
If I had just a few rows, I imagine pushing the formatted table data to the client at page load would be the fastest. But with thousands of rows, switching to server-side processing would be the most efficient way.
Where is the tipping point? Is there a tipping point, or is server-side or client-side the way to go 100% of the time?
The answer to your question can only be subjective, so I will explain how I personally understand the problem and give my recommendation.
In my opinion, data with 2-3 rows and 3-4 columns can be displayed in an HTML table without using any plugin. The more data you display to the user, the more important it becomes that the user can grasp the information shown. So I think the information has to be well formatted and marked with colors and icons, for example. That will help the user grasp information from perhaps 10 rows of data, but not much more. If you just display a table with 100 rows or more, you overtax the user. The user will have to analyse the data to get any helpful information out of the table, and scrolling through the data doesn't make this any easier.
So I think one should give the user a comfortable, or at least convenient, interface to sort and filter the data in the table. The exact interface is mostly a matter of taste. For example, the grid can have an additional filter toolbar.
For filtering, and even for sorting, of the data it's important not to work with pure strings, but to be able to distinguish data types such as integers (10 should come after 9 and not between 1 and 2), numbers (correctly interpreting '.' and ',' inside numbers), dates (3/20/2012 should be greater than 4/15/2010) and so on. If you just convert an HTML table to some grid, you will have problems with correct filtering or sorting. Even if you use pure local JavaScript data to display in the grid, it is important to have a data source which carries some kind of type information and then to create the grid based on that data. In that case you can supply a date as a JavaScript Date or as the ISO 8601 string "2012-03-20", and the grid can display the data according to the specified formatter as 3/20/2012 or 20-Mar-2012.
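As a rough sketch only (the column names and sample data below are purely illustrative), a jqGrid column model that carries such type information, together with the filter toolbar mentioned above, could look like this:

    $('#grid').jqGrid({
        datatype: 'local',
        data: myData, // e.g. [{ id: 1, qty: 10, price: 1.5, created: '2012-03-20' }, ...]
        colNames: ['Id', 'Quantity', 'Price', 'Created'],
        colModel: [
            { name: 'id', key: true, sorttype: 'int' },
            { name: 'qty', sorttype: 'int' },
            { name: 'price', sorttype: 'number' },
            { name: 'created', sorttype: 'date', formatter: 'date',
              formatoptions: { srcformat: 'Y-m-d', newformat: 'm/d/Y' } }
        ]
    });
    // the additional filter toolbar, which filters the local data as the user types
    $('#grid').jqGrid('filterToolbar', { stringResult: true, searchOnEnter: false });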
Whether you implement filtering, sorting and paging on the server side or on the client side is not really important to the user who opens the page; what matters is only that everything works quickly enough. The exact choice of grid plugin, the filtering (with a filter toolbar or external controls) and the styling of the grid depend on your taste and the project requirements.
I'm setting up a jqGrid (in a Google Chrome Extension) which will handle local JSON data.
My concern is performance, due to my unique use case. I have thousands of records being dynamically generated on the client side over a few minutes, and I can't wait for all the data to be generated, so currently I add this data to the grid row by row using 'addRowData'.
But the problem is that when I'm adding data to the grid, I have to check whether that data already exists, and if it does, I need to update the existing record. I'm just having trouble understanding the best way to accomplish this. Is the only way to search the grid by calling 'getCol' and then searching the returned array? My concern with calling getCol is that I presume it searches the DOM? But I could be wrong; I have scroll: 1 set and I'm starting to think this might mean it pulls the data directly from an array.
Or maybe I should be implementing this in a totally different way? It would have been so much easier if I could have just inserted all of this data into an array and then loaded the grid, but because of the time taken to generate the data, the user needs to see it as soon as possible.
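Roughly, what I'm after is something like the following sketch (assuming each record carries a unique id); my worry is whether a lookup like getInd only sees rows currently rendered in the grid, which would matter with scroll: 1:

    var $grid = $('#grid');
    function addOrUpdateRow(record) {
        // getInd returns the row index if a row with this id is present, otherwise false
        if ($grid.jqGrid('getInd', record.id)) {
            $grid.jqGrid('setRowData', record.id, record); // update the existing row
        } else {
            $grid.jqGrid('addRowData', record.id, record); // append a new row
        }
    }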
I am using the latest version of the Telerik MVC extensions and ASP.NET MVC 3 with the Razor view engine. I am using Entity Framework 4.1 code first with no stored procedures.
I worked through the example at http://demos.telerik.com/aspnet-mvc/grid/paging and I'm not sure if this is what I am looking for.
I am trying to implement paging. The number of rows per page on my grid is 50. When the grid loads for the first time, it must query the database table and fetch the first 50 records. When you go to the next page of 50 rows, it must return the next set of 50 records.
The sample uses ViewData, and I'm not comfortable using ViewData. Is it not safe? Isn't there a decent example available on the net?
Also, if I have loaded the first 50 records and I go to the next page, is there a way of caching the previous records so that they are still there?
Implementing a caching solution for Entity Framework is completely your own choice; it is definitely possible, although I'm becoming less convinced of the value of doing so.
You don't need to use ViewData to provide data to the Telerik grid, and one of the huge bonuses of using their grid is that if you have an IQueryable<T> data source, it will automagically provide paging, sorting and filtering functionality straight out of the box.
You didn't state whether you are using server or client binding, so I haven't attempted to write any code.
Sounds like you might be using server-side binding though.
Looking for a bit of advice on how to optimise one of our projects. We have an ASP.NET/C# system that retrieves data from a SQL Server 2008 database and presents it in a DevExpress ASPxGridView. The data that's retrieved can come from one of a number of databases - all of which are slightly different and are being added and removed regularly. The user is presented with a list of live "companies", and the data is retrieved from the corresponding database.
At the moment, data is being retrieved using a standard SqlDataSource and a dynamically-created SQL SELECT statement. There are a few JOINs in the statement, as well as optional WHERE constraints, again dynamically-created depending on the database and the user's permission level.
All of this works great (honest!), apart from performance. For some databases there are several hundred thousand rows, and retrieving and paging through the data is quite slow (the databases are already properly indexed). I've therefore been looking at ways of speeding the system up, and it seems to boil down to two choices: XPO or LINQ.
LINQ seems to be the popular choice, but I'm not sure how easy it will be to implement with a system that is so dynamic in nature - would I need to create "definitions" for each database that LINQ could access? I'm also a bit unsure about creating the LINQ queries dynamically too, although looking at a few examples that part at least seems doable.
XPO, on the other hand, seems to allow me to create an XPO Data Source on the fly. However, I can't find much information on how to JOIN to other tables.
Can anyone offer any advice on which method - if any - is the best to try and retro-fit into this project? Or is the dynamic SQL model currently used fundamentally different from LINQ and XPO and best left alone?
Before you go and change the whole way that your app talks to the database, have you had a look at the following:
Run your code through a performance profiler (such as Redgate's performance profiler); the results are often surprising.
If you are constructing the SQL string on the fly, are you using .NET best practices such as String.Concat("str1", "str2") instead of "str1" + "str2"? Remember, multiple small gains add up to big gains.
Have you thought about having a summary table or database that is periodically updated (say every 15 minutes; you might need to run a service to update this data automatically) so that you are only hitting one database? New connections to databases are quite expensive.
Have you looked at the query plans for the SQL that you are running? Today, I moved a dynamically created SQL string to a sproc (only 1 param changed) and shaved 5-10 seconds off the running time (it was being called 100-10,000 times depending on some conditions).
Just a warning if you do use LINQ: I have seen some developers who decided to use LINQ write more inefficient code because they did not know what they were doing (pulling 36,000 records when they only needed to check for 1, for example). These things are very easily overlooked.
Just something to get you started, and hopefully there is something there that you haven't thought of.
Cheers,
Stu
As far as I understand, you are talking about the so-called server mode, where all data manipulation is done on the DB server instead of the data being fetched to the web server and processed there. In this mode the grid works very fast with data sources that can contain hundreds of thousands of records. If you want to use this mode, you should create either the corresponding LINQ classes or XPO classes. If you decide to use the LINQ-based server mode, the LINQServerModeDataSource provides the Selecting event, which can be used to set a custom IQueryable and KeyExpression. I would suggest that you use LINQ in your application. I hope this information will be helpful to you.
I guess there are two points where performance might be tweaked in this case. I'll assume that you're accessing the database directly rather than through some kind of secondary layer.
First, you don't say how you're displaying the data itself. If you're loading thousands of records into a grid, that will take time no matter how fast everything else is. Obviously the trick here is to show a subset of the data and allow the user to page, etc. If you're not doing this then that might be a good place to start.
Second, you say that the tables are properly indexed. If this is the case, and assuming that you're not loading 1,000 records into the page at once but are retrieving only subsets at a time, then you should be OK.
But if you're only doing an ExecuteQuery() against a SQL connection to get a dataset back, I don't see how LINQ or anything else will help you. I'd say the problem is obviously on the DB side.
So to solve the problem with the database, you need to profile the different SELECT statements you're running against it, examine the query plans and identify the places where things are slowing down. You might want to start with SQL Server Profiler, but if you have a good DBA, just looking at the query plan (which you can get from Management Studio) is usually enough.