How to save multiple rows in a data grid - LINQ

Let's say I have ten rows in a data grid and I have changed the data in three of them.
I am trying to save those three rows with LINQ, but I am not sure how to do it.
I could loop over all the rows, checking each one for changes.
Is there a smarter way to save multiple rows than looping in code with something like For...Next?

As long as the DataContext is kept alive while the changes are made in the grid, you can save all of them with a single call to SubmitChanges(), following the Unit of Work pattern. This works fine in stateful (WinForms/WPF) implementations. In web applications, the typical UI models only allow editing a single row per page submission; in that case, the challenge becomes allowing multiple records to be edited in a single page request. If you can do that, you can batch up your updates and call SubmitChanges only once per page request.
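The unit-of-work idea described above can be sketched in plain JavaScript (the names `ChangeTracker`, `markDirty`, `saveAll`, and the `persist` callback are illustrative, not LINQ to SQL API): track which rows the user edits, then flush them all in one batch instead of saving row by row.

```javascript
// Minimal unit-of-work sketch: collect dirty rows, submit once.
// All names here are illustrative; LINQ to SQL's DataContext does the
// equivalent change tracking for you and flushes on SubmitChanges().
class ChangeTracker {
  constructor() {
    this.dirty = new Map(); // row id -> latest edited values
  }
  markDirty(id, values) {
    this.dirty.set(id, values); // later edits overwrite earlier ones
  }
  // Flush every pending change in one batch, like SubmitChanges().
  saveAll(persist) {
    const batch = [...this.dirty.entries()].map(
      ([id, values]) => ({ id, ...values })
    );
    persist(batch);     // e.g. one request or one transaction
    this.dirty.clear(); // the unit of work is complete
    return batch.length;
  }
}
```

With ten rows on screen and three edited, only those three end up in the batch, and `persist` runs exactly once.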

Related

Disposing the BindingSource of a ComboBox is extremely slow...

I have a main table with 4 lookup tables. All tables are bound to SQL tables accessed via stored procedures. Each lookup table has between 3 and 27 rows. The main table has about 22,000 rows (pet food & products).
My DataGridView displays rows of the main table with the lookup columns configured as ComboBoxes. I can insert, delete, and edit the main table; all of these functions work fine. The initial loading of the form and inserting a new row each take about a second, but that is not a concern. All other operations are relatively fast.
Here is a screenshot of the DataGridView:
The problem comes when I close the form, and only after having inserted one or more rows into the main table. Closing the form can take up to a minute. When the form closes, I now dispose of the binding sources for the 5 tables myself so that I can time them. Disposing of the binding source for each of the 4 lookup tables routinely takes 10-15 seconds per table; disposing of the binding source for the main table takes no time at all. Again, this only happens after inserting a new row into the main table. I can edit main-table rows, change lookup column values, and delete rows, and closing the form in those cases is instant.
I have tried running the program within VS, outside of VS with a debug EXE, and outside of VS with a release EXE, all with similar results.
What can I do to prevent this long delay disposing of the ComboBox binding sources? Is this typical and are there alternatives I should be considering?
After three days of pounding my head against the wall trying all kinds of unsuccessful fixes, I rebuilt the application from scratch. That fixed the problem, and I believe I discovered the cause. First I created just the data set with all the table adapters, which was quick, and then I added a single form and grid to mimic the condition I described above. Testing confirmed no issues at all, so I continued adding more forms with the same ComboBox lookups, and it continued to work fine. I am fairly sure there was something screwy in my previous data set definitions. Hopefully this helps someone in the future.

Laravel pagination in DataTables

I am using the DataTables plugin in Laravel. I have about 3000 records in one of my tables.
When I load that page, the browser receives all 3000 records and only then builds the pagination, which slows down the page load.
How do I fix this, and what is the correct approach?
Use server-side processing.
Get help from a Laravel package, such as Yajra's: https://yajrabox.com/docs/laravel-datatables/
Generally you can solve pagination on the front end, on the back end (server or database side), or with a combination of both.
Server-side processing without a package would mean using TOP/FETCH (or LIMIT/OFFSET) queries to limit the rows being returned from your server.
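Independent of Laravel or any package, server-side paging boils down to translating the requested page into an OFFSET/FETCH (or LIMIT/OFFSET) window and returning only that slice plus the total count. A minimal sketch (the in-memory `fetchPage` is a stand-in for the real database query, and the parameter names are assumptions):

```javascript
// Compute the window for one page; the database would apply this via
// LIMIT/OFFSET or OFFSET ... FETCH instead of slicing in memory.
function pageWindow(page, perPage) {
  const offset = (page - 1) * perPage;
  return { offset, limit: perPage };
}

// In-memory stand-in for the paged database query, for illustration only.
function fetchPage(rows, page, perPage) {
  const { offset, limit } = pageWindow(page, perPage);
  return {
    total: rows.length,                       // the grid still needs the total
    data: rows.slice(offset, offset + limit)  // only this page crosses the wire
  };
}
```

With 3000 records and 20 per page, each request now transfers 20 rows instead of 3000.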

You could also load a small amount (say 20) and then, when the user scrolls to the bottom of the list, load another 20 or so. I mention front-end processing as well because I'm not sure what your use cases are, but I imagine it's pretty rare that any given user actually needs to see 3000 rows at once.
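That incremental approach can be sketched like this (the `fetchBatch` callback is a stand-in for an AJAX call; a real page would invoke `loadMore` from a scroll event handler): keep a cursor of how many rows are loaded and append the next batch on demand.

```javascript
// Sketch of "load 20 more on scroll": keep a cursor into the data set
// and fetch the next batch each time the user reaches the bottom.
function makeLoader(fetchBatch, batchSize = 20) {
  let loaded = 0; // how many rows the client has so far
  return function loadMore() {
    const batch = fetchBatch(loaded, batchSize); // e.g. an AJAX request
    loaded += batch.length;                      // an empty batch means done
    return batch;                                // append these rows to the table
  };
}
```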

Given that DataTables has built-in functionality for paginating data, I think #tersakyan is essentially correct: what you want is some form of back-end filtering or pagination of the rows to limit what's being sent to the front end.

I don't know whether that package works for you or what your setup looks like, but pagination can also be achieved directly from the database via SQL (using TOP/FETCH, for example), or implemented in a controller or service by tracking pages of data and "loading a page at a time", both from the server and then into the table. All you would need is a unique key to associate each "set of pages" with a specific request.
But for performance, you want to avoid both large data requests and operations on large sets of data. So the more you limit how much data is grabbed or processed at any stage of your application, the more performant it will be in principle.
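One way to realize the "unique key per set of pages" idea is a simple cache keyed by the request plus the page number, so revisiting a page does not re-query. This shape is an assumption for illustration, not a Laravel or DataTables API:

```javascript
// Cache pages per request key so repeated visits to a page don't re-query.
class PageCache {
  constructor() {
    this.pages = new Map(); // "requestKey:page" -> cached page of rows
  }
  key(requestKey, page) {
    return `${requestKey}:${page}`;
  }
  // loadPage runs (e.g. hits the database) only on a cache miss.
  get(requestKey, page, loadPage) {
    const k = this.key(requestKey, page);
    if (!this.pages.has(k)) this.pages.set(k, loadPage(page));
    return this.pages.get(k);
  }
}
```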

Can Tablesorter ajax filter my JSON-populated table?

I have a table that I want to page and sort with Tablesorter, and I have populated the table from JSON. I would like it to add/delete/update rows as rows are added, etc., in the JSON. However, the filtering also happens server-side, which means a several-second lag every time you change pages or search, as a request is made to a PHP page. With potentially several thousand rows, it gets tedious.
Is it at all possible to get Tablesorter to load all rows first and then page and filter them client-side, the way it does by default (i.e., without sending requests back and forth), yet still pick up the JSON as it changes? Or will I have to do all requests server-side, since the data is fetched dynamically? Alternatively, is there any way to speed up the calls?

Grid - When should you switch from html to server side table processing?

This question is likely subjective, but a lot of "grid" JavaScript plugins have come out to help paginate and sort tables. They usually work in one of two ways. The first and simplest takes an existing HTML <table> and converts it into a sortable and searchable table. The second passes info to the server and has the server select the information from the database to be displayed.
My question is this: At what point (size wise) is it more efficient to use server-side processing vs displaying all the data and have the "grid plugin" convert it to a sortable/searchable table client-side?
Using DataTables as an example, I have to execute at least 3 queries: the total rows in the table, the total filtered results for pagination, and the filtered results to display for the selected page. Then every time I sort, I query again. Every time I move to another page or search the table, more queries.
If I were to pull the data once when the client visits the page, I would execute a single query and then format and push the results to the client all at once. This increases the page size and can delay loading of the page once it gets too big. The upside is that there is only one query, and all the sorting, searching, and pagination are handled by the plugin, so there is no waiting for a response and no further queries.
With just a few rows, I imagine pushing the formatted table data to the client at page load would be the fastest. But with thousands of rows, switching to server-side processing would be the most efficient way.
Where is the tipping point? Is there a tipping point, or is server-side or client-side the way to go 100% of the time?
The answer to your question can only be subjective, so I will explain how I personally understand the problem and give my recommendation.
In my opinion, data with 2-3 rows and 3-4 columns can be displayed in a plain HTML table without any plugin. Whatever you display, the most important thing is that the user can grasp the information, so it should be well formatted and marked with colors and icons, for example. That helps for perhaps 10 rows of data, but not much more. If you display a table with 100 rows or more, you overtax the user: the user has to analyze the data to get any helpful information out of it, and scrolling does not make this easier.
So I think one should give the user a comfortable, or at least convenient, interface to sort and filter the data in the table. The exact interface is mostly a matter of taste; for example, the grid can have an additional filter bar.
For filtering, and even for sorting, it is important to work not with plain strings but with typed data: integers (10 should sort after 9, not between 1 and 2), numbers (correctly interpreting '.' and ',' inside numbers), dates (3/20/2012 should be greater than 4/15/2010), and so on. If you just convert an HTML table into a grid, you will have problems with correct filtering and sorting. Even if you use purely local JavaScript data, it is important to have a datasource that carries type information and to build the grid from that data. Then you can supply a date as a JavaScript Date or as an ISO 8601 string like "2012-03-20", and the grid will display it according to the configured formatter, as 3/20/2012 or 20-Mar-2012.
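The sorting pitfalls above are easy to demonstrate in plain JavaScript: the default Array sort compares strings, so typed comparators are needed for numbers and dates.

```javascript
// String sort puts "10" between "1" and "2"; a numeric comparator fixes it.
const asStrings = ['10', '9', '2', '1'].sort(); // lexicographic: 1, 10, 2, 9
const asNumbers = ['10', '9', '2', '1'].sort((a, b) => Number(a) - Number(b));

// Same problem with dates: compare Date values, not the display strings.
// ISO 8601 strings such as "2012-03-20" parse reliably.
function compareDates(a, b) {
  return new Date(a) - new Date(b);
}
const dates = ['2012-03-20', '2010-04-15'].sort(compareDates);
```

This is exactly why a grid needs type information in its datasource rather than the strings it happens to render.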
Whether you implement filtering, sorting, and paging on the server side or the client side is not really important to the user who opens the page; what matters is that everything works quickly enough. The exact choice of grid plugin, of filtering (a filter toolbar or external controls), and of styling depends on your taste and the project requirements.

jqGrid: Best practice for doing upsert like operations

I'm setting up a jqGrid (in a Google Chrome Extension) which will handle local JSON data.
My concern is performance, due to my unusual use case: thousands of records are generated dynamically on the client side over a few minutes. I can't wait for all the data to be generated, so currently I add it to the grid row by row using 'addRowData'.
The problem is that when I add data to the grid, I have to check whether it already exists, and if it does, I need to update the existing record. I'm having trouble understanding the best way to accomplish this. Is the only way to search the grid by calling 'getCol' and then searching the returned array? My concern with calling getCol is that I presume it searches the DOM, but I could be wrong; I have scroll: 1 set, and I'm starting to think this might mean it pulls data directly from an array.
Or maybe I should implement this in a totally different way? It would have been so much easier if I could have just inserted all of the data into an array and then loaded the grid, but because of the time taken to generate the data, the user needs to see it ASAP.
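One way to avoid searching the grid at all is to keep your own id-to-row index alongside it and decide between insert and update from that index. A minimal sketch, where `onInsert`/`onUpdate` are stand-ins for the actual grid calls (jqGrid's addRowData and setRowData) and the Map is an assumption of this sketch, not part of jqGrid:

```javascript
// Upsert without scanning the grid: a Map from row id to row data
// answers the "does it exist?" question in O(1).
function makeUpserter(onInsert, onUpdate) {
  const index = new Map();
  return function upsert(id, row) {
    if (index.has(id)) {
      index.set(id, row);
      onUpdate(id, row); // grid call, e.g. $('#grid').jqGrid('setRowData', id, row)
      return 'updated';
    }
    index.set(id, row);
    onInsert(id, row);   // grid call, e.g. $('#grid').jqGrid('addRowData', id, row)
    return 'inserted';
  };
}
```

Rows can still be shown as soon as they are generated; only the existence check stops touching the grid.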
