Simple pagination in datatable using ajax without sending total count from server - datatable

I'm using DataTables 1.10.5. My table uses server side processing via ajax.
$('#' + id).dataTable({
    processing: true,
    serverSide: true,
    ajax: 'server-side-php-script-url',
    pagingType: 'simple_incremental_bootstrap'
});
Everything works properly if I send 'recordsTotal' in the server response. But I don't want to count the total entries because of performance issues. So I tried to use the pagination plugin simple_incremental_bootstrap. However, it is not working as expected. The Next button always returns the first page itself. If I give 'recordsTotal' in the server response, this plugin works properly. I found out that if we don't give 'recordsTotal', the 'start' param sent by DataTables to the server-side script is always 0, so my server-side script always returns the first page.
According to this discussion, server side processing without calculating total count is not possible because “DataTables uses the record count that is passed back to it to deal with the paging controls”. The suggested workaround is “So the display records are needed, but it would be possible to just pass back a static number (like 1'000'000 or whatever) which would make DataTables think there are a million rows. You could hide the information element if this information is totally bogus!”
I wonder if anybody has a solution for this. Basically, I want simple pagination in my DataTable with ajax, without sending the total count from the server.

A workaround worth trying:
If we don't send recordsTotal from the server, the pagination won't work properly. If we send a high static number as recordsTotal, the table will show an active Next button even if there is no data on the next page.
So I ended up with a solution that uses two parameters received by the ajax script: 'start' and 'length'.
If the number of rows on the current page is less than the limit, there is no data on the next page, so the total count is 'start' + the current page count. This disables the Next button on the last page.
If the number of rows on the current page equals the limit, there may be more data on the following pages, so I also fetch the data for the next page. If there is at least one row on the next page, I send a recordsTotal larger than 'start' + 'limit', which keeps the Next button active.
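For example (with hypothetical numbers): if start = 30 and length = 10 but the current page returns only 4 rows, the script responds with recordsTotal = 34, so DataTables sees that the data ends at row 34 and disables the Next button.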
Sample code:
$limit  = require_param('length');
$offset = require_param('start');
$draw   = require_param('draw');
$current_page_data = fn_to_calculate_data($limit, $offset); // in my case, a mysqli result.
$data = fn_to_fetch_data($current_page_data); // placeholder: build the 'data' rows array from the result.
$current_page_count = mysqli_num_rows($current_page_data);
if ($current_page_count >= $limit) {
    $next_page_data  = fn_to_calculate_data($limit, $offset + $limit);
    $next_page_count = mysqli_num_rows($next_page_data);
    if ($next_page_count >= $limit) {
        // Not the exact count, just indicate that we have more pages to show.
        $total_count = $offset + (2 * $limit);
    } else {
        $total_count = $offset + $limit + $next_page_count;
    }
} else {
    $total_count = $offset + $current_page_count;
}
$filtered_count = $total_count;
send_json(array(
    'draw'            => $draw,
    'recordsTotal'    => $total_count,
    'recordsFiltered' => $filtered_count,
    'data'            => $data
));
However, this solution adds some load to the server, since it additionally fetches the rows of the next page. Still, it is a small load compared to counting the total rows.
We also need to hide the count information from the table footer and use simple pagination.
dtOptions = {};
dtOptions.pagingType = "simple";
dtOptions.fnDrawCallback = function() {
    $('#' + table_id + "_info").hide();
};
$('#' + table_id).dataTable(dtOptions);
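As a small aside (assuming the standard DataTables 1.10 "info" feature option, which is not mentioned in the original answer), the information element can also be switched off in the options instead of being hidden on every draw:
dtOptions.info = false; // suppress the "Showing x to y of z" element entirely, no fnDrawCallback needed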

Related

Web.Content calling API service and merging pages with List.Transform started to fail

I created a Power BI report which connects to a data source via an API service. The returned JSON contains thousands of entities. The API service is called via the Web.Contents function. The API service always returns the total record count, so we are able to calculate the number of pages that have to be called to obtain the whole dataset. This report displays data from our servicedesk app, which is deployed on many servers and for many customers, and uses query parameters to connect to any of these servers.
Details of the Power Query are below.
Why am I writing here: this report was working without any issue for more than 1.5 years, but on August 17th one of the servers started causing errors in the step Pages, where some random lines (pages) contain errors - see the attached picture labeled "Errors in step Pages". This is the reason that the next step, Entities (List.Union), stops the refresh and generates errors with the message:
Expression.Error: We cannot apply field access to the type List. Details: Value=[List] Key=requests
What is notable:
The API service returns records in the same order, but the faulty lists are random when calling with the same parameters.
Sometimes the refresh completes without any error.
The same Power Query run against another server works correctly; the problem is only with one specific server.
The problem started without notice, on the most important server, after 1.5 years without any problem.
Here is the full text of the Power Query for this main source, which is used later in other queries to extract all necessary data. The JSON is really complicated and I extract from it a list of requests, a list of solvers, a list of solver groups, etc.; this base query and its output are the input for many referenced queries.
Errors in step Pages
let
    BaseAPIUrl = apiurl & "apiservice?", /* apiurl is a parameter - the name of the server, e.g. https://xxxx.xxxxxx.sk/ */
    EntitiesPerPage = RecordsPerPage, /* RecordsPerPage is a parameter and defines the nr. of records per page - we used 200-400 records per page as the optimum, but it also works with 4000 records per page */
    ApiToken = FnApiToken(), /* this function returns the apitoken value, i.e. the return value of another API service apiurl&"api/auth/login", which uses username and password in the body of the call to get the apitoken */
    GetJson = (QParm) => /* definition of a general function to get data from the data source */
        let
            Options =
                [
                    Query = QParm,
                    Headers =
                        [
                            Accept = "application/json",
                            ApiKeyName = "apitoken",
                            Authorization = ApiToken
                        ]
                ],
            RawData = Web.Contents(BaseAPIUrl, Options),
            Json = Json.Document(RawData)
        in
            Json,
    GetEntityCount = () => /* function called once to get the nr. of records via GetJson; the count is returned as part of each call */
        let
            QParm = [pp = "1", pg = "1"],
            Json = GetJson(QParm),
            Count = Json[totalRecord]
        in
            Count,
    GetPage = (Index) => /* repeatedly called function to get each page of JSON via GetJson */
        let
            PageNr = Text.From(Index + 1),
            PerPage = Text.From(EntitiesPerPage),
            QParm = [pg = PageNr, pp = PerPage],
            Json = GetJson(QParm),
            Value = Json[data][requests]
        in
            Value,
    EntityCount = List.Max({ EntitiesPerPage, GetEntityCount() }), /* store the nr. of records in a variable */
    PageCount = Number.RoundUp(EntityCount / EntitiesPerPage), /* nr. of pages */
    PageIndices = { 0 .. PageCount - 1 },
    Pages = List.Transform(PageIndices, each GetPage(_) /* Function.InvokeAfter(() => GetPage(_), #duration(0,0,0,1)) */), /* here we call GetPage for each page to get the whole dataset - the commented-out variant adds a delay between GetPage calls, but it was not necessary */
    Entities = List.Union(Pages),
    Table = Table.FromList(Entities, Splitter.SplitByNothing(), null, null, ExtraValues.Error)
in
    Table
I also tried another way of appending pages to a list, using List.Generate. This also brings random errors into the list, but it offers the possibility to transform to a table, in contrast with the original way using List.Transform; however, the other referenced queries still fail and contain errors on the last row.
When I explore the content of a faulty page/list by extracting it via Add as New Query, all records are always there without any failure...
    Source = List.Generate( /* another way to generate the list of all pages */
        () => [Page = 0, ReqPageData = GetPage(0)],
        each [Page] < PageCount,
        each [
            ReqPageData = GetPage([Page]),
            Page = [Page] + 1
        ],
        each [ReqPageData]
    ),
    #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error), /* here I am able to generate a table from the list, in contrast to the List.Transform variant */
    #"Expanded Column1" = Table.ExpandListColumn(#"Converted to Table", "Column1"), /* here I can expand the list into a column */
    #"Removed Errors" = Table.RemoveRowsWithErrors(#"Expanded Column1", {"Column1"}) /* here I try to exclude the errors, but I don't know what happened and which records (if any) were excluded */
Extracting errored page
And finally, I am totally clueless, not able to find the cause of this behavior on this specific server. I tested calling the errored pages via Postman, and I discussed this issue with the author of the API service; he also tried to call this API service with all parameters, but the server returns every page OK. Only Power Query is not able to List.Transform...
I will be grateful for and appreciate any tips or advice, or hearing from somebody who has solved the same issue in the past...
Kuby
No, each errored line of the list in the List.Transform step could be extracted as a new query, and all records from that page are OK. Hmmmm.
Finally: the problem described in this issue was caused by "corrupted" content of the returned JSON. The provider of the core system informed me that they found a bug, and after fixing it on the servicedesk side everything is OK again. I tried to find the problem in Power Query, but the problem was in the servicedesk. :(

increment function increments by 2 not with 1

Every time a user clicks a post, that post's view count should be incremented by 1.
But instead of 1, it is incrementing by 2.
There are many pages where this post can be clicked.
I have tried using the increment function:
$view = PostAd::where('id',$id)->first();
$view->increment('viewcount',1);
Full code:
$view = PostAd::find($id);
$view->increment('viewcount',1);
$data['ads'] = PostAd::find($id);
$data['post']= PostAd::with('postimage')->where('id',$id)->get();
$data['postimage'] = PostAd::with('pimage')->where('id',$id)->get();
$data['details']= PostAd::with('category')->where('id',$id)->first();
$data['comments'] = Comment::where('post_id',$id)->get();
$data['favourite'] = Favourite::where('post_id',$id)->first();
$data['identify'] = PostAd::with(['category','category.children'])->get();
The answer is simple. Use this:
$view->increment('viewcount');
Increment by a custom count (COUNT)
$view = PostAd::where('id',$id)->first();
$view->increment('viewcount',COUNT);
Ex: Increment by 5
$view->increment('viewcount',5);
Read more here
You can use the default increment of 1:
$query->increment('viewcount');
If you need a custom increment, use
$query->increment('viewcount', increment_value);
OR
PostAd::where('id', $id)
    ->update(['viewcount' => DB::raw('viewcount + 1')]);
If you use pagination with JSON, for example, then after the page request loads, a new request is created to add ?page=1.
Try to keep the pagination code only in the view that requires pagination. Create a .js file to be included only on pages that require this functionality, not globally.

Return laravel query results in chunks

I have an Update model in my Laravel/Vue.js app. Instead of retrieving and displaying all results at once in my component, I want to return them in chunks of five and place a NEXT FIVE UPDATES link in my component to display the next 5 update records, much like pagination. I tried this:
$update = Update::limit(5)->get();
This does not achieve my desired result. Is there a Laravel method I can use in my Laravel backend to send the results in chunks of 5 to my Vue.js component, and then make my NEXT FIVE UPDATES link display the next 5 records?
If you're able to send some kind of page or offset value to the backend, then you could use Laravel's skip and take methods: https://laravel.com/docs/5.7/queries#ordering-grouping-limit-and-offset
$skip = $request->page; // assuming this variable exists; treat it as a zero-based page index
$limit = 5;
$update = Update::skip($skip * $limit)->take($limit)->get();
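For completeness, a minimal front-end sketch of the other half; the /updates route name, the zero-based page counter, and the plain JSON response are assumptions for illustration rather than part of the original answer:
let page = 0;
async function loadNextFiveUpdates() {
    const response = await fetch('/updates?page=' + page); // hypothetical route backed by the skip/take query above
    const updates = await response.json();                 // the next chunk of (at most) five Update records
    page += 1;                                             // the next click fetches the following chunk
    return updates;
}
The NEXT FIVE UPDATES link would simply call loadNextFiveUpdates() and append the returned records to the component's list.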

JQGrid server pagination not working

Client-side pagination is working for me (that was very easy).
I have a method on the server side that accepts the page number and the number of records.
When I display the grid for the first time, for example, I get 15 records; I also set the page number and total pages. If I have 40 records, it should say "View 1 - 15 of 40", "Page 1 of 3".
I set the attributes like this:
$("#sampleGrid").jqGrid({
loadonce:false,
page: 1,
rowNum: 15,
TotalPages: 3,
onPaging: {
if(pgbutton == "next_gridpager"){
//call the server side method. pass pagenumber and number of records as parameter
}
else if(pgbutton == "prev_gridpager")
{
//call server side method to get data
}
});
The problem is that even though I specify the page to be displayed and TotalPages, it shows only the first page. How do I tell jqGrid that this is not client-side pagination and to set the total pages to 3?
You provide too little information about your problem. What data do you receive from the server? It must look like:
{"total":3,"page":1,"records":40,"rows":[your data...]}
You don't need the TotalPages parameter or the onPaging event in the jqGrid config. If the reply from the server is correct, pagination will work. To enable server-side pagination you need to set the datatype and url parameters, for example as in the sketch below.
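For example, a minimal server-side configuration could look like the following sketch (the url, pager selector and column list are placeholders, not values taken from the question):
$("#sampleGrid").jqGrid({
    url: '/path/to/server-method', // placeholder endpoint returning the JSON shown above
    datatype: 'json',
    mtype: 'GET',
    colModel: [ /* your column definitions */ ],
    rowNum: 15,
    pager: '#sampleGridPager',
    viewrecords: true              // show "View 1 - 15 of 40" in the pager
});
jqGrid then sends the page and rows parameters to the url on every paging action, so no onPaging handler is required.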

alternative to jqgrid triggerToolbar on a local dataset

I have a jqGrid displaying a large amount of data. The data is retrieved from the server periodically via a jQuery ajax call (outside of the jqGrid logic). The retrieved data is compared to the data previously retrieved (stored as a var in JS; it serves as the data for the jqGrid). If they are different, the local data is refreshed and the jqGrid is triggered to reload. The jqGrid datatype is jsonstring.
This solution is working quite well, except when the user has a filter value in the filter toolbar. Because I set a timer for 0.1 sec to trigger the filter in the loadComplete event, the whole grid refresh, when there is a filter string, looks like this:
20 records were displayed in the jqGrid originally (because the user is filtering on a certain value in a column)
the jqGrid is refreshed, because the data polled from the server by the separate ajax call is different from the data stored in the browser
the jqGrid shows all the new data for a very short period of time
the jqGrid filter is triggered within loadComplete, and the screen shows 20 records again
It is technically still working, but is there a way to re-apply the filter locally on the jsonstring before the grid is rendered? Put a different way, can the jqGrid render only once, both loading the new jsonstring and applying the filter that was placed in the filter box beforehand?
thanks
casbby
update:
I have tried one of Oleg's solutions to apply the filter while reloading the grid; this is the demo. It worked perfectly as long as the datatype is local. My page actually uses the datatype jsonstring to reload the grid. This function from the code does not seem to apply to jsonstring. I was hoping to call such a function after an external jQuery ajax call successfully retrieved the data from the server.
function filter() {
    var searchFiler = $("#filter").val(), f;
    if (searchFiler.length === 0) {
        grid[0].p.search = false;
        $.extend(grid[0].p.postData, {filters: ""});
    }
    f = {groupOp: "OR", rules: []};
    f.rules.push({field: "name", op: "cn", data: searchFiler});
    f.rules.push({field: "note", op: "cn", data: searchFiler});
    grid[0].p.search = true;
    $.extend(grid[0].p.postData, {filters: JSON.stringify(f)});
    grid.trigger("reloadGrid", [{page: 1, current: true}]);
}
Can someone please help me out? Many thanks.
There are small differences in the usage of datatype: "jsonstring" compared with the usage of datatype: "local". You can compare the corresponding parts of the code here. One of the differences in the code of datatype: "local" is the usage of the addLocalData and populateVisible functions. The last function (populateVisible) is used only in the case of virtual scrolling (scroll: 1 or scroll: true). In your case the important difference between datatype: "jsonstring" and datatype: "local" is the call of addLocalData in the case of datatype: "local".
The function addLocalData applies grouping and filtering to the local data (see here). Moreover, it cuts the list of displayed rows down to the current page (see here).
So if the server returns unfiltered data and you need to display filtered data, then you should use datatype: "local" instead of datatype: "jsonstring". Instead of datastr you should use data. You may need to use localReader instead of jsonReader (see the documentation), or just manually convert the data returned from the server into a format which can be read by the default localReader.
UPDATE: In another answer I described and included the demo which shows how localReader can be used.
What you can alternatively do is convert the input data returned from the server into the standard format (or return the data from the server in that format). The data parameter should be an array of named objects with properties matching the column names in colModel. So what you can do is just a simple loop through the rows array, creating another array in the format which jqGrid requires. The corresponding code could be roughly the following:
// let us say you have myInput with myInput.rows
// and you have cm which you use as the value of the colModel parameter
var mydata = [], input = myInput.rows, l = input.length,
    cmLength = cm.length, i, j, inputItem, item;
for (i = 0; i < l; i++) {
    inputItem = input[i];
    item = {id: inputItem.id};
    inputItem = inputItem.cell;
    for (j = 0; j < cmLength; j++) {
        item[cm[j].name] = inputItem[j];
    }
    mydata.push(item);
}
After such a conversion you can use the mydata array as the value of the data parameter, for example as sketched below.
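A minimal sketch of how the converted array could then be plugged in (the grid selector, pager selector and rowNum are illustrative; cm and mydata come from the loop above):
$("#grid").jqGrid({
    datatype: 'local',
    data: mydata,   // the array built by the conversion loop
    colModel: cm,
    rowNum: 20,
    pager: '#pager',
    viewrecords: true
});
Because the datatype is now local, addLocalData runs on every reload, so the toolbar filter is applied before the rows are rendered.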
