Room with PagingSource can't load data starting from the middle of the data list

I'm using Jetpack Paging 3 with Room. I have a feature that needs to load data starting from an arbitrary page of the database.
For example, I have a user table, and I defined the user DAO like:
@Query("SELECT * FROM users ORDER BY id")
fun usersPagingSource(): PagingSource<Int, User>
In the view model I use the DAO method like:
Pager(
    config = PagingConfig(
        pageSize = 20,
        prefetchDistance = 5,
        initialLoadSize = 20
    ),
    initialKey = 5,
    pagingSourceFactory = { userDao.usersPagingSource() }
).flow
    .flowOn(Dispatchers.IO)
    .cachedIn(viewModelScope)
The problem is that the paging source always returns the first page of the database on the initial load, not the 6th page as I want.
If I want the initial load to start from the 6th page of data in the database, what should I do?
Did I miss something about the Paging API?

Related

Web.Contents calling API service and merging pages with List.Transform started to fail

I created a Power BI report which connects to a data source via an API service. The returned JSON contains thousands of entities. The API service is called via the Web.Contents function. The API service always returns the total record count, so we are able to calculate the number of pages that have to be called to obtain the whole dataset. This report displays data from our service-desk app, which is deployed on many servers for many customers, and it uses query parameters to connect to any of these servers.
The detail of the Power Query is below.
Why am I writing here: this report was working without any issue for more than 1.5 years, but on August 17th one of the servers started causing errors in the step Pages, where some random lines (pages) contain errors - see the attached picture labeled "Errors in step Pages". As a result, the next step in the query, Entities (List.Union), stops the refresh and generates errors with the message:
Expression.Error: We cannot apply field access to the type List. Details: Value=[List] Key=requests
What is notable:
The API service returns records in the same order, but the faulty lists are random even when calling with the same parameters.
Sometimes the refresh completes without any error.
The same Power Query run against another server works correctly; the problem is only with this one specific server.
The problem started without notice on the most important server after 1.5 years without any issue.
Here is the full text of the Power Query for this main source, which is used later in other queries to extract all the necessary data. The JSON is really complicated and I extract from it a list of requests, a list of solvers, a list of solver groups, and so on; this base query and its output are the input for many referenced queries.
[attached picture: Errors in step Pages]
let
    BaseAPIUrl = apiurl & "apiservice?", /* apiurl is a parameter - the name of the server, e.g. https://xxxx.xxxxxx.sk/ */
    EntitiesPerPage = RecordsPerPage, /* RecordsPerPage is a parameter defining the number of records per page - we used 200-400 records per page as the optimum, but it also works with 4000 records per page */
    ApiToken = FnApiToken(), /* this function returns the apitoken value, obtained from another API service, apiurl & "api/auth/login", which takes a username and password in the body of the call */
    GetJson = (QParm) => /* definition of a general function to get data from the data source */
        let
            Options =
                [
                    Query = QParm,
                    Headers =
                        [
                            Accept = "application/json",
                            ApiKeyName = "apitoken",
                            Authorization = ApiToken
                        ]
                ],
            RawData = Web.Contents(BaseAPIUrl, Options),
            Json = Json.Document(RawData)
        in
            Json,
    GetEntityCount = () => /* function called once to get the total number of records via GetJson; the total is returned as part of each call */
        let
            QParm = [pp = "1", pg = "1"],
            Json = GetJson(QParm),
            Count = Json[totalRecord]
        in
            Count,
    GetPage = (Index) => /* function called repeatedly to get each page of JSON via GetJson */
        let
            PageNr = Text.From(Index + 1),
            PerPage = Text.From(EntitiesPerPage),
            QParm = [pg = PageNr, pp = PerPage],
            Json = GetJson(QParm),
            Value = Json[data][requests]
        in
            Value,
    EntityCount = List.Max({ EntitiesPerPage, GetEntityCount() }), /* total number of records */
    PageCount = Number.RoundUp(EntityCount / EntitiesPerPage), /* number of pages */
    PageIndices = { 0 .. PageCount - 1 },
    Pages = List.Transform(PageIndices, each GetPage(_) /* Function.InvokeAfter(() => GetPage(_), #duration(0,0,0,1)) */), /* here we call GetPage for each page to get the whole dataset - the commented-out variant adds a delay between calls, but it was not necessary */
    Entities = List.Union(Pages),
    Table = Table.FromList(Entities, Splitter.SplitByNothing(), null, null, ExtraValues.Error)
in
    Table
I also tried another way of appending the pages to a list, using List.Generate. This also brings random errors into the list, but unlike the original List.Transform approach it makes it possible to convert the result to a table; however, the other referenced queries still fail and contain errors on the last row.
When I explore the content of a faulty page/list by extracting it via Add as New Query, all the records are always there without any failure.
Source = List.Generate( /* another way to generate the list of all pages */
    () => [Page = 0, ReqPageData = GetPage(0)],
    each [Page] < PageCount,
    each [
        ReqPageData = GetPage([Page] + 1),
        Page = [Page] + 1
    ],
    each [ReqPageData]
),
#"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error), /* here I am able to generate a table from the list, in contrast to the List.Transform variant */
#"Expanded Column1" = Table.ExpandListColumn(#"Converted to Table", "Column1"), /* here I can expand the list into a column */
#"Removed Errors" = Table.RemoveRowsWithErrors(#"Expanded Column1", {"Column1"}) /* here I try to exclude the errors, but I don't know what happens and which records (if any) are excluded */
[attached picture: Extracting errored page]
And finally: I am totally clueless, unable to find the cause of this behavior on this specific server. I tested calling the errored pages via Postman, and I discussed this issue with the author of the API service. He also tried to call the API service with all the parameters, but the server returns every page OK; only Power Query is unable to List.Transform them.
I will be grateful for, and appreciate, any tips or advice, or word from somebody who solved the same issue in the past.
Kuby
No, each errored line of the list in the List.Transform step can be extracted as a new query, and all the records from that one page are OK. Hmmmm.
Finally: the problem described in this issue was caused by "corrupted" content of the returned JSON. The provider of the core system informed me that they found a bug, and after the fix on the service-desk side everything is OK again. I was trying to find the problem in Power Query, but the problem was in the service desk. :(

Plugin performance in Microsoft Dynamics CRM 2013/2015

Time to leave shy mode behind and make my first post on Stack Overflow.
After doing loads of research (plugins, performance, indexes, types of update, friends) and trying several approaches, I was unable to find a proper answer/solution.
So, if possible, I would like to get your feedback/help on a Microsoft Dynamics CRM 2013/2015 plugin performance issue (or coding technique).
Scenario:
Microsoft Dynamics CRM 2013/2015
2 Entities with Relationship 1:N
EntityA
EntityB
EntityB has the following columns:
Id | EntityAId | ColumnDemoX (decimal) | ColumnDemoY (currency)
Entity A has: 500 records
Entity B has: 150 records for each Entity A record. So 500 * 150 = 75000 records.
Objective:
Create a post-update plugin on Entity A to "mimic" the following SQL command:
Update EntityB
Set ColumnDemoX = (some quantity), ColumnDemoY = (some quantity) * (some value)
Where EntityAId = (some id)
One approach could be:
using (var serviceContext = new XrmServiceContext(service))
{
    var query = from b in serviceContext.EntityBSet
                where b.EntityAId.Equals(someId)
                select b;

    foreach (EntityB entB in query)
    {
        entB.ColumnDemoX = (some quantity);
        serviceContext.UpdateObject(entB);
    }
    serviceContext.SaveChanges();
}
Problem:
The foreach over the 150 records in the post-update plugin takes 20 seconds or more, while the equivalent
Update EntityB Set ColumnDemoX = (some quantity), ColumnDemoY = (some quantity) * (some value) Where EntityAId = (some id)
takes 0.00001 seconds.
Any suggestion/solution?
Thank you all for reading.
H
You can use ExecuteMultipleRequest: while you iterate over the 150 entities, collect the entities you need to update, and call the request after the loop. That way you only call the service once, which is very good for performance.
If your process could grow bigger and bigger, then you should think about making it asynchronous, as a plug-in or a custom workflow activity.
This is an example:
// Create an ExecuteMultipleRequest object.
var requestWithResults = new ExecuteMultipleRequest()
{
    // Assign settings that define execution behavior: continue on error, return responses.
    Settings = new ExecuteMultipleSettings()
    {
        ContinueOnError = false,
        ReturnResponses = true
    },
    // Create an empty organization request collection.
    Requests = new OrganizationRequestCollection()
};

// Add an UpdateRequest for each entity to the request collection.
foreach (var entity in input.Entities)
{
    UpdateRequest updateRequest = new UpdateRequest { Target = entity };
    requestWithResults.Requests.Add(updateRequest);
}

// Execute all the requests in the request collection using a single web method call.
ExecuteMultipleResponse responseWithResults =
    (ExecuteMultipleResponse)_serviceProxy.Execute(requestWithResults);
A few solutions come to mind, but I don't think they will please you...
Is this really a problem? Yes, it's slow, and a database update can be so much faster. However, if you can run it as a background (asynchronous) process, you'll get your numbers anyway. Is it really an "I need these numbers the second I click or business will go down" situation?
It can be a reason to ditch 2013: in CRM 2015 you can use a calculated field. If you need these numbers only to show up on forms (e.g. you don't use them in reporting), you could also compute them in JavaScript.
Warning: this one is for the desperate. If you really need your update to be synchronous and immediate, you can't use calculated fields, you really know what you're doing, etc., why not do it directly in the database? I know this is very bad advice, and there are a lot of reasons not to do it this way (you can read a few here). It's unsupported, and if you do something wrong it could go really bad. But if your real situation is as simple as your example (just a calculated field, no entity creation, no relationship modification), you could do it this way. You'll have to consider many things: you won't have any auditing on the fields, no security, caching issues, no "modified by", etc. Actually, I pretty much advise against this solution.
1 - Put this logic into an async workflow.
OR
2 - Don't call
serviceContext.UpdateObject(entB);
serviceContext.SaveChanges();
for each and every record. Get all the records (150) in the post stage, update the fields, and use an ExecuteMultipleRequest to update the CRM records in one go, as sketched below.
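A minimal sketch of that combined approach, based on the SDK sample above (the logical names new_entityb, new_entityaid, new_columndemox, and new_columndemoy are assumptions standing in for the real schema names):

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Query;
using System;

public static class EntityBBatchUpdater
{
    // Retrieve all EntityB rows linked to one EntityA record, set both columns
    // in memory, and send a single ExecuteMultipleRequest (one service call).
    public static void UpdateChildren(IOrganizationService service, Guid entityAId,
                                      decimal someQuantity, decimal someValue)
    {
        // Only the IDs are needed; every retrieved value would be overwritten anyway.
        var query = new QueryExpression("new_entityb") { ColumnSet = new ColumnSet(false) };
        query.Criteria.AddCondition("new_entityaid", ConditionOperator.Equal, entityAId);

        var batch = new ExecuteMultipleRequest
        {
            Settings = new ExecuteMultipleSettings { ContinueOnError = false, ReturnResponses = false },
            Requests = new OrganizationRequestCollection()
        };

        foreach (var row in service.RetrieveMultiple(query).Entities)
        {
            var update = new Entity("new_entityb") { Id = row.Id };
            update["new_columndemox"] = someQuantity;                        // decimal column
            update["new_columndemoy"] = new Money(someQuantity * someValue); // currency column
            batch.Requests.Add(new UpdateRequest { Target = update });
        }

        service.Execute(batch); // one round trip instead of 150
    }
}

Note that ExecuteMultipleRequest accepts at most 1000 requests per call, so a larger batch would have to be split into chunks.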

Get Crystal Report data in session

I have noticed that Crystal Reports runs the LINQ query once again when the page index is changed, i.e. when we load the second page from the first page.
So I just wanted to know whether we can tell which page is loaded, so that we can keep the values in session.
Just a hint is required, as I am not getting the desired results from Google.
Update:
I am sorry, in a hurry I just clicked on a wrong tag.
So the problem is like this: below is the code I use for running my Crystal Report:
var rpt = new Result();
List<class> lst1 = new DALMethod().Get();
rpt.SetDataSource(lst1);
CRReportViewer.ReportSource = rpt;
When I switch from page one to page two or more, this method in the DAL is called again, taking the same time it took the first time to load. So I just want to keep the data in session when the query runs the first time; the next time, when I get the page index, I will show the data from session.
Is there a way to get the page index in this C# code?
I found the solution; hope this might help someone else.
I was using a generic list as the data source.
As soon as we know the page is loading for the first time (i.e. it is not a postback), we can initialize a list to be maintained in session.
After showing the report we can add the data source (which is a list type) to the session.
On a report page shift the data will then be taken from the session:
if (!IsPostBack)
{
    // clear the session entry and start fresh
    Session["ReportGenericList"] = null;
}

List<class> datasourceLst = null;
if (Session["ReportGenericList"] != null)
{
    datasourceLst = (List<class>)Session["ReportGenericList"];
}
else
{
    datasourceLst = new DALMethod().Get(); // call methods to fill the data source
    Session["ReportGenericList"] = datasourceLst;
}
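Putting the pieces together, the page-load handler could look roughly like this (a sketch: Result, DALMethod, and the session key come from the snippets above, while MyRow stands in for the placeholder element type):

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        // First load: drop any stale cached data.
        Session["ReportGenericList"] = null;
    }

    // Reuse the cached list on postbacks (page changes); hit the DAL only once.
    var datasourceLst = Session["ReportGenericList"] as List<MyRow>;
    if (datasourceLst == null)
    {
        datasourceLst = new DALMethod().Get();
        Session["ReportGenericList"] = datasourceLst;
    }

    var rpt = new Result();
    rpt.SetDataSource(datasourceLst);
    CRReportViewer.ReportSource = rpt;
}

This way the expensive LINQ query runs only on the first load, and page changes in the viewer are served from the session copy.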

jqgrid randId() produces duplicates after page reload

On my grid, after a user enters text in the bottom row, I add another row so they can fill out a further row if needed; the grid grows as the user needs. This works fine. However, after a page reload and repopulating from the db, the addRowData() function does not honor the existing row ids and creates duplicates, starting from 1 again, e.g. jqg1. It should look at the existing row ids and create new unique ids, so if I already have 5 rows, it might start at jqg6. Here is the relevant code inside onCellSelect:
var records = jQuery("#table-1").jqGrid('getGridParam', 'records');
var lastRowId = jQuery("#table-1").jqGrid('getDataIDs')[records - 1];
if (lastRowId == id)
{
    jQuery('#table-1').addRowData(undefined, {}, 'last');
}
I have also tried $.jgrid.randId() instead of undefined, with the same results, as expected.
Thanks
Ryan
I think the error is in the part where you fill the grid with the data from the database. The data saved in the database has unique ids, and those ids are not of the form jqg1, jqg2, ..., so there should be no conflicts. You should just fill the id fields of the JSON with the ids from the database.
One more possibility is to specify the rowid parameter (the first parameter) of addRowData yourself. In that case you have full control over the ids of the rows added to the grid.
The code of the $.jgrid.randId function is very simple. There is $.jgrid.uidPref, initialized to 'jqg', and $.jgrid.guid, initialized to 1. The $.jgrid.randId function does the following:
$.jgrid.randId = function (prefix) {
    return (prefix ? prefix : $.jgrid.uidPref) + ($.jgrid.guid++);
};
If it is really required, you can increase (but not decrease) the value of $.jgrid.guid without any negative side effects.

EF4 Import/Lookup thousands of records - my performance stinks!

I'm trying to set something up for a movie store website (using ASP.NET, EF4, SQL Server 2008). In my scenario, I want to allow a "Member" store to import their catalog of movies from a text file containing ActorName, MovieTitle, and CatalogNumber, as follows:
Actor, Movie, CatalogNumber
John Wayne, True Grit, 4577-12 (repeated for each record)
This data is used to look up an actor and a movie and to create a "MemberMovie" record. My import speed is terrible if I import more than 100 or so records using these tables:
Actor Table: Fields = {ID, Name, etc.}
Movie Table: Fields = {ID, Title, ActorID, etc.}
MemberMovie Table: Fields = {ID, CatalogNumber, MovieID, etc.}
My methodology for importing data into the MemberMovie table from a text file is as follows (after the file has been uploaded successfully):
Create a context.
For each line in the file, look up the actor in the Actor table.
For each of that actor's movies, look for a matching title.
If a matching Movie is found, add a new MemberMovie record to the context and call ctx.SaveChanges().
The performance of my implementation is terrible. My expectation is that this could be done with thousands of records in a few seconds (after the file has been uploaded), but what I've got times out the browser.
My question is this: what is the best approach for performing bulk lookups/inserts like this? Should I call SaveChanges only once rather than for each newly created MemberMovie? Would it be better to implement this using something like a stored procedure?
A snippet of my loop is roughly this (edited for brevity):
while ((fline = file.ReadLine()) != null)
{
    string[] token = fline.Split(separator);
    string actor = token[0];
    string title = token[1];
    string catNumber = token[2];

    Actor found_actor = ctx.Actors.Where(a => a.Name.Equals(actor)).FirstOrDefault();
    if (found_actor == null)
        continue;

    Movie found_movie = found_actor.Movies.Where(
        s => s.Title.Equals(title, StringComparison.CurrentCultureIgnoreCase)).FirstOrDefault();
    if (found_movie == null)
        continue;

    ctx.MemberMovies.AddObject(new MemberMovie()
    {
        MemberProfileID = profile_id,
        CatalogNumber = catNumber,
        Movie = found_movie
    });

    try
    {
        ctx.SaveChanges();
    }
    catch
    {
        // ignore rows that fail to save
    }
}
Any help is appreciated!
Thanks, Dennis
First:
Some time ago I wrote an answer about calling SaveChanges after 1, n, or all rows:
When should I call SaveChanges() when creating 1000's of Entity Framework objects? (like during an import)
It is actually better to call SaveChanges after every n rows than after every single row or only once after all of them, as sketched below.
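A minimal sketch of that batching, reusing the loop from the question (the batch size of 100 is an arbitrary illustration, not a measured optimum):

const int batchSize = 100; // arbitrary illustration; tune by measuring
int pending = 0;
string fline;

while ((fline = file.ReadLine()) != null)
{
    // ... the same actor/movie lookups as in the question, then:
    ctx.MemberMovies.AddObject(new MemberMovie()
    {
        MemberProfileID = profile_id,
        CatalogNumber = catNumber,
        Movie = found_movie
    });

    // Flush in batches instead of once per row (and not only once at the very end).
    if (++pending % batchSize == 0)
    {
        ctx.SaveChanges();
    }
}
ctx.SaveChanges(); // flush the remainder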
Second:
Make sure you have an index on Name in the Actors table and on Title in Movies; that should help. Also, you shouldn't select the whole Actor entity if you only need its ID.
Instead of:
Actor found_actor = ctx.Actors.Where(a => a.Name.Equals(actor)).FirstOrDefault();
you can select:
int? found_actor_id = ctx.Actors.Where(a => a.Name.Equals(actor)).Select(a => (int?)a.ID).FirstOrDefault();
and then
Something.ActorID = found_actor_id;
This can be faster because it doesn't materialize the whole Actor entity and doesn't require additional lookups, especially when combined with an index.
Third:
If you send a very large file, there is still a chance of a timeout, even with good performance. You should run the import on a separate thread and return the response immediately. You can give every import some kind of identifier and let the user check its status by that ID.
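A rough sketch of that hand-off, assuming an in-memory status store (the ImportJobs class and its dictionary are illustrative, not part of the question's code):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class ImportJobs
{
    // Illustrative in-memory status store; a real application might persist this.
    private static readonly ConcurrentDictionary<Guid, string> Status =
        new ConcurrentDictionary<Guid, string>();

    public static Guid Start(string uploadedFilePath)
    {
        var id = Guid.NewGuid();
        Status[id] = "running";

        // Kick off the import in the background and return immediately.
        Task.Factory.StartNew(() =>
        {
            try
            {
                RunImport(uploadedFilePath); // the import loop from the question
                Status[id] = "done";
            }
            catch (Exception ex)
            {
                Status[id] = "failed: " + ex.Message;
            }
        });

        return id; // the client polls CheckStatus(id) with this identifier
    }

    public static string CheckStatus(Guid id)
    {
        string status;
        return Status.TryGetValue(id, out status) ? status : "unknown";
    }

    private static void RunImport(string path)
    {
        // ... file parsing, lookups, and batched SaveChanges as above ...
    }
}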
