I have approx. 520 classrooms archived in my account. If I try to select them with
var courseList = Classroom.Courses.list({"courseStates":["ARCHIVED"]}).courses;
I get only 300 of them. Is this normal?
How can I select them all? Actually I'm writing a script to delete the oldest, but if I can't retrieve them, I can't delete them.
I understand that you have so many courses that the Courses.list() response is split into separate pages. In that case you can easily navigate them by using tokens. First of all, make sure that you specify the pageSize in your request, which sets the desired number of results per page. Please keep in mind that the server may return fewer than the specified number of results, as stated in the docs. If your response got divided into pages, the response will include the nextPageToken field. Then, to obtain the rest of the courses, you have to repeat your request, passing that nextPageToken value in the pageToken property. Please don't hesitate to ask if you have any doubts about this approach.
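For example, a minimal sketch of that token loop in Apps Script could look like this (the pageSize value is only an illustration):
var parameters = {"courseStates": ["ARCHIVED"], "pageSize": 100};
var courses = [];
var page;
do {
  page = Classroom.Courses.list(parameters);
  if (page.courses) {
    courses = courses.concat(page.courses);
  }
  parameters.pageToken = page.nextPageToken;
} while (page.nextPageToken);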
Thanks a lot Jaques, I found the solution:
var parametri = {"courseStates": "ARCHIVED"};
var page = Classroom.Courses.list(parametri);
var listaClassi = page.courses;
if (page.nextPageToken) {
parametri.pageToken = page.nextPageToken;
page = Classroom.Courses.list(parametri);
listaClassi = listaClassi.concat(page.courses);
}
Anyway, I didn't need to change the pageSize, nor did I find any tutorial about it.
I created a Power BI report which connects to a data source via an API service. The returned JSON contains thousands of entities. The API service is called via the Web.Contents function. The API service always returns the total record count, so we are able to calculate the number of pages that have to be called to obtain the whole dataset. This report displays data from our servicedesk app, which is deployed on many servers for many customers, and uses query parameters to connect to any of these servers.
The detail of the Power Query is below.
Why am I writing here: this report was working without any issue for more than 1.5 years, but on August 17th one of the servers started causing errors in the step Pages, where some random lines (pages) contain errors - see the attached picture labeled "Errors in step Pages". This is the reason the next step, Entities (List.Union), stops the refresh and generates errors with the message:
Expression.Error: We cannot apply field access to the type List. Details: Value=[List] Key=requests
What is notable:
The API service returns records in the same order, but the faulty lists are random when calling with the same parameters.
Sometimes the refresh finishes without any error.
The same Power Query run against another server works correctly; the problem is only with one specific server.
This problem started without notice on the most important server after 1.5 years without any problem.
Here is the full text of the Power Query for this main source, which is used later in other queries to extract all necessary data. The JSON is really complicated and I extract from it a list of requests, a list of solvers, a list of solver groups, .... This base query and its output are the input for many referenced queries.
Errors in step Pages
let
BaseAPIUrl = apiurl&"apiservice?", /*apiurl is a parameter - the name of the server, e.g. https://xxxx.xxxxxx.sk/ */
EntitiesPerPage = RecordsPerPage, /*RecordsPerPage is a parameter and defines the nr. of records per page - we used 200-400 records per page as the optimum, but it also works with 4000 records per page*/
ApiToken = FnApiToken(), /*this function returns the apitoken value, i.e. the value returned by another API service apiurl&"api/auth/login", which uses username and password in the body of the call to get the apitoken*/
GetJson = (QParm) => /*definition of a general function to get data from the data source*/
let
Options =
[ Query= QParm,
Headers=
[
Accept="application/json",
ApiKeyName="apitoken",
Authorization=ApiToken
]
],
RawData = Web.Contents(BaseAPIUrl, Options),
Json = Json.Document(RawData)
in Json,
GetEntityCount = () => /*function called once to get the nr. of records using GetJson; the total is returned as a part of each call*/
let
QParm = [pp="1", pg="1" ],
Json = GetJson(QParm),
Count = Json[totalRecord]
in
Count,
GetPage = (Index) => /*repeatedly called function to get each page of JSON using GetJson*/
let
PageNr = Text.From(Index+1),
PerPage = Text.From(EntitiesPerPage),
QParm = [pg = PageNr, pp=PerPage],
Json = GetJson(QParm),
Value = Json[data][requests]
in Value,
EntityCount = List.Max({ EntitiesPerPage, GetEntityCount() }), /*store the nr. of records in a variable*/
PageCount = Number.RoundUp(EntityCount / EntitiesPerPage), /*compute the nr. of pages*/
PageIndices = { 0 .. PageCount - 1 },
Pages = List.Transform(PageIndices, each GetPage(_) /*Function.InvokeAfter(()=>GetPage(_),#duration(0,0,0,1))*/), /*here we call GetPage for each page to get the whole dataset - the commented-out variant tested a delay between GetPage calls, but it was not necessary*/
Entities = List.Union(Pages),
Table = Table.FromList(Entities, Splitter.SplitByNothing(), null, null, ExtraValues.Error)
in
Table
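One way to localize or work around the randomly failing pages (not part of the original query, just a sketch) would be to wrap GetPage in try ... otherwise so that a failed page is retried once and raises a labeled error when it fails twice, and to use that wrapper in the Pages step:
GetPageSafe = (Index) => /*retry a page once; raise a labeled error if it still fails*/
    let
        FirstTry = try GetPage(Index),
        Result =
            if FirstTry[HasError]
            then (try GetPage(Index) otherwise error ("Page " & Text.From(Index) & " failed twice"))
            else FirstTry[Value]
    in
        Result,
Pages = List.Transform(PageIndices, each GetPageSafe(_)),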
I also tried another way of appending the pages to a list using List.Generate. This also brings random errors in the list, but
it makes it possible to convert to a table, in contrast with the original way using List.Transform; however, other referenced queries are still failing and contain errors on the last row.
When I explore the content of a faulty page/list by extracting it via Add as New Query, all records are always there without any failure.....
Source = List.Generate( /*another way to generate list of all pages*/
() => [Page = 0, ReqPageData = GetPage(0) ],
each [Page] < PageCount,
each [ReqPageData = GetPage( [Page] ),
Page = [Page] + 1 ],
each [ReqPageData]
),
#"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error), /*here i am able to generate table from list in contrast when is used List.Generate*/
#"Expanded Column1" = Table.ExpandListColumn(#"Converted to Table", "Column1"), /*here aj can expand list to column*/
#"Removed Errors" = Table.RemoveRowsWithErrors(#"Expanded Column1", {"Column1"}) /*here i try to exclude errors, but i dont know what happend and which records (if any) are excluded*/
Extracting errored page
And finally, I am totally clueless, unable to find the cause of this behavior on this specific server. I tested calling the errored pages via POSTMAN, and I discussed this issue with the author of the API service; he also tried to call this API service with all parameters, but the server returns every page OK. Only Power Query is not able to List.Transform ...
I will be grateful for any tips or advice, or for hearing from somebody who solved the same issue in the past ....
Kuby
No, each errored line of the list in the List.Transform step can be extracted as a new query, and then all records from that one page are OK. Hmmmm.
Finally, the problem described in this issue was caused by "corrupted" content of the returned JSON. The provider of the core system informed me that they found a bug, and after the fix on the servicedesk side everything is OK again. I tried to find the problem in Power Query, but the problem was in the servicedesk. :(
I'm working on my first Laravel project: a family tree. I have 4 branches of the family, each with people/families/images/stories/etc. A given user on the website will have access to everything for 1, 2, or 4 of these branches of the family (I don't want to show a cousin stuff about people they're not related to).
So on various pages I want the collections from the controller to contain stuff based on the given user's permissions. Merge seems like the right way to do this.
I have scopes to get people from each branch of the family, and in the following example I also have a scope for people with a birthday this month. In order to show the right set of birthdays for this user, I can build the result by merging each group individually if the user has access.
Here's what my function would look like if I showed everyone in all 4 family branches:
public function get_birthday_people()
{
$user = \Auth::user();
$jones_birthdays = Person::birthdays()->jones()->get();
$smith_birthdays = Person::birthdays()->smith()->get();
$lee_birthdays = Person::birthdays()->lee()->get();
$brandt_birthdays = Person::birthdays()->brandt()->get();
$birthday_people = $jones_birthdays
->merge($smith_birthdays)
->merge($lee_birthdays )
->merge($brandt_birthdays );
return $birthday_people;
}
My challenge: I'd like to modify it so that I check the user's access and only add each group of people accordingly. I'm imagining something where it's all the same as above except I add conditionals like this:
if($user->jones_access) {
$jones_birthdays = Person::birthdays()->jones()->get();
}
else{
$jones_birthdays =NULL;
}
But that throws an error for users without access because I can't call merge on NULL (or an empty array, or the other versions of 'nothing' that I tried).
What's a good way to do something like this?
if($user->jones_access) {
$jones_birthdays = Person::birthdays()->jones()->get();
}
else{
$jones_birthdays = new Collection;
}
Better yet, do the merge in the condition, no else required.
$birthday_people = new Collection;
if($user->jones_access) {
$birthday_people = $birthday_people->merge(Person::birthdays()->jones()->get());
}
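Putting it together, a sketch of the whole function with that pattern could look like the following (assuming the other branches expose smith_access, lee_access and brandt_access flags analogous to jones_access):
// use Illuminate\Support\Collection; at the top of the file
public function get_birthday_people()
{
    $user = \Auth::user();
    // Start from an empty collection; merge() returns a new collection, so reassign it.
    $birthday_people = new Collection;

    if ($user->jones_access) {
        $birthday_people = $birthday_people->merge(Person::birthdays()->jones()->get());
    }
    if ($user->smith_access) {
        $birthday_people = $birthday_people->merge(Person::birthdays()->smith()->get());
    }
    if ($user->lee_access) {
        $birthday_people = $birthday_people->merge(Person::birthdays()->lee()->get());
    }
    if ($user->brandt_access) {
        $birthday_people = $birthday_people->merge(Person::birthdays()->brandt()->get());
    }

    return $birthday_people;
}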
You are going to want your Eloquent query to only return the relevant data for the user requesting it. It doesn't make sense to query Lee birthdays when a Jones person is accessing that page.
So what you will wind up doing is something like
$birthdays = App\Person::where('family', $user->family)->get();
This pulls in Persons where their family property is equal to the family of the current user.
This probably does not match the way you have your relationships right now, but hopefully it will get you on the right track to getting them sorted out.
If you really want to go ahead with a bunch of queries and checking for authorization, read up on the authorization features of Laravel. It will let you assign abilities to users and check them easily.
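For example, a minimal Gate-based sketch (the ability name 'view-jones' is hypothetical) might look like this:
// use Illuminate\Support\Facades\Gate; e.g. in AuthServiceProvider::boot()
Gate::define('view-jones', function ($user) {
    return (bool) $user->jones_access;
});

// later, anywhere you build the collection
if (Gate::allows('view-jones')) {
    $birthday_people = $birthday_people->merge(Person::birthdays()->jones()->get());
}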
Time to leave shy mode behind and make my first post on Stack Overflow.
After doing loads of research (plugins, performance, indexes, types of update, friends) and after trying several approaches, I was unable to find a proper answer/solution.
So, if possible, I would like to get your feedback/help with a Microsoft Dynamics CRM 2013/2015 plugin performance issue (or coding technique).
Scenario:
Microsoft Dynamics CRM 2013/2015
2 Entities with Relationship 1:N
EntityA
EntityB
EntityB has the following columns:
Id | EntityAId | ColumnDemoX (decimal) | ColumnDemoY (currency)
Entity A has: 500 records
Entity B has: 150 records per each Entity A record. So 500*150 = 75000 records.
Objective:
Create a post-update plugin on Entity A to "mimic" the following SQL command:
Update EntityB
Set ColumnDemoX = (some quantity), ColumnDemoY = (some quantity) * (some value)
Where EntityAId = (some id)
One approach could be:
using (var serviceContext = new XrmServiceContext(service))
{
    var query = from b in serviceContext.EntityBSet
                where b.EntityAId.Equals(someId)
                select b;
    foreach (EntityB entB in query)
    {
        entB.ColumnDemoX = (some quantity);
        entB.ColumnDemoY = (some quantity) * (some value);
        serviceContext.UpdateObject(entB);
    }
    serviceContext.SaveChanges();
}
Problem:
The foreach for 150 records in the post plugin update will take 20 secs or more.
While the
Update EntityB Set ColumnDemoX = (some quantity), ColumnDemoY = (some quantity) * (some value) Where EntityAId = (some id)
takes 0.00001 secs.
Any suggestion/solution?
Thank you all for reading.
H
You can use an ExecuteMultipleRequest: when you iterate the 150 entities, collect the entities you need to update, and after that send the request. If you do this, you only call the service once, which is very good for performance.
If your process could grow bigger and bigger, then you should think about making it asynchronous, as a plug-in or a custom workflow activity.
This is an example:
// Create an ExecuteMultipleRequest object.
var requestWithResults = new ExecuteMultipleRequest()
{
// Assign settings that define execution behavior: continue on error, return responses.
Settings = new ExecuteMultipleSettings()
{
ContinueOnError = false,
ReturnResponses = true
},
// Create an empty organization request collection.
Requests = new OrganizationRequestCollection()
};
// Add a UpdateRequest for each entity to the request collection.
foreach (var entity in input.Entities)
{
UpdateRequest updateRequest = new UpdateRequest { Target = entity };
requestWithResults.Requests.Add(updateRequest);
}
// Execute all the requests in the request collection using a single web method call.
ExecuteMultipleResponse responseWithResults =
(ExecuteMultipleResponse)_serviceProxy.Execute(requestWithResults);
A few solutions come to mind, but I don't think they will please you...
Is this really a problem? Yes, it's slow, and a database update can be so much faster. However, if you can have it as a background process (asynchronous), you'll get your numbers anyway. Is it really an "I need these numbers in the next second as soon as I click or business will go down" situation?
It can be a reason to ditch 2013. In CRM 2015 you can use a calculated field. If you need these numbers only to show up in forms (e.g. you don't use them in reporting), you could also do it in JavaScript.
Warning: this is the desperate option. If you really need your update to be synchronous and immediate, you can't use calculated fields, you really know what you're doing, etc.... Why not do it directly in the database? I know this is very bad advice. There are a lot of reasons not to do it this way (you can read a few here). It's unsupported, and if you do something wrong it could go really badly. But if your real situation is as simple as your example (just a calculated field, no entity creation, no relation modification), you could do it this way. You'll have to consider many things: you won't have any audit on the fields, no security, caching issues, no modified-by, etc. Actually, I pretty much advise against this solution.
1 - Move this logic to an async workflow.
OR
2 - Don't use
serviceContext.UpdateObject(entB);
serviceContext.SaveChanges();
Get all the records (150) in the post stage, update the fields, and use an ExecuteMultipleRequest to update the CRM records in one go.
Don't send an update request for each and every record.
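A rough sketch of that combined approach (not from the original answers; the schema names "entityb", "entityaid", "columndemox", "columndemoy" and the variables service, someId, someQuantity and someValue are placeholders for your actual names and values):
// Retrieve all child records of the updated Entity A record in one call...
var query = new QueryExpression("entityb")
{
    ColumnSet = new ColumnSet("columndemox", "columndemoy")
};
query.Criteria.AddCondition("entityaid", ConditionOperator.Equal, someId);
var children = service.RetrieveMultiple(query).Entities;

// ...then send all the updates back in a single ExecuteMultipleRequest.
var batch = new ExecuteMultipleRequest
{
    Settings = new ExecuteMultipleSettings { ContinueOnError = false, ReturnResponses = false },
    Requests = new OrganizationRequestCollection()
};

foreach (var child in children)
{
    child["columndemox"] = someQuantity;
    child["columndemoy"] = new Money(someQuantity * someValue);
    batch.Requests.Add(new UpdateRequest { Target = child });
}

service.Execute(batch);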
Just want to get some ideas from anyone who has encountered similar problems, and how you came up with a solution.
Basically, we have around 10K documents stored in RavenDB, and we need the ability to allow users to filter and search against those documents. I am aware that there is a maximum page size of 1024 within RavenDB. So in order for the filter and search to work, I need to do my own paging. But my solution gives me the following error:
The maximum number of requests (30) allowed for this session has been reached.
I have tried many different ways of disposing the session, by wrapping it in a using block and also by explicitly calling Dispose after every call to RavenDB, with no success.
Does anyone know how to get around this issue? What's the best practice for this kind of scenario?
var pageSize = 1024;
var skipSize = 0;
var maxSize = 0;
using (_documentSession)
{
maxSize = _documentSession.Query<LogEvent>().Count();
}
while (skipSize < maxSize)
{
using (_documentSession)
{
var events = _documentSession.Query<LogEvent>().Skip(skipSize).Take(pageSize).ToList();
_documentSession.Dispose();
//building finalPredicate codes..... which i am not providing here....
results.AddRange(events.Where(finalPredicate.Compile()).ToList());
skipSize += pageSize;
}
}
Raven limits the number of requests (Load, Query, ...) to 30 per session. This behavior is documented.
I can see that you dispose the session in your code, but I don't see where you recreate it. In any case, loading data the way you intend to is not a good idea.
We're using indexes and paging and never load more than 1024.
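For reference, a minimal sketch of that kind of paging (assuming an IDocumentStore field named _store and the same finalPredicate as in the question) opens a fresh session per page, so no single session comes near the 30-request limit:
var results = new List<LogEvent>();
var pageSize = 1024;
var page = 0;
var compiledPredicate = finalPredicate.Compile();

while (true)
{
    // A new, short-lived session per page keeps every session under the request limit.
    using (var session = _store.OpenSession())
    {
        var events = session.Query<LogEvent>()
                            .Skip(page * pageSize)
                            .Take(pageSize)
                            .ToList();

        if (events.Count == 0)
            break;

        results.AddRange(events.Where(compiledPredicate));
        page++;
    }
}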
If you're expecting thousands of documents, or your predicate logic doesn't work as an index and you don't care about how long your query will take, use the unbounded results API.
var results = new List<LogEvent>();
var query = session.Query<LogEvent>();
using (var enumerator = session.Advanced.Stream(query))
{
while (enumerator.MoveNext())
{
if (predicate(enumerator.Current.Document)) {
results.Add(enumerator.Current.Document);
}
}
}
Depending on the number of documents, this can use a lot of RAM.
Hope that someone out there can help with this!
I'll give an example based on the standard Order-->OrderLine-->Product rather than the actual situation to make it easier to explain!
Basically, I want to run a query that returns all orders for which there is an order line containing a TV. Simple enough:
IEnumerable<Order> orders;
using (var context = new DataContext())
{
var source =
context.Orders.Include("OrderLines").Include(
"OrderLines.Product");
orders = source.Where(o => o.OrderLines.Any(ol => ol.Product.Name == "TV")).ToList();
}
return orders;
This works in the sense that I get the correct collection of Order entities, but when I look at each Order's collection of OrderLines it contains all OrderLines, not just those containing a TV.
Hope that makes sense.
Thanks in advance for any help.
It does make sense, in that the query is fulfilling your original criterion "to return all orders for which there is an order line containing a TV"; each order will of course still have all its order lines. The filter is only being used to select the Orders, not the OrderLines.
To retrieve just the OrderLines containing a TV from an Order, you'd apply the filter again, thus:
var OrderLinesWithTV = order.OrderLines.Where(ol => ol.Product.Name == "TV");
The main point is to know whether or not you need to keep a reference to the order header in the filtered lines.
I.e. do you want the list of all the orders with a TV, and more precisely only their TV lines? Or do you want all the TV lines, never mind their order header?
You seem to prefer the first option.
Then the best solution would certainly be
var relevantOrders = orders.Where(order => order.OrderLines.Any(ol => ol.Product.Name == "TV"))
to get the relevant orders, and then, for each order in relevantOrders:
order.OrderLines.Where(ol => ol.Product.Name == "TV")
to consider only the TV lines.
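Put together, that approach could look something like this (same context and includes as in the question; the per-order filtering happens in memory):
using (var context = new DataContext())
{
    var relevantOrders = context.Orders
        .Include("OrderLines")
        .Include("OrderLines.Product")
        .Where(o => o.OrderLines.Any(ol => ol.Product.Name == "TV"))
        .ToList();

    foreach (var order in relevantOrders)
    {
        // The navigation property still holds every line; filter it when you consume it.
        var tvLines = order.OrderLines.Where(ol => ol.Product.Name == "TV");
        // ... work with order and tvLines here ...
    }
}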
Other techniques would result in a loss of information, or force you to build a new orders collection similar to the initial one but double-filtered on the headers and on the lines, which seems fairly bad as far as elegance and performance are concerned.