How many objects can be saved in one Parse.Object.saveAll request, and how many API requests does one Parse.Object.saveAll call use? - parse-platform

Recently, I have been running some tests on parse.com, and I am now facing a problem using Parse.Object.saveAll in a background job.
According to the parse.com documentation, a background job can run for 15 minutes. I set up a background job to pour data into the database using the following code:
Parse.Cloud.job("createData", function(request, status) {
var Dummy = Parse.Object.extend("dummy");
var batchSaveArr = [];
for(var i = 0 ; i < 50000 ; i ++){
var obj = new Dummy();
// genMessage() is a function to generate a random string with 5 characters long
obj.set("message", genMessage());
obj.set("numValue",Math.floor(Math.random() * 1000));
batchSaveArr.push(obj);
}
Parse.Object.saveAll(batchSaveArr, {
success: function(list){
status.success("success");
},
error: function(error){
status.error(error.message);
}
});
});
Although the job pours data into the database, its main purpose is to test the Parse.Object.saveAll function. When I run this job, the error "This application has exceeded its request limit." appears in the log. However, the analytics page shows a request count of less than or equal to 1. I only run this job in Parse, and no other requests are made while the background job is running.
It seems there is some problem with Parse.Object.saveAll, or maybe I misunderstand this function.
Is anyone facing the same problem?
How many objects can be saved in one Parse.Object.saveAll request?
How many API requests are used by one Parse.Object.saveAll call?

I have asked the question on Facebook and the reply was quite disappointing.
Please follow the link:
https://developers.facebook.com/bugs/439821726158766/
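For context, the legacy JS SDK reportedly splits saveAll into batches of roughly 20 objects per underlying API request, so saving 50,000 objects can translate into thousands of requests fired in quick succession, which can trip the burst rate limit even when the dashboard's aggregate count looks tiny. A minimal sketch of one workaround, chaining saveAll over smaller chunks so the requests are spread out (the chunk size of 500 is an arbitrary illustration):
// Sketch only: save in sequential chunks so saveAll's underlying
// requests are spread out instead of fired all at once.
function saveInChunks(objects, chunkSize) {
    var promise = Parse.Promise.as();
    for (var i = 0; i < objects.length; i += chunkSize) {
        (function(chunk) {
            // Chain each batch off the previous one.
            promise = promise.then(function() {
                return Parse.Object.saveAll(chunk);
            });
        })(objects.slice(i, i + chunkSize));
    }
    return promise;
}
// Usage inside the job:
// saveInChunks(batchSaveArr, 500).then(function() {
//     status.success("success");
// }, function(error) {
//     status.error(error.message);
// });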

Related

Web.Content calling API service and merging pages with List.Transform started to fail

I created a Power BI report which connects to a data source via an API service. The returned JSON contains thousands of entities. The API service is called via the Web.Contents function. The API service always returns the total record count, so we are able to calculate the number of pages that have to be called to obtain the whole dataset. This report displays data from our service-desk app, which is deployed on many servers for many customers, and it uses query parameters to connect to any of these servers.
The detail of the Power Query is below.
Why am I writing here: this report worked without any issue for more than 1.5 years, but on August 17th one of the servers started causing errors in the Pages step, where some random lines (pages) contain errors (see the attached picture labeled "Errors in step Pages"). As a result, the next step in the query, Entities (List.Union), stops the refresh and generates errors with the message:
Expression.Error: We cannot apply field access to the type List. Details: Value=[List] Key=requests
What is notable:
- The API service returns records in the same order, but the faulty lists are random even when calling with the same parameters.
- Sometimes the refresh completes without any error.
- The same Power Query run against another server works correctly; the problem is only with one specific server.
- The problem started without notice, on the most important server, after 1.5 years without any issue.
Here is the full text of the Power Query for this main source, which is used later in other queries to extract all the necessary data. The JSON is really complicated: I extract from it a list of requests, a list of solvers, a list of solver groups, and so on, and this base query and its output are the input for many referenced queries.
[Image: Errors in step Pages]
let
    BaseAPIUrl = apiurl & "apiservice?", /* apiurl is a parameter - the name of the server, e.g. https://xxxx.xxxxxx.sk/ */
    EntitiesPerPage = RecordsPerPage, /* RecordsPerPage is a parameter defining the number of records per page - we used 200-400 records per page as the optimum, but it also works with 4000 records per page */
    ApiToken = FnApiToken(), /* this function returns the apitoken value, which is the return value of another API service, apiurl & "api/auth/login", that uses a username and password in the body of the call to get the apitoken */
    GetJson = (QParm) => /* definition of a general function to get data from the data source */
        let
            Options =
                [
                    Query = QParm,
                    Headers =
                        [
                            Accept = "application/json",
                            ApiKeyName = "apitoken",
                            Authorization = ApiToken
                        ]
                ],
            RawData = Web.Contents(BaseAPIUrl, Options),
            Json = Json.Document(RawData)
        in
            Json,
    GetEntityCount = () => /* function called once to get the number of records (returned as part of each call) using GetJson */
        let
            QParm = [pp = "1", pg = "1"],
            Json = GetJson(QParm),
            Count = Json[totalRecord]
        in
            Count,
    GetPage = (Index) => /* function called repeatedly to get each page of JSON using GetJson */
        let
            PageNr = Text.From(Index + 1),
            PerPage = Text.From(EntitiesPerPage),
            QParm = [pg = PageNr, pp = PerPage],
            Json = GetJson(QParm),
            Value = Json[data][requests]
        in
            Value,
    EntityCount = List.Max({ EntitiesPerPage, GetEntityCount() }), /* store the number of records in a variable */
    PageCount = Number.RoundUp(EntityCount / EntitiesPerPage), /* compute the number of pages */
    PageIndices = { 0 .. PageCount - 1 },
    Pages = List.Transform(PageIndices, each GetPage(_) /* Function.InvokeAfter(() => GetPage(_), #duration(0,0,0,1)) */), /* here we call GetPage for each page to get the whole dataset - the commented-out variant adds a delay between pages, but it was not necessary */
    Entities = List.Union(Pages),
    Table = Table.FromList(Entities, Splitter.SplitByNothing(), null, null, ExtraValues.Error)
in
    Table
I also tried another way of appending the pages to a list, using List.Generate. This also brings random errors in the list, but
it makes it possible to convert the result to a table, in contrast with the original List.Transform approach; however, the other referenced queries then fail and contain errors on the last row.
When I explore the content of a faulty page/list by extracting it via Add as New Query, all the records are always there without any failure...
Source = List.Generate( /* another way to generate the list of all pages */
    () => [Page = 0, ReqPageData = GetPage(0)],
    each [Page] < PageCount,
    each [
        ReqPageData = GetPage([Page]),
        Page = [Page] + 1
    ],
    each [ReqPageData]
),
#"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error), /* here I am able to generate a table from the list, in contrast to the List.Transform variant */
#"Expanded Column1" = Table.ExpandListColumn(#"Converted to Table", "Column1"), /* here I can expand the list into a column */
#"Removed Errors" = Table.RemoveRowsWithErrors(#"Expanded Column1", {"Column1"}) /* here I try to exclude the errors, but I don't know what happened and which records (if any) were excluded */
[Image: Extracting errored page]
Finally: I am totally clueless, unable to find the cause of this behavior on this one specific server. I tested calling the errored pages via Postman, and I discussed the issue with the author of the API service; he also tried calling the API service with all the parameters, but the server returns every page OK. Only Power Query is unable to List.Transform them...
I will be grateful for, and appreciate, any tips or advice, or reports from anybody who has solved the same issue in the past...
Kuby
No: each errored line of the list in the List.Transform step can be extracted as a new query, and then all the records from that page are OK. Hmmm.
Finally, the problem described in this issue was caused by "corrupted" content in the returned JSON. The provider of the core system informed me that they found a bug, and after the fix on the service-desk side everything is OK again. I tried to find the problem in Power Query, but the problem was in the service desk. :(

How can I receive real-time updates from a long asynchronous process?

I'm writing a small, internal web application that reads in form data and creates an Excel file, which then gets emailed to the user.
However, I'm struggling to understand how I can show the user real-time updates while the process is being completed. Sometimes the process takes 10 seconds, and sometimes it takes 5 minutes.
Currently the user waits until the process is complete before they see any results; they do not see any updates along the way. The front-end waits for a 201 response from the server before displaying the report information, and the user is "blocked" until the report creation (RC) process is complete.
I'm having difficulty understanding how I can asynchronously start the RC process and, at the same time, allow the user to navigate to other pages of the site or see updates happening in the background. I should clarify that some of the steps in the RC process use Promises.
I'd like to poll the server every second to get an update on the report being generated.
Here's some simple code to clarify my understanding:
Endpoints
// CREATE REPORT
router.route('/report')
    .post(function(req, res, next) {
        // Generate a unique ID to keep track of the report later on.
        const uid = generateRandomID();
        // Start the report process ... this keeps executing even after a response (201) is returned.
        CustomReportLibrary.createNewReport(req.formData, uid);
        // Respond with a successful creation, including the ID to poll with.
        res.status(201).json({ id: uid });
    });
// GET REPORT
router.route('/report/:id')
    .get(function(req, res, next) {
        // Get our report from the ID.
        let report = CustomReportLibrary.getReport(req.params.id);
        // Respond with the report data, or 404 if no report matches.
        if (report) { res.status(200).json(report); }
        else { res.status(404).end(); }
    });
CustomReportLibrary
// Initialize an array to hold the reports.
let _dataStorage = [];

function createNewReport(data, id) {
    // Create an object to store our report information.
    let reportObject = {
        id: id,
        status: 'Report has started the process',
        data: data
    };
    // Add the new report to the global array.
    _dataStorage.push(reportObject);
    // ... continue with report generation. Assume this takes 5 minutes.
    // ...
    // ... update reportObject.status after each step
    // ...
    // ... finish generation.
}

function getReport(id) {
    // Iterate through the array and return the report with a matching ID,
    // or null if no match is found.
    return _dataStorage.find(function(report) { return report.id === id; }) || null;
}
From my understanding, CustomReportLibrary.createNewReport() will keep executing in the background even after the 201 response is returned. On the front-end, I'd make an AJAX call to /report/:id on an interval to get updates on my report. Is this the right way to do this? Is there a better way?
I think you are on the right track. HTTP 202 ("The request has been accepted for processing, but the processing has not been completed") is the proper status for your case.
It can be done like this:
1. The client sends POST /reports; the server starts creating the new report and returns:
202 Accepted
Location: http://api.domain.com/reports/1
2. The client issues GET /reports/1 to get the status of the report.
The whole flow is async, so users are not blocked.
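To make the client side concrete, here is a minimal polling sketch of that flow. The endpoint and Location header follow the answer above; the report's status and done fields, and the updateProgressUI/showFinishedReport helpers, are assumptions for illustration:
// Kick off report creation, then poll the Location returned with the 202.
fetch('/reports', { method: 'POST', body: formData })
    .then(function(res) {
        pollReport(res.headers.get('Location'));
    });

// Poll the report's status once per second until it is complete.
function pollReport(location) {
    const timer = setInterval(function() {
        fetch(location)
            .then(function(res) { return res.json(); })
            .then(function(report) {
                updateProgressUI(report.status); // hypothetical progress display
                if (report.done) {               // assumed completion flag
                    clearInterval(timer);
                    showFinishedReport(report);  // hypothetical final render
                }
            });
    }, 1000);
}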

Parse cloud job set() function weirdness

I'm trying to run this cloud job weekly on Parse, where I assign a rank to players based on their high scores. This piece of code mostly seems to work, except that it only sets ranks from 1 to 9. Anything with more than one digit does not get set!
The job returns success after setting ranks 1-9.
Parse.Cloud.job("TestJob", function(request, status)
{
Parse.Cloud.useMasterKey();
var rank = 0;
var usersQuery = new Parse.Query("ECJUser").descending("HighScore");
usersQuery.find(function(results){
for(var i=0;i<results.length;++i)
{
rank += 1;
console.log("Setting "+results[i].get('Name')+" rank to "+rank);
results[i].save({"Rank": rank});
}
}).then(function(){
status.success("Weekly Ranks Assigned");
}, function(error){
status.error("Uh oh. Weekly ranking failed");
})
})
In the console log, it clearly says "Setting playerName rank to 11", but it doesn't actually set anything in the Parse table, just undefined (or whatever it was previously).
Does the code look right? Is there something JavaScript-related that I'm missing?
Updated based on answers:
Apparently I'm not waiting for the saves to complete, but I'm not sure how to write the code to handle the promises. Here's what I have:
var rank = 0;
var usersQuery = new Parse.Query("ECJUser").descending("HighScore");
usersQuery.find().then(function(results) {
    var promises = [];
    for (var i = 0; i < results.length; i++) {
        rank += 1;
        // push, not append: JavaScript arrays have no append() method
        promises.push(results[i].save({ "Rank": rank }));
    }
    return promises;
});
What do I do with the list of promises? Where do I wait for them to complete?
Your code does not wait for the saves to complete, so it's going to have unpredictable results. It also isn't going to run through all users, just the first 'page' returned by the query.
So, instead of using find, you should consider using each. You also need to consider whether the job will have time to process all users; it may need to run multiple times.
For the saves, you should add each promise that is returned to an array and then wait for all of the promises to complete before calling status.success.
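A minimal sketch of that last point, using the legacy SDK's Parse.Promise.when to wait for every queued save. Note that, per the Parse docs quoted in the next question, each() may not have a sort order, so this sketch keeps the descending find() and therefore still only processes one page of users:
Parse.Cloud.job("TestJob", function(request, status) {
    Parse.Cloud.useMasterKey();
    var usersQuery = new Parse.Query("ECJUser").descending("HighScore");
    usersQuery.find().then(function(results) {
        var promises = [];
        for (var i = 0; i < results.length; i++) {
            // Queue every save and keep its promise; rank is the 1-based index.
            promises.push(results[i].save({ "Rank": i + 1 }));
        }
        // Resolve only when every queued save has finished.
        return Parse.Promise.when(promises);
    }).then(function() {
        status.success("Weekly Ranks Assigned");
    }, function(error) {
        status.error("Uh oh. Weekly ranking failed");
    });
});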

Parse.User query not working in Cloud Code

I am working on a project using Parse where I need some information calculated for each user and updated when they update their account. I created a Cloud Code trigger that does what I need whenever a user account is updated, and it is working well. However, I have about two thousand accounts that were already created that I need to update as well. After hours of trying to get a Cloud Job to work, I decided to simplify it. I wrote the following job to simply count the user accounts. To reiterate: I'm not actually trying to count the users (there are much more efficient ways to do that); I am trying to verify that I can query and loop over the existing user accounts. (The useMasterKey option is in there because I will need it later.)
Parse.Cloud.job("getUserStatistics", function(request, status) {
// Set up to modify user data
Parse.Cloud.useMasterKey();
// Query for all users
var query = new Parse.Query(Parse.User);
var counter = 0;
query.each(function(user) {
counter = counter+1;
}).then(function() {
// Set the job's success status
status.success("Counted all User Accounts.");
}, function(error) {
// Set the job's error status
status.error("Failed to Count User Accounts.");
});
console.log('Found '+counter+' users.');
});
When I run the code, I get:
I2015-07-09T17:29:10.880Z]Found 0 users.
I2015-07-09T17:29:12.863Z]v99: Ran job getUserStatistics with:
Input: "{}"
Result: Counted all User Accounts.
Even more baffling to me, if I add:
query.limit(10);
...the query itself actually fails! (I would expect it to count 10 users.)
That said, if there is a simpler way to trigger an update on all the users in a Parse application, I'd love to hear it!
The reference actually says that:
The query may not have any sort order, and may not use limit or skip.
https://parse.com/docs/js/api/symbols/Parse.Query.html#each
So forget about query.limit(10); that's not relevant here.
Anyway, judging by their example for a background job, it seems you might have forgotten to put a return in your each callback. Also, you called console.log('Found ' + counter + ' users.'); outside of your asynchronous task, which explains why you get 0 results. Maybe try:
query.each(function(user) {
    counter = counter + 1;
    // you'll want to save your changes for each user,
    // therefore, you will need this
    return user.save();
}).then(function() {
    // Set the job's success status
    status.success("Counted all User Accounts.");
    // console.log inside the asynchronous scope
    console.log('Found ' + counter + ' users.');
}, function(error) {
    // Set the job's error status
    status.error("Failed to Count User Accounts.");
});
You can check Parse's example of writing a background job again:
https://parse.com/docs/js/guide#cloud-code-advanced-writing-a-background-job

How to make script execution slow?

I have a task: I need to select data from "TABLE_FROM", modify it, and insert it into "TABLE_TO". The main problem is that the script must run in production and shouldn't hurt the live site's performance, but "TABLE_FROM" contains hundreds of millions of rows. I'm going to run the script using Node.js. What techniques are used to solve this kind of problem, i.e. how do I make this script run "slowly", or in other words "softly", to prevent DB and CPU overload?
The time of script execution is irrelevant. I use Cassandra DB.
Sample code:
var OFFSET = 0;
var BATCHSIZE = 100;
var TIMEOUT = 1000;

function fetchPush() {
    // fetch from TABLE_FROM, possibly in batches
    var rows = fetch(OFFSET, BATCHSIZE);
    // stop once there is nothing left to copy
    if (rows.length === 0) return;
    // push to TABLE_TO
    push(rows);
    // advance the window and do the next batch after the timeout
    OFFSET += BATCHSIZE;
    setTimeout(fetchPush, TIMEOUT);
}
Here I'm assuming fetch and push are blocking calls; for async processing you could (obviously) use async.
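For the async case, here is a minimal sketch of the same throttled loop with promises. fetchBatch and pushBatch are hypothetical promise-returning wrappers around the Cassandra driver; the batch size and pause are the same illustrative values as above:
// fetchBatch(offset, limit) is assumed to resolve to an array of rows from
// TABLE_FROM; pushBatch(rows) is assumed to resolve once the rows are
// written to TABLE_TO. Both are hypothetical helpers.
function sleep(ms) {
    return new Promise(function(resolve) { setTimeout(resolve, ms); });
}

async function copyTable() {
    var offset = 0;
    var BATCHSIZE = 100;
    var PAUSE_MS = 1000;
    while (true) {
        var rows = await fetchBatch(offset, BATCHSIZE); // read one batch
        if (rows.length === 0) break;                   // nothing left to copy
        await pushBatch(rows);                          // write the batch
        offset += BATCHSIZE;
        await sleep(PAUSE_MS);                          // throttle between batches
    }
}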
