How can I receive real-time updates from a long asynchronous process?

I'm writing a small, internal web application that reads in form data and creates an Excel file, which then gets emailed to the user.
However, I'm struggling to understand how I can give the user real-time updates while the process runs. Sometimes the process takes 10 seconds, and sometimes it takes 5 minutes.
Currently the user waits until the process is complete before seeing any results; they see no updates while it runs. The front end waits for a 201 response from the server before displaying the report information, so the user is "blocked" until the Report Creation (RC) process finishes.
I'm having difficulty understanding how I can start the RC process asynchronously while still letting the user navigate to other pages of the site, or see updates happening in the background. I should clarify that some of the steps in the RC process use Promises.
I'd like to poll the server every second to get an update on the report being generated.
Here's some simple code to clarify my understanding:
Endpoints
// CREATE REPORT
router.route('/report')
  .post(function(req, res, next) {
    // Generate a unique ID to keep track of the report later on.
    const uid = generateRandomID();
    // Start the report process ... this keeps executing even after the response is returned.
    CustomReportLibrary.createNewReport(req.formData, uid);
    // Respond with a successful creation, including the ID the client can poll with.
    res.status(201).json({ id: uid });
  });

// GET REPORT
router.route('/report/:id')
  .get(function(req, res, next) {
    // Get our report from the ID.
    let report = CustomReportLibrary.getReport(req.params.id);
    // Respond with the report data, or 404 if no report matches.
    if (report) { res.status(200).json(report); }
    else { res.status(404).end(); }
  });
CustomReportLibrary
// Initialize an array to hold reports
let _dataStorage = [];

function createNewReport(data, id) {
  // Create an object to store our report information
  let reportObject = {
    id: id,
    status: 'Report has started the process',
    data: data
  };
  // Add the new report to the module-level array.
  _dataStorage.push(reportObject);
  // ... continue with report generation. Assume this takes 5 minutes.
  // ...
  // ... update reportObject.status after each step (find it by id rather
  // ... than by array position, in case several reports run concurrently)
  // ...
  // ... finish generation.
}

function getReport(id) {
  // Iterate through the array until a report with a matching ID is found.
  // Return the report if a match is found.
  // Return null if no match is found.
}
From my understanding, CustomReportLibrary.createNewReport() will keep executing in the background even after the 201 response is returned. On the front end, I'd make an AJAX call to /report/:id on an interval to get updates on my report. Is this the right way to do this? Is there a better way?
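For reference, a minimal sketch of what that client-side polling loop might look like (the endpoint URL matches the routes above, but the 'finished' status value and the updateProgressUI() helper are hypothetical):

function pollReport(id) {
  const timer = setInterval(function() {
    fetch('/report/' + id)
      .then(function(res) { return res.ok ? res.json() : null; })
      .then(function(report) {
        if (!report) { return; } // 404 or error; keep polling
        updateProgressUI(report.status); // hypothetical UI helper
        // 'finished' is an assumed value for the final status
        if (report.status === 'finished') { clearInterval(timer); }
      })
      .catch(function(err) { console.error(err); });
  }, 1000); // poll every second, as described above
}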

I think you are on the right track. HTTP 202 ("The request has been accepted for processing, but the processing has not been completed") is the proper status code for your case.
It can be done like this:
the client sends POST /reports, the server starts creating the new report and returns:
202 Accepted
Location: http://api.domain.com/reports/1
the client then issues GET /reports/1 to get the status of the report
The whole flow is asynchronous, so the user is never blocked.
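Applied to the Express routes above, the POST handler might then look something like this (a sketch only; the response body shape is an illustrative assumption):

// CREATE REPORT -- respond 202 and tell the client where to poll.
router.route('/report')
  .post(function(req, res, next) {
    const uid = generateRandomID();
    // Fire and forget: generation continues after the response is sent.
    CustomReportLibrary.createNewReport(req.formData, uid);
    // 202 Accepted: processing has started but is not complete.
    res.status(202)
      .location('/report/' + uid) // where the client should poll
      .json({ id: uid });
  });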

Related

How can I define a "global" job variable that each processor can read/update using Spring Batch?

I have a Spring Batch job with a reader/processor/writer that reads a batch of EmailQueue records, processes/sends them, and then writes the results (success, fail) back into the EmailQueue database table. However, if during the job 5+ emails fail to send (e.g. because the email API is down), I would like the processor to stop attempting sends and instead mark the remaining EmailQueue objects as "failed", which the writer then stores back into the database. I would like my processor to look something like the one below, but I can't figure out how to have a "global" monitor for the job that the processor can access.
It may be important to note that my appUserEmailSender.send(emailQueue) method doesn't throw an error if the email failed to send, it only stores the results in the EmailQueue object itself so I can write the results back into the EmailQueue db table.
public EmailQueue process(@NonNull EmailQueue emailQueue) {
    // can this variable be defined globally for each job somewhere???
    int emailFailSendCount = 0;
    // if the fail count is less than 5, attempt to send the email
    if (emailFailSendCount < 5) {
        // send the email
        EmailQueue result = appUserEmailSender.send(emailQueue);
        // if the send failed, increase the fail count
        if (EmailQueueState.FAILED == result.getEmailQueueState()) {
            emailFailSendCount++;
        }
    // once the fail count reaches 5, don't attempt to send; just mark as "failed"
    } else {
        emailQueue.setEmailQueueState(EmailQueueState.FAILED);
    }
    return emailQueue;
}
Clearly the above code won't work as written, but my question is: can I define a "global" emailFailSendCount variable that the processor can read and update on each processing step?

Firestore transaction produces console error: FAILED_PRECONDITION: the stored version does not match the required base version

I have written a bit of code that allows a user to upvote / downvote recipes in a manner similar to Reddit.
Each individual vote is stored in a Firestore collection named votes, with a structure like this:
{username,recipeId,value} (where value is either -1 or 1)
The recipes are stored in the recipes collection, with a structure somewhat like this:
{title,username,ingredients,instructions,score}
Each time a user votes on a recipe, I need to record their vote in the votes collection, and update the score on the recipe. I want to do this as an atomic operation using a transaction, so there is no chance the two values can ever become out of sync.
Following is the code I have so far. I am using Angular 6; however, I couldn't find any TypeScript examples showing how to handle multiple get() calls in a single transaction, so I ended up adapting some Promise-based JavaScript code that I found.
The code seems to work, but there is something happening that is concerning. When I click the upvote/downvote buttons in rapid succession, some console errors occasionally appear. These read POST https://firestore.googleapis.com/v1beta1/projects/myprojectname/databases/(default)/documents:commit 400 (). When I look at the actual response from the server, I see this:
{
  "error": {
    "code": 400,
    "message": "the stored version (1534122723779132) does not match the required base version (0)",
    "status": "FAILED_PRECONDITION"
  }
}
Note that the errors do not appear when I click the buttons slowly.
Should I worry about this error, or is it just a normal result of the transaction retrying? As noted in the Firestore documentation, a "function calling a transaction (transaction function) might run more than once if a concurrent edit affects a document that the transaction reads."
Note that I have tried wrapping try/catch blocks around every single operation below, and there are no errors thrown. I removed them before posting for the sake of making the code easier to follow.
Very interested in hearing any suggestions for improving my code, regardless of whether they're related to the HTTP 400 error.
async vote(username, recipeId, direction) {
  let value;
  if (direction == 'up') {
    value = 1;
  }
  if (direction == 'down') {
    value = -1;
  }
  // assemble vote object to be recorded in votes collection
  const voteObj: Vote = { username: username, recipeId: recipeId, value: value };
  // get references to both vote and recipe documents
  const voteDocRef = this.afs.doc(`votes/${username}_${recipeId}`).ref;
  const recipeDocRef = this.afs.doc('recipes/' + recipeId).ref;
  await this.afs.firestore.runTransaction( async t => {
    const voteDoc = await t.get(voteDocRef);
    const recipeDoc = await t.get(recipeDocRef);
    // DocumentSnapshot.get() is synchronous, so no await is needed here
    const currentRecipeScore = recipeDoc.get('score');
    if (!voteDoc.exists) {
      // This is a new vote, so add it to the votes collection
      // and apply its value to the recipe's score
      t.set(voteDocRef, voteObj);
      t.update(recipeDocRef, { score: (currentRecipeScore + value) });
    } else {
      const voteData = voteDoc.data();
      if ( voteData.value == value ) {
        // existing vote is the same as the button that was pressed, so delete
        // the vote document and revert the vote from the recipe's score
        t.delete(voteDocRef);
        t.update(recipeDocRef, { score: (currentRecipeScore - value) });
      } else {
        // existing vote is the opposite of the one pressed, so update the
        // vote doc, then apply it to the recipe's score by doubling it.
        // For example, if the current score is 1 and the user reverses their
        // +1 vote by pressing -1, we apply -2 so the score will become -1.
        t.set(voteDocRef, voteObj);
        t.update(recipeDocRef, { score: (currentRecipeScore + (value * 2)) });
      }
    }
    return Promise.resolve(true);
  });
}
According to Firebase developer Nicolas Garnier, "What you are experiencing here is how Transactions work in Firestore: one of the transactions failed to write because the data has changed in the mean time, in this case Firestore re-runs the transaction again, until it succeeds. In the case of multiple Reviews being written at the same time some of them might need to be ran again after the first transaction because the data has changed. This is expected behavior and these errors should be taken more as warnings."
In other words, this is a normal result of the transaction retrying.
I used RxJS throttleTime to prevent the user from flooding the Firestore server with transactions by clicking the upvote/downvote buttons in rapid succession, and that greatly reduced the occurrences of this 400 error. In my app, there's no legitimate reason someone would need to click upvote/downvote dozens of times per second. It's not a video game.
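For illustration, the throttling can be as simple as routing button clicks through a Subject inside the component (a sketch assuming RxJS 6 pipeable operators; the onVoteButton() wiring is hypothetical):

import { Subject } from 'rxjs';
import { throttleTime } from 'rxjs/operators';

// In the component: push button clicks through a Subject
// instead of calling vote() directly.
voteClicks = new Subject();

ngOnInit() {
  // Let at most one click through per second; extra clicks are dropped.
  this.voteClicks
    .pipe(throttleTime(1000))
    .subscribe(v => this.vote(v.username, v.recipeId, v.direction));
}

// The upvote/downvote buttons call this instead of vote().
onVoteButton(username, recipeId, direction) {
  this.voteClicks.next({ username: username, recipeId: recipeId, direction: direction });
}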

When an API call fails on a Laravel job

I have a background job that fetches data from Google AdWords. My issue arises when I fetch the data using a background worker.
When the response is empty, what is the best thing to do? Is there any way to re-run the job, or what is the best approach?
public function handle()
{
    $googleService = new GoogleAds;
    $data = $googleService->report()
        ->from('CRITERIA_PERFORMANCE_REPORT')
        ->during('20170101', '20170210')
        ->select('CampaignId, Id, Criteria, IsNegative, Clicks, Ctr, Cost, Labels')
        ->getObject();

    if (!isset($data->result) || empty($data->result)) {
        // what to do when no data comes back?
    }

    $this->transform->response($data);
}
You can throw an exception; the job will then go back to your queue, and the worker will try to execute it again.
When you launch your worker, there is a --tries parameter that indicates how many times a job will be attempted before it is moved to the failed_jobs table (for example, php artisan queue:work --tries=3).
You can check the reference in the official documentation.

Parse.User query not working in Cloud Code

I am working on a project using Parse where I need some information calculated for each user and updated when they update their account. I created a Cloud Code trigger that does what I need whenever a user account is updated, and that is working well. However, I have about two thousand accounts that are already created that I need to update as well. After hours of trying to get a Cloud Job to work, I decided to simplify it. I wrote the following job to simply count the user accounts. To reiterate: I'm not actually trying to count the users (there are much more efficient ways to do that); I am trying to verify that I can query and loop over the existing user accounts. (The call to useMasterKey is in there because I will need it later.)
Parse.Cloud.job("getUserStatistics", function(request, status) {
// Set up to modify user data
Parse.Cloud.useMasterKey();
// Query for all users
var query = new Parse.Query(Parse.User);
var counter = 0;
query.each(function(user) {
counter = counter+1;
}).then(function() {
// Set the job's success status
status.success("Counted all User Accounts.");
}, function(error) {
// Set the job's error status
status.error("Failed to Count User Accounts.");
});
console.log('Found '+counter+' users.');
});
When I run the code, I get:
I2015-07-09T17:29:10.880Z]Found 0 users.
I2015-07-09T17:29:12.863Z]v99: Ran job getUserStatistics with:
Input: "{}"
Result: Counted all User Accounts.
Even more baffling to me, if I add:
query.limit(10);
...the query itself actually fails! (I would expect it to count 10 users.)
That said, if there is a simpler way to trigger an update on all the users in a Parse application, I'd love to hear it!
The reference actually says that:
The query may not have any sort order, and may not use limit or skip.
https://parse.com/docs/js/api/symbols/Parse.Query.html#each
So forget about query.limit(10); that's not relevant here.
Anyway, judging by their example of a background job, it seems you might have forgotten to put a return in your each callback. Also, you called console.log('Found '+counter+' users.'); outside of the asynchronous task, which explains why you get 0 results. Maybe try:
query.each(function(user) {
  counter = counter + 1;
  // you'll want to save your changes for each user,
  // therefore you will need this
  return user.save();
}).then(function() {
  // Set the job's success status
  status.success("Counted all User Accounts.");
  // console.log inside the asynchronous scope
  console.log('Found ' + counter + ' users.');
}, function(error) {
  // Set the job's error status
  status.error("Failed to Count User Accounts.");
});
You can check Parse's example of writing a background job again:
https://parse.com/docs/js/guide#cloud-code-advanced-writing-a-background-job

How much data can be saved in one Parse.Object.saveAll request? And how many requests are used for one Parse.Object.saveAll?

Recently, I have been running some tests on parse.com, and I am facing a problem using Parse.Object.saveAll in a background job.
The documentation on parse.com says that a background job can run for 15 minutes. I set up a background job to pour data into the database using the following code:
Parse.Cloud.job("createData", function(request, status) {
var Dummy = Parse.Object.extend("dummy");
var batchSaveArr = [];
for(var i = 0 ; i < 50000 ; i ++){
var obj = new Dummy();
// genMessage() is a function to generate a random string with 5 characters long
obj.set("message", genMessage());
obj.set("numValue",Math.floor(Math.random() * 1000));
batchSaveArr.push(obj);
}
Parse.Object.saveAll(batchSaveArr, {
success: function(list){
status.success("success");
},
error: function(error){
status.error(error.message);
}
});
});
Although this job pours data into the database, its main purpose is to test the Parse.Object.saveAll function. When I run the job, the error "This application has exceeded its request limit." appears in the log. However, the analytics page shows a request count of at most 1. I only ran this job on Parse, and no other requests were made while the background job was running.
It seems that there is some problem with Parse.Object.saveAll, or maybe I have some misunderstanding of this function.
Is anyone else facing the same problem?
How much data can be saved in one Parse.Object.saveAll request?
How many requests are used for one Parse.Object.saveAll?
I asked the question on Facebook and the reply was quite disappointing.
Please follow the link:
https://developers.facebook.com/bugs/439821726158766/
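One common workaround, whatever the internal request accounting turns out to be, is to save in smaller sequential chunks instead of handing all 50,000 objects to a single saveAll call. A rough sketch (assuming the old parse.com JavaScript SDK, where Parse.Promise.as() and a promise-returning saveAll are available; the chunk size of 500 is an arbitrary assumption):

// Save objects in sequential chunks to stay under the rate limit.
function saveInChunks(objects, chunkSize) {
  var promise = Parse.Promise.as();
  for (var i = 0; i < objects.length; i += chunkSize) {
    (function(chunk) {
      // Chain each chunk so only one saveAll is in flight at a time.
      promise = promise.then(function() {
        return Parse.Object.saveAll(chunk);
      });
    })(objects.slice(i, i + chunkSize));
  }
  return promise;
}

// Usage inside the job:
// saveInChunks(batchSaveArr, 500).then(function() {
//   status.success("success");
// }, function(error) {
//   status.error(error.message);
// });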
