When an API call fails in a Laravel job

I have a background job that fetches data from Google AdWords using a background worker. When the response is empty, what is the best thing to do? Is there a way to re-run the job, or what is the best approach?
public function handle()
{
    $googleService = new GoogleAds;
    $data = $googleService->report()
        ->from('CRITERIA_PERFORMANCE_REPORT')
        ->during('20170101', '20170210')
        ->select('CampaignId, Id, Criteria, IsNegative, Clicks, Ctr, Cost, Labels')
        ->getObject();

    if (!isset($data->result) || empty($data->result)) {
        // What to do when no data comes back?
    }

    $this->transform->response($data);
}

You can throw an exception; the job will then go back to your queue, and the worker will try to execute it again.
When you launch your worker, there is a --tries parameter that indicates how many times the job will be attempted before it is moved to the failed_jobs table.
You can check the reference in the official documentation.
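For illustration, a minimal sketch of that approach applied to the handle() method above (the exception type is arbitrary; any unhandled exception sends the job back to the queue):

public function handle()
{
    $googleService = new GoogleAds;
    $data = $googleService->report()
        ->from('CRITERIA_PERFORMANCE_REPORT')
        ->during('20170101', '20170210')
        ->select('CampaignId, Id, Criteria, IsNegative, Clicks, Ctr, Cost, Labels')
        ->getObject();

    if (!isset($data->result) || empty($data->result)) {
        // Throwing releases the job back onto the queue; it is retried
        // until the --tries limit is hit, then lands in failed_jobs.
        throw new \RuntimeException('Empty response from the AdWords report.');
    }

    $this->transform->response($data);
}

Then start the worker with a retry limit, e.g. php artisan queue:work --tries=3 (the exact command depends on your Laravel version).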

Related

How can I define a "global" job variable that each processor can read/update using Spring Batch?

I have a Spring Batch job with a reader/processor/writer that reads a batch of EmailQueue records, processes/sends them, and then writes the results (success, fail) back into the EmailQueue database table. However, if during the job I have 5+ emails that fail to send (e.g. because the email API is down), I would like the processor to not attempt the send, but instead mark the remaining EmailQueue objects as "failed" and then store them back into the database with the writer. I would like my processor to look something like the one below, but I can't figure out how to have a "global" monitor for the job that the processor can access.
It may be important to note that my appUserEmailSender.send(emailQueue) method doesn't throw an error if the email fails to send; it only stores the result in the EmailQueue object itself so I can write the results back into the EmailQueue db table.
public EmailQueue process(@NonNull EmailQueue emailQueue) {
    // Can this variable be defined globally for each job somewhere?
    int emailFailSendCount = 0;

    // If the fail count is less than 5, attempt to send the email
    if (emailFailSendCount < 5) {
        // Send the email
        EmailQueue result = appUserEmailSender.send(emailQueue);
        // If it failed, increase the fail count
        if (EmailQueueState.FAILED == result.getEmailQueueState()) {
            emailFailSendCount++;
        }
    // Once the fail count reaches 5, don't attempt to send; just mark as "failed"
    } else {
        emailQueue.setEmailQueueState(EmailQueueState.FAILED);
    }
    return emailQueue;
}
Clearly the above code wouldn't work, but my question is: can I define a "global" emailFailSendCount variable that the processor can read or update on each processing step?
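One common way to hold per-job state in Spring Batch is the JobExecution's ExecutionContext, which every processor in the job run can read and update. Below is a hedged sketch, not a definitive answer: the wiring names are illustrative, the processor must be registered as a listener (or be step-scoped) for @BeforeStep to fire, and a multi-threaded step would need extra synchronization.

import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.annotation.BeforeStep;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemProcessor;

public class EmailQueueProcessor implements ItemProcessor<EmailQueue, EmailQueue> {

    private static final String FAIL_COUNT_KEY = "emailFailSendCount";

    // Assumed to be injected; taken from the question's code.
    private AppUserEmailSender appUserEmailSender;

    // Job-scoped state shared across the whole job run.
    private ExecutionContext jobContext;

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        this.jobContext = stepExecution.getJobExecution().getExecutionContext();
    }

    @Override
    public EmailQueue process(EmailQueue emailQueue) {
        int failCount = jobContext.getInt(FAIL_COUNT_KEY, 0);
        if (failCount < 5) {
            // Attempt the send; failures are recorded on the returned object.
            EmailQueue result = appUserEmailSender.send(emailQueue);
            if (EmailQueueState.FAILED == result.getEmailQueueState()) {
                jobContext.putInt(FAIL_COUNT_KEY, failCount + 1);
            }
            return result;
        }
        // Five failures seen: skip the send and mark the rest as failed.
        emailQueue.setEmailQueueState(EmailQueueState.FAILED);
        return emailQueue;
    }
}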

How can I receive real-time updates from a long asynchronous process?

I'm writing a small, internal web application that reads in form data and creates an Excel file, which then gets emailed to the user.
However, I'm struggling to understand how I can implement real-time updates for the user as the process is being completed. Sometimes the process takes 10 seconds, and sometimes it takes 5 minutes.
Currently the user waits until the process is complete before they see any results; they do not see any updates while it runs. The front-end waits for a 201 response from the server before displaying the report information, and the user is "blocked" until the Report Creation (RC) process is complete.
I'm having difficulty understanding how I can asynchronously start the RC process and at the same time allow the user to navigate to other pages of the site, or see updates happening in the background. I should clarify here that some of the steps in the RC process use Promises.
I'd like to poll the server every second to get an update on the report being generated.
Here's some simple code to clarify my understanding:
Endpoints
// CREATE REPORT
router.route('/report')
    .post(function(req, res, next) {
        // Generate a unique ID to keep track of the report later on.
        const uid = generateRandomID();
        // Start the report process ... this keeps executing even after a response (201) is returned.
        CustomReportLibrary.createNewReport(req.formData, uid);
        // Respond with a successful creation and the ID to poll on.
        res.status(201).json({ id: uid });
    });

// GET REPORT
router.route('/report/:id')
    .get(function(req, res, next) {
        // Look up the report by its ID.
        let report = CustomReportLibrary.getReport(req.params.id);
        // Respond with the report data, or 404 if it is not found.
        if (report) { res.status(200).json(report); }
        else { res.status(404).end(); }
    });
CustomReportLibrary
// Initialize an array to hold reports
let _dataStorage = [];

function createNewReport(data, id) {
    // Create an object to store our report information
    let reportObject = {
        id: id,
        status: 'Report has started the process',
        data: data
    };
    // Add the new report to the global array.
    _dataStorage.push(reportObject);
    // ... continue with report generation. Assume this takes 5 minutes.
    // ...
    // ... update _dataStorage[length-1].status after each step
    // ...
    // ... finish generation.
}

function getReport(id) {
    // Iterate through the array until a report with a matching ID is found.
    // Return the report if a match is found.
    // Return null if no match is found.
}
From my understanding, CustomReportLibrary.createNewReport() will keep executing in the background even after a 201 response is returned. On the front-end, I'd make an AJAX call to /report/:id on an interval to get updates on my report. Is this the right way to do this? Is there a better way?
I think you are on the right track. HTTP 202 (the request has been accepted for processing, but the processing has not been completed) is a proper way to handle your case.
It can be done like this:
the client sends POST /reports; the server starts creating the new report and returns:
202 Accepted
Location: http://api.domain.com/reports/1
the client then issues GET /reports/1 to get the status of the report
The whole flow is asynchronous, so users are not blocked.
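A minimal Express sketch of that flow, assuming an in-memory status store (all names here are illustrative, not from the original post):

const express = require('express');
const app = express();

// In-memory status store; a real app would use a database or cache.
const reports = {};
let nextId = 1;

app.post('/reports', function(req, res) {
    const id = nextId++;
    reports[id] = { status: 'processing' };

    // Kick off the long-running work without awaiting it.
    generateReport(id).then(function() {
        reports[id].status = 'complete';
    });

    // 202 Accepted plus a Location header tells the client where to poll.
    res.status(202)
        .location('/reports/' + id)
        .json({ id: id, status: 'processing' });
});

app.get('/reports/:id', function(req, res) {
    const report = reports[req.params.id];
    if (report) { res.status(200).json(report); }
    else { res.status(404).end(); }
});

// Stand-in for the real 10-second-to-5-minute report generation.
function generateReport(id) {
    return new Promise(function(resolve) { setTimeout(resolve, 5000); });
}

app.listen(3000);

On the client side, polling GET /reports/:id every second, as the question proposes, completes the loop; clear the interval once the status comes back as 'complete'.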

Parse.User query not working in Cloud Code

I am working on a project using Parse where I need some information calculated for each user and updated when they update their account. I created a Cloud Code trigger that does what I need whenever a user account is updated, and that is working well. However, I have about two thousand accounts that are already created that I need to update as well. After hours of trying to get a Cloud Job to work, I decided to try to simplify it. I wrote the following job to simply count the user accounts. To reiterate: I'm not actually trying to count the users (there are much more efficient ways to do that); I am trying to verify that I can query and loop over the existing user accounts. (The call to useMasterKey is in there because I will need it later.)
Parse.Cloud.job("getUserStatistics", function(request, status) {
// Set up to modify user data
Parse.Cloud.useMasterKey();
// Query for all users
var query = new Parse.Query(Parse.User);
var counter = 0;
query.each(function(user) {
counter = counter+1;
}).then(function() {
// Set the job's success status
status.success("Counted all User Accounts.");
}, function(error) {
// Set the job's error status
status.error("Failed to Count User Accounts.");
});
console.log('Found '+counter+' users.');
});
When I run the code, I get:
I2015-07-09T17:29:10.880Z]Found 0 users.
I2015-07-09T17:29:12.863Z]v99: Ran job getUserStatistics with:
Input: "{}"
Result: Counted all User Accounts.
Even more baffling to me, if I add:
query.limit(10);
...the query itself actually fails! (I would expect it to count 10 users.)
That said, if there is a simpler way to trigger an update on all the users in a Parse application, I'd love to hear it!
The reference actually says that:
The query may not have any sort order, and may not use limit or skip.
https://parse.com/docs/js/api/symbols/Parse.Query.html#each
So forget about "query.limit(10)", that's not relevant here.
Anyway, judging by their example for a background job, it seems you might have forgotten to put a return in your "each" function. Plus, you called console.log('Found '+counter+' users.'); outside of your asynchronous task, which explains why you get 0 users. Maybe try:
query.each(function(user) {
    counter = counter + 1;
    // You'll want to save your changes for each user,
    // therefore you will need this
    return user.save();
}).then(function() {
    // Set the job's success status
    status.success("Counted all User Accounts.");
    // console.log inside the asynchronous scope
    console.log('Found ' + counter + ' users.');
}, function(error) {
    // Set the job's error status
    status.error("Failed to Count User Accounts.");
});
You can check Parse's example of writing a background job again:
https://parse.com/docs/js/guide#cloud-code-advanced-writing-a-background-job

How many objects can be saved in one Parse.Object.saveAll request? And how many API requests are used for one Parse.Object.saveAll?

Recently I have been running some tests on parse.com, and I am facing a problem using Parse.Object.saveAll in a background job.
The parse.com documentation says that a background job can run for 15 minutes. I set up a background job to pour data into the database using the following code:
Parse.Cloud.job("createData", function(request, status) {
var Dummy = Parse.Object.extend("dummy");
var batchSaveArr = [];
for(var i = 0 ; i < 50000 ; i ++){
var obj = new Dummy();
// genMessage() is a function to generate a random string with 5 characters long
obj.set("message", genMessage());
obj.set("numValue",Math.floor(Math.random() * 1000));
batchSaveArr.push(obj);
}
Parse.Object.saveAll(batchSaveArr, {
success: function(list){
status.success("success");
},
error: function(error){
status.error(error.message);
}
});
});
Although it is used to pour data into the database, the main purpose is to test the function Parse.Object.saveAll. When I run this job, the error "This application has exceeded its request limit." appears in the log. However, the analytics page shows me that the request count is less than or equal to 1. I only ran this job on Parse, and no other requests were made while the background job was running.
It seems that there is some problem with Parse.Object.saveAll, or maybe I have some misunderstanding of this function.
Is anyone else facing the same problem?
How many objects can be saved in one Parse.Object.saveAll request?
How many API requests are used for one Parse.Object.saveAll call?
I have asked the question on Facebook and the reply is quite disappointing.
Please follow the link:
https://developers.facebook.com/bugs/439821726158766/
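For what it's worth, a common workaround (an assumption on my part, not taken from the linked reply) is to split the array and chain smaller saveAll calls, so no single call carries tens of thousands of objects:

// Illustrative chunk size; tune it to stay under the request limit.
var CHUNK_SIZE = 500;

function saveInChunks(objects) {
    if (objects.length === 0) {
        // Nothing left: resolve immediately.
        return Parse.Promise.as();
    }
    var chunk = objects.slice(0, CHUNK_SIZE);
    var rest = objects.slice(CHUNK_SIZE);
    // Save one chunk, then recurse on the remainder.
    return Parse.Object.saveAll(chunk).then(function() {
        return saveInChunks(rest);
    });
}

// In the job, replace the single saveAll call with:
saveInChunks(batchSaveArr).then(function() {
    status.success("success");
}, function(error) {
    status.error(error.message);
});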

How to do a bulk Oracle data insert in a web application (JSP)?

I created a JSP web app to insert 4 million rows. The insertion loops over another table: I run a select query, loop over the results, and insert each row into the other table. But every time I run this, the page loads slowly and ends with a timeout, so not all the data is inserted successfully.
I tried to use a bulk data insert, but it didn't help me.
By the way, this is the sample code:
pstatement = connection.prepareStatement(insertquery);
int countbatch = 0;
while (rset.next()) {
    pstatement.setString(1, rset.getString(1));
    pstatement.setString(2, request.getParameter("promo"));
    pstatement.addBatch();
    out.print(rset.getString(1) + " Added<br>");
    // Flush the batch to the database every batchSize rows
    if (++countbatch % batchSize == 0) {
        pstatement.executeBatch();
    }
}
// Execute the remaining rows in the final, partial batch
pstatement.executeBatch();
Another attempt using an INSERT ... SELECT, which also didn't help:
String insertquery = "INSERT INTO datapin (msisdn, nama_promo) SELECT msisdnlist.msisdn AS msisdn, ? AS nama_promo FROM msisdnlist";
pstatement = connection.prepareStatement(insertquery);
pstatement.setString(1, request.getParameter("promo"));
pstatement.executeUpdate();
Does anyone have a better idea? Thanks :)
Is it required that the JSP page waits for the 4M inserts to complete?
I think waiting for a long-lasting operation is always tricky in a web app.
You might start a thread in the request handler that does the inserts asynchronously.
The request handler could return a token.
This token could then be used by a second JSP page to periodically check the status of the thread, as in the sketch below.
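A hedged sketch of that token idea in plain Java; the class and method names are mine, not from the answer:

import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class InsertJobRegistry {

    private static final ExecutorService pool = Executors.newSingleThreadExecutor();
    private static final Map<String, String> statuses = new ConcurrentHashMap<>();

    // Called from the request handler: starts the inserts and returns a token.
    public static String startInserts(final Runnable insertTask) {
        final String token = UUID.randomUUID().toString();
        statuses.put(token, "RUNNING");
        pool.submit(new Runnable() {
            public void run() {
                try {
                    insertTask.run(); // the batched JDBC inserts
                    statuses.put(token, "DONE");
                } catch (RuntimeException e) {
                    statuses.put(token, "FAILED: " + e.getMessage());
                }
            }
        });
        return token;
    }

    // Called from the second JSP page to poll the status.
    public static String getStatus(String token) {
        return statuses.getOrDefault(token, "UNKNOWN");
    }
}

The insert page would call InsertJobRegistry.startInserts(...) with a Runnable wrapping the batched JDBC code and print the token; the status page would call InsertJobRegistry.getStatus(token).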
