I am trying to trigger a Spring Batch execution from an endpoint. I have implemented a service at the backend, and from Vue I am trying to make a call to that endpoint.
async trigger(data) {
  let response = await Axios.post('')
  console.log(response.data.message)
}
My service at the backend returns the response "Batch started" and does the execution in the background, since it is async, but it does not report back once the job is executed (I see the status only in the console). In such a scenario, how can I await the call from Vue until the service execution completes? I understand that the service sends no response once execution is complete/failed. Are there any changes I need to make, either at the backend or the frontend, to support this? Please let me know your thoughts.
It's like you said: the backend service is asynchronous, which means that once a line of code has been executed, it moves on to the next line. If no next line exists, the function exits, the script closes, and the server sends an empty response back to the frontend.
Your options are:
Implement a websocket that broadcasts back when the service has completed, and listen for that instead.
Use a timeout/polling function to watch for a flag change within the service that indicates that the service has finished its duties, or
Don't use an asynchronous service.
how can i await the call from vue for the service execution to complete
I would not recommend that, since the job may take too long to complete and you don't want your web client to wait that long for a reply. When configured with an asynchronous task executor, the job launcher immediately returns a JobExecution with an ID, which you can inspect later on.
Please check the Running Jobs from within a Web Container section of the Spring Batch reference documentation for more details and code examples.
My suggestion is that you make the front-end query for the job status instead of waiting for the job to complete and respond, because the job may take a very long time to complete.
Your API that triggers the job should return the job ID, which you can get from the JobExecution object returned when you call JobLauncher.run.
You then implement a query API in your backend that returns the status of the job by job ID. You can implement this using the Spring Batch JobExplorer.
Your front-end can then call this query API at an interval (e.g. every 30 seconds or 5 minutes, depending on your job) to get the job status. This prevents your app from getting stuck waiting for the job and from hitting time-out errors.
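The poll-until-done loop described above can be sketched in plain Java (for illustration only; the statusQuery supplier stands in for the HTTP call the front-end would make to your hypothetical status endpoint, and the status strings mirror Spring Batch's BatchStatus names):

```java
import java.util.function.Supplier;

public class JobStatusPoller {

    // Polls a status source at a fixed interval until a terminal status is seen
    // or the attempt budget runs out. statusQuery stands in for the query API
    // that looks up the job by its execution ID.
    static String pollUntilDone(Supplier<String> statusQuery, long intervalMillis, int maxAttempts) {
        for (int i = 0; i < maxAttempts; i++) {
            String status = statusQuery.get();
            if ("COMPLETED".equals(status) || "FAILED".equals(status)) {
                return status; // terminal state: stop polling
            }
            try {
                Thread.sleep(intervalMillis); // wait before the next poll
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return "INTERRUPTED";
            }
        }
        return "TIMED_OUT"; // give up and surface a timeout to the caller
    }
}
```

The same shape applies on the Vue side with setInterval or a recursive setTimeout around the Axios call.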
I have a simple integration flow that polls data from a database based on a cron job, publishes on a DirectChannel, then does a split and transformations, publishes on another executor-service channel, does some operations, and finally publishes to an output channel; it's written using the DSL style.
I also have an endpoint where I might receive an HTTP request to trigger this flow; at that point I send messages to one of the mentioned channels to trigger the flow.
I want to make sure that the manual trigger doesn't happen if the flow is already running due to either the cron job or another request.
I have used the isRunning method of the StandardIntegrationFlow, but it seems that it's not thread safe.
I also tried using .wireTap(myService) and .handle(myService), where this service has an AtomicBoolean flag, but the flag got set for every message, which is not a solution.
I want to know whether the flow is running without much intervention on my side, and if this is not supported, how I can apply the atomic-boolean logic to the overall flow and not to every message.
How can I simulate the race condition in a test, in order to make sure my implementation prevents it?
The IntegrationFlow is just a logical container for the configuration phase. It does have those lifecycle methods, but only for internal framework logic. Even though they are there, they don't help, because endpoints are always running, waiting for some event or input message to react to.
It is hard to control all of that, since it is in an async state, as you explain. Even if we stop the SourcePollingChannelAdapter at the beginning of the flow to let your manual call do something, it doesn't mean that messages in other threads are no longer in process. The AtomicBoolean cannot help here for the same reason: even if you set it to true in MessageSourceMutator.beforeReceive() and reset it back to false in afterReceive() when the message is null, it still doesn't mean that the messages pushed downstream in other threads have already been processed.
You might consider using an aggregator to reset the AtomicBoolean at the end of the batch: since you mention that you pull data from the DB, there is probably a number of records per poll that you can track downstream. This way your manual call would be skipped until the aggregator has collected the results for that batch.
You also need to think about stopping the SourcePollingChannelAdapter at the moment the manual action is permitted, so there won't be any further race conditions with the cron.
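The guard-plus-counter idea (one AtomicBoolean for the whole flow, released only when the aggregator has seen every message of the batch) can be sketched in plain Java; this is not Spring Integration API, just a minimal illustration of the logic, and the test below also shows how to assert the "busy" race deterministically:

```java
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicInteger;

public class FlowGuard {

    private final AtomicBoolean running = new AtomicBoolean(false);
    private final AtomicInteger inFlight = new AtomicInteger(0);

    // Called by the cron poller or the manual endpoint before sending messages.
    // compareAndSet makes the check-and-claim a single atomic step, so two
    // concurrent triggers cannot both win.
    public boolean tryStartBatch(int messageCount) {
        if (!running.compareAndSet(false, true)) {
            return false; // another trigger already owns the flow
        }
        inFlight.set(messageCount);
        return true;
    }

    // Called once per message at the end of the flow (e.g. from an
    // aggregator's release): the flag flips back only when the whole
    // batch is done, not per message.
    public void messageDone() {
        if (inFlight.decrementAndGet() == 0) {
            running.set(false);
        }
    }

    public boolean isRunning() {
        return running.get();
    }
}
```

The per-message flag problem from the question disappears because only the last messageDone() of the batch releases the guard.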
We are creating an application using Laravel, say example.com. Our application has a POST API, example.com/api/order-place. In this API we store some data in our database and send a success response to our customer. On the same request we also call a third-party application to get some other data (third-party.com/api/get-data). We are using a queued job to fetch this data, so the main order-place request journey is not hampered.
But sometimes the third-party API service is down. At that time, we want to store that third-party API call somewhere (a queue), and when the third-party service is up again, we want to process all the queued jobs.
How could we achieve this? Is it possible to solve this problem using Laravel queues? That is, when the third-party application is down we hold our queue, and when it is up again we process the jobs.
We could do this using queue retries on failed jobs, but we don't want that. We just want to hold the queue while the third-party application is down.
It should work like this:
Create a helper function to detect whether the third-party API is up or down.
Create a class or helper function for processing the third-party API request; throw an error if the request to the third-party API fails (you might want to call #1 here).
Create a job with no retries (it must only run once) and call #2 in its handle method.
Push your job onto a different queue, e.g. WhatEverJob::dispatch()->onQueue('apirequest');. You should also process this queue in your supervisor worker, e.g.
php artisan queue:work --queue=default,apirequest
Create a task scheduler (cron) that runs every hour, minute, or whatever suits you. It should first run #1 (or query failed_jobs first, whatever suits you) and exit if the API is down. If it is up, query the failed_jobs table, pulling only rows whose queue column is apirequest. You can filter further on payload->displayName, which should be your job class, and you can also check payload->maxTries and do something if it exceeds the number of tries. Then manually retry each failed entry.
Sample:
if ( ThirdPartyAPI::is('down') ) {
    return;
}

// pluck() returns a Collection, so use isEmpty()/implode() on it
$failedEntries = DB::table('failed_jobs')->where('queue', 'apirequest')->pluck('uuid');

if ( $failedEntries->isEmpty() ) {
    return;
}

\Artisan::call('queue:retry ' . $failedEntries->implode(' '));
We have a middleware that depends on another system to execute payment requests. This third-party system usually sends a webhook later, once a payment request initiated from our end has been processed and completed at their end. Sometimes they fail to send the webhook, or delay it significantly, and there is no retry mechanism at their end. However, they do have a status-query API we can use to learn the current status of a payment request.
We update our payment status based on this webhook, and this is vital for our system. For this use case we have found two ways to handle a failed webhook:
Run a scheduler that caters for failed webhook requests and checks with their status-query API.
Implement a queue, where a new entry is added when an original payment request takes place, and fire the status-query API using time-out events, e.g. with SQS.
Each approach has its own pros and cons. Is there any other way to handle this use case? If not, which of the two would be the better choice?
One option is to use an orchestrator like temporal.io to implement your business logic. The code to act on the webhook, as well as to poll the status API in parallel, would be pretty simple.
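Whichever option is chosen, the reconciliation pass itself is small. A minimal plain-Java sketch of option 1 (scheduler plus status query), where statusApi and the status strings are hypothetical stand-ins for the third party's query endpoint and localStore stands in for the payments table:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public class WebhookReconciler {

    // One scheduled pass: ask the third party about every payment whose
    // webhook never arrived, persist terminal answers, and return the ids
    // that must be checked again on the next run.
    static List<String> reconcile(List<String> pendingIds,
                                  Function<String, String> statusApi,
                                  Map<String, String> localStore) {
        List<String> stillPending = new ArrayList<>();
        for (String id : pendingIds) {
            String remote = statusApi.apply(id);
            if ("SUCCESS".equals(remote) || "FAILED".equals(remote)) {
                localStore.put(id, remote); // webhook was missed; fix our record
            } else {
                stillPending.add(id); // not terminal yet; retry next tick
            }
        }
        return stillPending;
    }
}
```

Option 2 is the same loop, except each id carries its own delayed timer (e.g. an SQS delay) instead of sharing one cron tick.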
I'm starting a project in Spring Batch. My plan is the following:
Create a Spring Boot app.
Expose an API to submit a job (without executing it) that returns the job execution ID, so that other clients can track the progress of the job later.
Create a scheduler for running jobs; I want to have logic that decides how many jobs I can run at any moment.
The issue is that my batch service may receive many requests to start jobs, and I want to put each job execution in a pending status first; a scheduler will later check the jobs in pending status and, depending on my logic, decide whether it should run another set of jobs.
Is that possible to do in Spring Batch, or do I need to implement it from scratch?
The common way of addressing such a use case is to decouple job submission from job execution using a queue. This is described in detail in the Launching Batch Jobs through Messages section. Your controller can accept job requests and put them in a queue. The scheduler can then control how many requests to read from the queue and launch jobs accordingly.
Spring Batch provides all the building blocks (JobLaunchRequest, JobLaunchingMessageHandler, etc.) to implement this pattern.
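The decoupling can be sketched in plain Java (a simplification, not the Spring Integration API: the real thing would send JobLaunchRequest messages over a channel to a JobLaunchingMessageHandler, and the ids would be real JobExecution ids):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class PendingJobScheduler {

    private final BlockingQueue<String> pending = new LinkedBlockingQueue<>();

    // Controller side: accept the request and return immediately with a
    // tracking handle; nothing is executed yet.
    public String submit(String jobName) {
        pending.offer(jobName);
        return "pending:" + jobName;
    }

    // Scheduler side: on each tick, drain at most maxConcurrent requests
    // and launch them (here you would call jobLauncher.run(job, params)
    // for each entry). Everything else stays pending.
    public List<String> launchNextBatch(int maxConcurrent) {
        List<String> launched = new ArrayList<>();
        pending.drainTo(launched, maxConcurrent);
        return launched;
    }
}
```

The "how many jobs can run at any moment" logic lives entirely in the scheduler's maxConcurrent decision, so the controller never blocks.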
I have an application that must log all events the user performs to a remote database, so I've chosen to use a web service (the application calls the web service with the event parameters).
I built a remote EJB to perform this, but it is running with bad performance, because the application needs to wait for the web service's response before proceeding with the request.
Is JMS an alternative?
What you suggest?
Thanks.
JMS will be much lighter and can process the events asynchronously. It can be used to capture application events or audit logs for activities occurring in the system. You can send a message to a queue with the proper details, and those messages can be fetched at the receiving end for further processing.
If you are using EJB 3.1, you can annotate your method with @Asynchronous. Such a method can return an AsyncResult implementation of Future, which can be used to retrieve the result later, but @Asynchronous can also be used with methods returning void (fire and forget).
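The fire-and-forget semantics of both suggestions (a JMS send, or a void @Asynchronous method) can be sketched with a plain executor; this is an illustration, not the EJB container mechanism, and the in-memory sink is a stand-in for the remote database or JMS queue:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class AsyncAuditLogger {

    private final ExecutorService executor = Executors.newSingleThreadExecutor();
    private final List<String> sink; // stands in for the remote store / JMS queue

    public AsyncAuditLogger(List<String> sink) {
        this.sink = sink;
    }

    // Returns immediately: the caller is not blocked on the slow remote
    // write, which is exactly why JMS or @Asynchronous fixes the
    // performance problem described above.
    public Future<?> logEvent(String event) {
        return executor.submit(() -> sink.add(event));
    }

    // Flush and stop the background worker.
    public void shutdown() throws InterruptedException {
        executor.shutdown();
        executor.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

With JMS the executor is replaced by the broker, which also gives you persistence and redelivery if the consumer is down.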