Spring @Scheduled vs. Quartz Scheduler: which one is best?

I got a task where we have to publish the metadata of one table to another application via REST web services.
Basically, the need is:
It has to be a weekly schedule, and every week we push the data to them.
It should work in a synchronous way.
The job scheduler will kick off the job and call the REST client.
I am thinking of using the Spring Batch scheduler, as it is simple, and not the Quartz scheduler. Let me know your views and perspectives.
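For reference, the approach I have in mind would look roughly like this; the cron expression, endpoint URL, and metadata loading are just placeholders, and @EnableScheduling would need to be present on a configuration class.

import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import org.springframework.web.client.RestTemplate;

@Component
public class WeeklyMetadataPublisher {

    private final RestTemplate restTemplate = new RestTemplate();

    // Fires every Monday at 06:00; reads the table metadata and pushes it
    // synchronously to the other application's REST endpoint (placeholder URL).
    @Scheduled(cron = "0 0 6 * * MON")
    public void publishMetadata() {
        Object payload = loadMetadata();
        restTemplate.postForEntity("https://other-app.example.com/api/metadata", payload, Void.class);
    }

    private Object loadMetadata() {
        // hypothetical query against the source table
        return "metadata";
    }
}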

Related

Asynchronous Kafka consumer in Spring Batch Application

In our Spring Batch application workers, the item processors interact with another service asynchronously through Kafka. The requirement is that we need an acknowledgement in order to retry failed batches, but the condition is that we must not wait for the acknowledgement.
Is there any mechanism in Spring Batch by which we can asynchronously consume from Kafka?
Is it possible to rerun a specific local worker step in a rerun of the job?
We implement producers and consumers over the same step using a Spring Batch decider. Thus, during the first run it only produces to Kafka, and on the second run it consumes from Kafka.
We are looking for a solution where we can asynchronously consume from Kafka in a Spring Batch application in order to rerun a specific worker step.
Is there any mechanism in Spring Batch by which we can asynchronously consume from Kafka? Is it possible to rerun a specific local worker step in a rerun of the job?
According to your diagram, you are doing that call from an item processor. The closest "feature" you can get from Spring Batch is the AsyncItemProcessor. This is a special processor that processes items asynchronously in a separate thread. The returned Future is unwrapped by an AsyncItemWriter, which writes the result of the call.
Other than that, I do not see any other obvious way to do that with a built-in feature from Spring Batch. So you would have to manage that in a custom ItemProcessor.
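As a rough sketch of that wiring (assuming spring-batch-integration is on the classpath; the Item and Ack types, the kafkaCallProcessor delegate, and the ackWriter bean are hypothetical placeholders):

import org.springframework.batch.integration.async.AsyncItemProcessor;
import org.springframework.batch.integration.async.AsyncItemWriter;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class AsyncProcessingConfig {

    // Wraps the blocking processor so each item is handled on its own thread and a
    // Future<Ack> flows down the chunk instead of the result itself.
    @Bean
    public AsyncItemProcessor<Item, Ack> asyncItemProcessor(ItemProcessor<Item, Ack> kafkaCallProcessor) {
        AsyncItemProcessor<Item, Ack> processor = new AsyncItemProcessor<>();
        processor.setDelegate(kafkaCallProcessor);                 // the call that waits for the acknowledgement
        processor.setTaskExecutor(new SimpleAsyncTaskExecutor());  // one thread per item; tune for real use
        return processor;
    }

    // Unwraps the Future produced by the AsyncItemProcessor before delegating to the
    // real writer, so the chunk commits only acknowledged results.
    @Bean
    public AsyncItemWriter<Ack> asyncItemWriter(ItemWriter<Ack> ackWriter) {
        AsyncItemWriter<Ack> writer = new AsyncItemWriter<>();
        writer.setDelegate(ackWriter);
        return writer;
    }
}

// Hypothetical item and acknowledgement types for illustration.
class Item {}
class Ack {}

The step would then be declared with an <Item, Future<Ack>> chunk so the futures produced by the processor are handed to the writer.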

Spring boot job scheduler start and stop

I am new to the Spring Boot scheduler. I want to schedule multiple jobs to start and stop based on user requirements. The start and stop have to be exposed as a RESTful service. Is there a way to achieve this using a Spring scheduler, where I can start a job when the user requests it and stop it when a stop is requested?
If you need to trigger a method on a new thread, annotate it with @Async.
If you need to start a job on a user trigger and stop it on a user trigger, Spring Batch is a good option, if it is allowed.
Running a Spring Batch job:
jobLauncher.run(job, jobParameter);
Stopping a running job:
jobOperator.stop(jobExecution.getId());
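A minimal sketch of exposing start and stop as REST endpoints, assuming a Job bean named exportJob is already defined; the endpoint paths and the timestamp parameter are just illustrative choices:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.JobOperator;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class JobControlController {

    private final JobLauncher jobLauncher;
    private final JobOperator jobOperator;
    private final Job exportJob; // hypothetical job bean

    public JobControlController(JobLauncher jobLauncher, JobOperator jobOperator, Job exportJob) {
        this.jobLauncher = jobLauncher;
        this.jobOperator = jobOperator;
        this.exportJob = exportJob;
    }

    // Starts the job on demand; the timestamp parameter makes each launch a new job instance.
    @PostMapping("/jobs/start")
    public Long start() throws Exception {
        JobExecution execution = jobLauncher.run(exportJob,
                new JobParametersBuilder()
                        .addLong("requestedAt", System.currentTimeMillis())
                        .toJobParameters());
        return execution.getId();
    }

    // Asks Spring Batch to stop the running execution gracefully.
    @PostMapping("/jobs/{executionId}/stop")
    public boolean stop(@PathVariable long executionId) throws Exception {
        return jobOperator.stop(executionId);
    }
}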

Schedule jobs for quartz scheduler from other spring boot application

I have two Spring Boot applications. One handles the creation and scheduling of jobs. In the other, I have configured the Quartz Scheduler, which prepares the job parameters using a shared database and launches the Spring Batch job.
I need to update the running Quartz Scheduler if the user updates or adds a new job from the UI. Also, if the server restarts, I need to restart the Scheduler and the jobs.
How should I update my Quartz Scheduler object when a new job is added or updated by the user? My Quartz Scheduler will always be running. Can I use RestTemplate so that my UI application notifies my scheduler application about the jobs?
You will need to store the jobs to schedule in a database (using the shared database) from the UI application, then create a job and build a trigger for it.
It's hard to say without any code given, but there is a guide that does something like what you want:
https://www.callicoder.com/spring-boot-quartz-scheduler-email-scheduling-example/
It is a dynamic scheduler/trigger example.
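For illustration, here is a rough sketch of how the scheduler application could create or update a durable Quartz job and cron trigger, for example when the UI application notifies it. LaunchBatchJob, the job name, and the cron expression are hypothetical, and a JDBC JobStore is assumed so the schedule survives restarts:

import org.quartz.CronScheduleBuilder;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.springframework.stereotype.Service;

@Service
public class DynamicSchedulingService {

    private final Scheduler scheduler; // Quartz Scheduler auto-configured by Spring Boot

    public DynamicSchedulingService(Scheduler scheduler) {
        this.scheduler = scheduler;
    }

    // Creates or replaces a job and its cron trigger from values stored in the shared database.
    public void scheduleBatchJob(String jobName, String cronExpression) throws SchedulerException {
        JobDetail jobDetail = JobBuilder.newJob(LaunchBatchJob.class)
                .withIdentity(jobName)
                .storeDurably()
                .build();

        Trigger trigger = TriggerBuilder.newTrigger()
                .forJob(jobDetail)
                .withIdentity(jobName + "-trigger")
                .withSchedule(CronScheduleBuilder.cronSchedule(cronExpression))
                .build();

        // With a JDBC JobStore, the job and trigger are persisted and survive a restart.
        scheduler.addJob(jobDetail, true); // replace the job definition if it already exists
        if (scheduler.checkExists(trigger.getKey())) {
            scheduler.rescheduleJob(trigger.getKey(), trigger);
        } else {
            scheduler.scheduleJob(trigger);
        }
    }

    // Hypothetical Quartz Job that would launch the Spring Batch job.
    public static class LaunchBatchJob implements org.quartz.Job {
        @Override
        public void execute(org.quartz.JobExecutionContext context) {
            // look up the job parameters from the shared database and launch the batch job here
        }
    }
}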

Configuring spring Batch tasks in Spring cloud data flow

I have created a project with two REST APIs that launch different jobs. My project is connected to a MySQL database. I would like to monitor both jobs in Spring Cloud Data Flow. Please help me with how to configure SCDF with MySQL so that both jobs are monitored. Additionally, I would like to know whether, if we launch a job by calling the API, SCDF will monitor those job instances. If not, please let me know how we can do that.
Thanks in advance.
Please take a moment to read the Spring Batch Admin to SCDF migration guide. It is a requirement that the jobs are wrapped with the Spring Cloud Task programming model.
Once you have the batch job wrapped as a task, you can register it in SCDF to build task/batch pipelines using SCDF's DSL or the GUI.
As for the datasource, all you have to make sure of is that the same datasource is shared between SCDF and the batch jobs. With this, SCDF's dashboard will automatically list the jobs and their execution details.
Here are a few examples for your reference.
Additionally, I would like to know whether, if we launch a job by calling the API, SCDF will monitor those job instances.
Assuming you're referring to SCDF's task launch API (e.g., a scheduled trigger or another mechanism): if triggered that way, yes, the job executions will be captured in the database, as long as SCDF and the batch jobs share a common datasource, as explained previously.
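As a rough illustration of the Task-wrapping step mentioned above (one way to do it, assuming spring-cloud-starter-task is on the classpath and spring.datasource.* points at the same MySQL database that SCDF uses):

import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

// Wraps the existing batch application as a Spring Cloud Task so SCDF can
// register, launch, and track it; the job definitions themselves stay unchanged.
@EnableTask
@EnableBatchProcessing
@SpringBootApplication
public class BatchTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}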

Spring scheduled task with jms

I'm just starting out with Spring (specifically, I'm starting with Spring Boot) and want to create a long-running program that works on a scheduled task (i.e. @Scheduled), e.g. start processing between 7pm and 11pm. I'm OK with this bit.
The task will take a message from an ActiveMQ queue, process it, sleep a little, then get another and repeat.
Being new to JMS/ActiveMQ as well, is it possible to use the Spring @JmsListener in conjunction with the scheduler to achieve this, and if so, how?
If not, I take it my scheduled task should simply use point-to-point access to the queue to pull messages off. If so, does anyone have a simple example? I prefer to use Spring Boot but can't find any good examples; they all seem to use listeners.
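To make the question concrete, this is roughly what I imagine the point-to-point version would look like (the queue name, cron window, and processing logic are placeholders, and @EnableScheduling is assumed):

import org.springframework.jms.core.JmsTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ScheduledQueueConsumer {

    private final JmsTemplate jmsTemplate;

    public ScheduledQueueConsumer(JmsTemplate jmsTemplate) {
        this.jmsTemplate = jmsTemplate;
        // Use a short receive timeout so the poll returns instead of blocking forever.
        this.jmsTemplate.setReceiveTimeout(2000);
    }

    // Runs every minute between 19:00 and 22:59; each run pulls one message
    // point-to-point, processes it, then waits for the next tick.
    @Scheduled(cron = "0 * 19-22 * * *")
    public void pollQueue() {
        Object payload = jmsTemplate.receiveAndConvert("example.queue"); // placeholder queue name
        if (payload != null) {
            process(payload);
        }
    }

    private void process(Object payload) {
        // placeholder processing logic
        System.out.println("Processed: " + payload);
    }
}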
Thanks.
