How to use Spring Batch, Quartz, or a scheduler with Spring Boot

I'm trying to figure out how to implement Spring Batch with Quartz or a scheduler for the following business logic.
Environment:
I have a reservation table in which multiple reservations can be created by a single client (Client table (one) : Reservation table (many) relationship).
Business logic:
When a reservation enters a specific state, the client is supposed to receive an email notification one hour after the time the admin updated the reservation state.
Is there a simple example I can refer to?
I tried the Quartz library but couldn't quite understand its use cases, so I wasn't able to achieve what I was planning to develop.

Spring Batch does not provide support for scheduling job executions. Once you have defined your Spring Batch job, it is up to you to use whatever library you want to schedule its execution when needed.
If you plan to use the scheduling capabilities provided by the Spring Framework, you can create a scheduled method as follows (note that @Scheduled requires @EnableScheduling on a configuration class):
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
@Component
public class JobScheduler {

    private final Job job;
    private final JobLauncher jobLauncher;

    @Autowired
    public JobScheduler(Job job, JobLauncher jobLauncher) {
        this.job = job;
        this.jobLauncher = jobLauncher;
    }

    // Fires every 10 seconds; the unique "time" parameter makes each run a new job instance
    @Scheduled(cron = "*/10 * * * * *")
    public void launchJob() throws Exception {
        JobParameters jobParameters = new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis())
                .toJobParameters();
        this.jobLauncher.run(this.job, jobParameters);
    }
}
For Quartz, you can refer to the quick start guide: http://www.quartz-scheduler.org/documentation/quartz-2.3.0/quick-start.html
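The "+1 hour" rule itself is plain date arithmetic that the scheduled job (or a batch step's reader) would apply when selecting reservations to notify. A framework-free sketch of that check; the Reservation shape and field names here are assumptions for illustration, not from the question:

```java
import java.time.Duration;
import java.time.Instant;

public class NotificationDue {

    // Hypothetical minimal reservation shape: when the admin last updated
    // the state, and whether the client has already been notified.
    static class Reservation {
        final Instant stateUpdatedAt;
        final boolean notified;

        Reservation(Instant stateUpdatedAt, boolean notified) {
            this.stateUpdatedAt = stateUpdatedAt;
            this.notified = notified;
        }
    }

    // True once at least one hour has passed since the state update
    // and no notification has been sent yet.
    public static boolean isDue(Reservation r, Instant now) {
        return !r.notified
                && !now.isBefore(r.stateUpdatedAt.plus(Duration.ofHours(1)));
    }

    public static void main(String[] args) {
        Reservation r = new Reservation(Instant.parse("2024-01-01T10:00:00Z"), false);
        System.out.println(isDue(r, Instant.parse("2024-01-01T10:30:00Z"))); // false
        System.out.println(isDue(r, Instant.parse("2024-01-01T11:00:00Z"))); // true
    }
}
```

In practice the same condition would usually live in the reader's query (e.g. select rows where the state-update timestamp is at least an hour old and no notification was sent), so each scheduled run only picks up due rows.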

Related

How to handle @Async and @Transactional in Spring Boot, which throws exceptions when handling too many requests?

How can I have @Async and @Transactional on a method that saves a large number of records to the database?
When multiple requests come in, I get this error:
"Error creating bean with name 'entityManagerFactory': Singleton bean creation not allowed while singletons of this factory are in destruction (Do not request a bean from a BeanFactory in a destroy method implementation!)".
I guess some bean is getting closed or destroyed during processing.
Should I use Propagation.REQUIRES_NEW on the transaction (i.e. @Transactional(propagation = Propagation.REQUIRES_NEW)), or is the default propagation fine?
Is it OK for the method to return void or BaseResponse, given that it is just a fire-and-forget method?
import java.util.ArrayList;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Component;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
@Component
public class DbWriterService {

    public static final MDALogger log = MDALoggerFactory.getMdaLogger(DbWriterService.class);

    @Autowired
    private StudentUploadRecordRepo studentUploadRecordRepo;

    private ObjectMapper objectMapper = new ObjectMapper();

    @Async
    @Transactional
    public StudentRosterBaseResponse loadStudentDetailsRecords(List<StudentDetailsRecord> studentDetailsRecords, Integer batchId) {
        log.info("loading Student Details Records of batch = " + batchId, null);
        List<StudentUploadRecord> studentUploadRecordList = new ArrayList<>();
        for (StudentDetailsRecord studentDetailsRecord : studentDetailsRecords) {
            StudentUploadRecord studentUploadRecord = new StudentUploadRecord();
            studentUploadRecord.setStudentDetailJson(
                    objectMapper.convertValue(studentDetailsRecord.getStudentDetails(), JsonNode.class));
            studentUploadRecord.setBatchRowNum((int) studentDetailsRecord.getRowNumber());
            studentUploadRecord.setStudentUploadDetail(studentUploadDetail);
            studentUploadRecordList.add(studentUploadRecord);
        }
        studentUploadRecordRepo.saveAll(studentUploadRecordList);
        log.info("Completed data loading records of batch = " + batchId + " with number of records = " + studentDetailsRecords.size(), null);
        return new StudentRosterBaseResponse();
    }
}
I tried @Async and @Transactional and tested it: it works fine when few requests are coming in, but fails when multiple requests hit the endpoint.

Spring Batch and non-daemon threads

If there is a non-daemon thread in a Spring Batch application, the application never shuts down when the batch terminates, i.e. the shutdown signal never reaches the JVM.
Is this expected behaviour, or does Spring Batch fail to send the signal due to a malfunction?
I attach a very simple application that reproduces the case: https://github.com/ferblaca/SpringBatchDemo
Versions:
Spring boot 2.4.5
Spring Batch 4.3.2
Java 11
This is not related to Spring Batch. Spring Batch does not prevent your JVM from shutting down. As soon as your job is finished, your JVM should terminate; otherwise something else is preventing it from shutting down. In your case, it is the executor service that you've configured with non-daemon threads.
I took your example and removed everything related to Spring Batch, and the same thing happens:
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import javax.annotation.PostConstruct;
import org.apache.commons.lang3.concurrent.BasicThreadFactory;
import org.springframework.boot.WebApplicationType;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
@SpringBootApplication
public class DemoBatchApplication {

    public static void main(String[] args) {
        new SpringApplicationBuilder().sources(com.example.demo.batch.DemoBatchApplication.class)
                .web(WebApplicationType.NONE)
                .run(args);
    }

    private final ScheduledExecutorService scheduledExecutorService = Executors
            .newSingleThreadScheduledExecutor(new BasicThreadFactory.Builder()
                    .namingPattern("task-non-daemon-%d")
                    .daemon(false)
                    .build());

    @PostConstruct
    public void init() {
        this.scheduledExecutorService.scheduleAtFixedRate(() -> {
            System.out.println("Scheduled task non-daemon!!!!");
        }, 1L, 1000L, TimeUnit.MILLISECONDS);
    }
}
If you run this app, you should see the same behaviour: the scheduledExecutorService will keep running since you set it as a non-daemon thread. If you change the daemon flag to true, the JVM will stop as soon as your job is finished. Please check What is a daemon thread in Java?.
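The daemon flag the answer refers to is visible without Tomcat or Spring at all: a thread factory decides it per thread, and only non-daemon threads keep the JVM alive. A plain-JDK sketch of a factory equivalent to BasicThreadFactory built with daemon(true):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ThreadFactory;

public class DaemonDemo {

    // Threads created by this factory will NOT keep the JVM alive,
    // mirroring BasicThreadFactory.Builder()....daemon(true).build().
    public static ThreadFactory daemonFactory() {
        return runnable -> {
            Thread t = new Thread(runnable, "task-daemon");
            t.setDaemon(true);
            return t;
        };
    }

    public static void main(String[] args) throws Exception {
        ScheduledExecutorService ses =
                Executors.newSingleThreadScheduledExecutor(daemonFactory());
        // Confirm the worker thread really is a daemon.
        boolean daemon = ses.submit(() -> Thread.currentThread().isDaemon()).get();
        System.out.println("worker is daemon: " + daemon); // worker is daemon: true
        // No shutdown() needed: with only daemon threads left, the JVM exits
        // as soon as main returns; flip setDaemon(false) and it would hang.
    }
}
```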

Spring integration trigger timer events [closed]

I am still new to Spring Integration so please bear with me.
I have a use case where, for each event received, we have to create a set of timers that will fire when their time is reached. I am looking into the Delayer but am not sure it will satisfy this condition.
Say, for example, when we receive a schedule event for a flight, we process and persist it as per the business logic and create 2 timers for the flight that will fire after an hour. When the clock reaches that hour mark, they will perform some defined action.
I am thinking of a "delayer" with a persistent message store, but I am not sure it scales to a load of 20k timers at a given time.
For debugging purposes I would also like to see the history of the timers that were successfully executed, along with their details.
Please recommend a good approach.
If I understood correctly: you are looking for event-based dynamic schedulers.
You can use REST (the event) and Quartz (the scheduling). Upon hitting the URL localhost/schedule, it will schedule the job per the time given in your REST request.
Refer to the code sample below:
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.scheduling.quartz.SchedulerFactoryBean;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.restro.jobs.request.ScheduleJobsRequest;
import com.restro.jobs.service.SchedulerService;
@Controller
public class JobsScheduleController implements ApplicationContextAware {

    @Autowired
    SchedulerService service;

    @Autowired
    SchedulerFactoryBean schedulerFactoryBean;

    ApplicationContext applicationContext;

    @RequestMapping(value = "/schedule", method = RequestMethod.POST, produces = { "application/json" })
    public @ResponseBody void schedule(@RequestBody ScheduleJobsRequest scheduleJobsRequest)
            throws JsonProcessingException, ClassNotFoundException, SchedulerException {
        Scheduler scheduler = schedulerFactoryBean.getScheduler();
        scheduler.getContext().put("applicationContext", applicationContext);
        service.scheduleJobs(scheduleJobsRequest.getJobName(), scheduleJobsRequest.getGroup(),
                scheduleJobsRequest.getCronExpression(), scheduler);
        System.out.println("scheduled");
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        this.applicationContext = applicationContext;
    }
}
Sample REST Scheduling request
{
    "jobName": "OrderOneSettlementJob",
    "group": "order",
    "cronExpression": "0/30 * * * * ?"
}
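For one-shot timers like the flight example in the question, the core mechanism is small even without Quartz: schedule a task per event and keep the handle so it can be cancelled or inspected later. A plain-JDK sketch (names are illustrative; unlike Quartz with a JDBC job store or a delayer with a persistent message store, this keeps nothing across restarts, which matters at the 20k-timer scale mentioned in the question):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

public class FlightTimerRegistry {

    private final ScheduledExecutorService scheduler =
            Executors.newScheduledThreadPool(2);
    private final Map<String, ScheduledFuture<?>> timers = new ConcurrentHashMap<>();

    // Schedule an action for a flight to fire after the given delay,
    // remembering the handle so it can be cancelled later.
    public void schedule(String flightId, Runnable action, long delayMillis) {
        timers.put(flightId,
                scheduler.schedule(action, delayMillis, TimeUnit.MILLISECONDS));
    }

    // Cancel a pending timer; returns false if it is unknown or already fired.
    public boolean cancel(String flightId) {
        ScheduledFuture<?> f = timers.remove(flightId);
        return f != null && f.cancel(false);
    }

    public void shutdown() {
        scheduler.shutdown();
    }

    public static void main(String[] args) throws Exception {
        FlightTimerRegistry registry = new FlightTimerRegistry();
        CountDownLatch fired = new CountDownLatch(1);
        registry.schedule("AF123", fired::countDown, 50);
        System.out.println("fired: " + fired.await(5, TimeUnit.SECONDS));
        registry.shutdown();
    }
}
```

For the execution-history requirement, the fired action would also write an audit record before completing, which a single timer thread pool makes straightforward.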

How to run DelegatingSecurityContextRunnable every time when tomcat creates new Thread

I have a Spring app which uses Tomcat with WebSockets. I would like the DelegatingSecurityContextRunnable to be executed every time Tomcat creates a new thread, i.e. to wrap the Tomcat thread. Does anyone know how this is done? The reason for the question can be found here.
Maybe this can be done using AOP and some advice?
In Spring Boot you can configure a wrapper by hooking into the Tomcat connector. See this as an example:
@Bean
public EmbeddedServletContainerFactory servletContainerFactory() {
    TomcatEmbeddedServletContainerFactory factory = new TomcatEmbeddedServletContainerFactory();
    factory.addConnectorCustomizers(new TomcatConnectorCustomizer() {
        @Override
        public void customize(Connector connector) {
            AbstractProtocol protocolHandler = (AbstractProtocol) connector.getProtocolHandler();
            TaskQueue taskqueue = new TaskQueue() {
                @Override
                public boolean offer(Runnable e, long timeout, TimeUnit unit) throws InterruptedException {
                    return super.offer(new MyRunnable(e), timeout, unit);
                }

                @Override
                public boolean offer(Runnable o) {
                    return super.offer(new MyRunnable(o));
                }
            };
            TaskThreadFactory tf = new TaskThreadFactory("artur-" + "-exec-", false, 0);
            ThreadPoolExecutor e = new ThreadPoolExecutor(10, 10, 1000, TimeUnit.SECONDS, taskqueue, tf);
            taskqueue.setParent(e);
            protocolHandler.setExecutor(e);
        }
    });
    return factory;
}
And here is my custom Runnable (this can be any wrapper; I did not bother implementing exactly yours):
static class MyRunnable implements Runnable {

    private Runnable r;

    public MyRunnable(Runnable r) {
        this.r = r;
    }

    @Override
    public void run() {
        System.out.println("Custom runable");
        runInner();
    }

    void runInner() {
        r.run();
    }
}
And here are my imports:
import java.util.concurrent.TimeUnit;
import org.apache.catalina.connector.Connector;
import org.apache.coyote.AbstractProtocol;
import org.apache.tomcat.util.threads.TaskQueue;
import org.apache.tomcat.util.threads.TaskThreadFactory;
import org.apache.tomcat.util.threads.ThreadPoolExecutor;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.context.embedded.EmbeddedServletContainerFactory;
import org.springframework.boot.context.embedded.tomcat.TomcatConnectorCustomizer;
import org.springframework.boot.context.embedded.tomcat.TomcatEmbeddedServletContainerFactory;
import org.springframework.boot.web.support.SpringBootServletInitializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.FilterType;
import org.springframework.context.annotation.PropertySource;
What this does:
The Tomcat connector initialises itself. You can set the executor to use, in which case Tomcat will stop creating its own configuration and instead use yours.
By overwriting the offer methods in the queue, you have the chance to wrap your Runnable in any custom Runnable. In my case, for testing, I simply added a Sysout to see that everything is working correctly.
The Threadpool implementation I used is an exact copy of the tomcat default (minus the properties). This way, behaviour stays the same, except that any Runnable is now your delegating wrapper.
When I test that, my console prints:
Custom runable
I hope this is what you were looking for.
I use Spring Boot, but this is essentially a Tomcat issue, not a Spring issue. You can adapt the solution to your specific scenario.
-- Artur
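For reference, what DelegatingSecurityContextRunnable does inside such a wrapper is capture the submitting thread's SecurityContext at construction and install it around run(). The pattern is plain Java; here is a sketch using a hypothetical ThreadLocal-backed context in place of Spring Security's SecurityContextHolder:

```java
public class ContextDelegatingRunnable implements Runnable {

    // Stand-in for Spring Security's SecurityContextHolder.
    static final ThreadLocal<String> CONTEXT = new ThreadLocal<>();

    private final Runnable delegate;
    private final String capturedContext;

    public ContextDelegatingRunnable(Runnable delegate) {
        this.delegate = delegate;
        this.capturedContext = CONTEXT.get(); // captured on the submitting thread
    }

    @Override
    public void run() {
        String previous = CONTEXT.get();
        CONTEXT.set(capturedContext);   // install the caller's context
        try {
            delegate.run();
        } finally {
            CONTEXT.set(previous);      // restore whatever the worker had
        }
    }

    public static void main(String[] args) throws Exception {
        CONTEXT.set("alice");
        final String[] seen = new String[1];
        Runnable wrapped = new ContextDelegatingRunnable(() -> seen[0] = CONTEXT.get());
        Thread worker = new Thread(wrapped);
        worker.start();
        worker.join();
        System.out.println("worker saw: " + seen[0]); // worker saw: alice
    }
}
```

Substituting this for MyRunnable in the queue's offer methods is what would make every Tomcat task run with the submitter's security context.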

NoSuchJobException when running a job programmatically in Spring Batch

I have a job that runs on startup. I want to run this job programmatically at a particular point in my application, not when the app starts.
When running on startup I have no problem, but I get a NoSuchJobException ("No job configuration with the name [importCityFileJob] was registered") when I try to run it programmatically.
After looking on the web, I think it's a problem related to the JobRegistry, but I don't know how to solve it.
Note: my whole batch configuration is set up programmatically; I don't use any XML to configure my batch and my job. That's a big part of my problem, as examples are scarce...
Here is my code to run the Job :
public String runBatch() {
    try {
        JobLauncher launcher = new SimpleJobLauncher();
        JobLocator locator = new MapJobRegistry();
        Job job = locator.getJob("importCityFileJob");
        JobParameters jobParameters = new JobParameters(); // ... ?
        launcher.run(job, jobParameters);
    } catch (Exception e) {
        e.printStackTrace();
        System.out.println("Something went wrong");
    }
    return "Job is running";
}
My Job declaration :
@Bean
public Job importCityFileJob(JobBuilderFactory jobs, Step step) {
    return jobs.get("importFileJob").incrementer(new RunIdIncrementer()).flow(step).end().build();
}
(I tried to replace importCityFileJob by importFileJob in my runBatch method, but it didn't work)
My BatchConfiguration file contains the job declaration above, a step declaration, the itemReader/itemWriter/itemProcessor, and that's all.
I use the #EnableBatchProcessing annotation.
I'm new to Spring Batch and I'm stuck on this problem. Any help would be welcome.
Thanks
Edit: I've solved my problem. I wrote my solution in the answers.
Here is what I had to do to fix my problem:
Add the following Bean to the BatchConfiguration :
@Bean
public JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor(JobRegistry jobRegistry) {
    JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor = new JobRegistryBeanPostProcessor();
    jobRegistryBeanPostProcessor.setJobRegistry(jobRegistry);
    return jobRegistryBeanPostProcessor;
}
Replace the JobLocator with an @Autowired JobRegistry, and use the @Autowired JobLauncher instead of creating one. My run method now has the following code:
@Autowired
private JobRegistry jobRegistry;

@Autowired
private JobLauncher launcher;

public String runBatch() {
    try {
        Job job = jobRegistry.getJob("importCityFileJob");
        JobParameters jobParameters = new JobParameters();
        launcher.run(job, jobParameters);
    } catch (Exception e) {
        e.printStackTrace();
        System.out.println("Something went wrong");
    }
    return "OK";
}
I hope it will help someone.
A JobRegistry won't populate itself. In your example, you're creating a new instance, then trying to get the job from it without having registered it in the first place. Typically, the JobRegistry is configured as a bean along with an AutomaticJobRegistrar that will load all jobs into the registrar on startup. That doesn't mean they will be executed, just registered so they can be located later.
If you're using Java configuration, this should happen automatically using the #EnableBatchProcessing annotation. With that annotation, you'd just inject the provided JobRegistry and the jobs should already be there.
You can read more about the #EnableBatchProcessing in the documentation here: http://docs.spring.io/spring-batch/apidocs/org/springframework/batch/core/configuration/annotation/EnableBatchProcessing.html
You can also read about the AutomaticJobRegistrar in the documentation here: http://docs.spring.io/spring-batch/apidocs/org/springframework/batch/core/configuration/support/AutomaticJobRegistrar.html
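The registry behaviour described above is easy to see once reduced to its essence: a name-to-job map that can only return what was registered into it. A toy sketch (hypothetical types, not the actual Spring Batch API):

```java
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.concurrent.ConcurrentHashMap;

public class ToyJobRegistry {

    interface ToyJob {
        String name();
    }

    private final Map<String, ToyJob> jobs = new ConcurrentHashMap<>();

    public void register(ToyJob job) {
        jobs.put(job.name(), job);
    }

    // Like JobRegistry.getJob: looking up a name that was never registered
    // fails, which is exactly the NoSuchJobException situation in the question.
    public ToyJob getJob(String name) {
        ToyJob job = jobs.get(name);
        if (job == null) {
            throw new NoSuchElementException("No job registered as: " + name);
        }
        return job;
    }

    public static void main(String[] args) {
        ToyJobRegistry registry = new ToyJobRegistry();   // fresh registry: empty
        try {
            registry.getJob("importCityFileJob");         // fails, nothing registered
        } catch (NoSuchElementException e) {
            System.out.println(e.getMessage());
        }
        registry.register(() -> "importCityFileJob");     // what the post-processor does
        System.out.println(registry.getJob("importCityFileJob").name());
    }
}
```

A freshly constructed MapJobRegistry in the question's runBatch method is the empty case above; the JobRegistryBeanPostProcessor (or AutomaticJobRegistrar) is what performs the register step for every job bean at startup.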
I could not find the correct answer on this page. In my case, the Spring Batch jobs were configured in a different configuration class, not annotated with @EnableBatchProcessing. In that case you need to add the Job to the JobRegistry:
import org.springframework.batch.core.Job;
import org.springframework.batch.core.configuration.DuplicateJobException;
import org.springframework.batch.core.configuration.JobRegistry;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.support.ReferenceJobFactory;
import org.springframework.batch.core.job.flow.Flow;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
public class MyBatchJobConfigurations {

    @Bean
    public Job myCountBatchJob(final JobBuilderFactory jobFactory, final JobRegistry jobRegistry, final Flow myJobFlow)
            throws DuplicateJobException {
        final Job countJob = jobFactory.get("myCountBatchJob")
                .start(myJobFlow)
                .end().build();
        ReferenceJobFactory referenceJobFactory = new ReferenceJobFactory(countJob);
        jobRegistry.register(referenceJobFactory);
        return countJob;
    }
}
Adding the following bean in the applicationContext.xml resolved the problem for me
<bean class="org.springframework.batch.core.configuration.support.JobRegistryBeanPostProcessor">
    <property name="jobRegistry" ref="jobRegistry" />
</bean>
I also have this entry in the applicationContext.xml:
<bean id="jobRegistry"
      class="org.springframework.batch.core.configuration.support.MapJobRegistry" />
Another solution: rename the method "importCityFileJob" to "job":
@Bean
public Job job(JobBuilderFactory jobs, Step step) {
    return jobs.get("importFileJob").incrementer(new RunIdIncrementer()).flow(step).end().build();
}
@EnableBatchProcessing
@Configuration
public class SpringBatchCommon {

    @Bean
    public JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor(JobRegistry jobRegistry) {
        JobRegistryBeanPostProcessor postProcessor = new JobRegistryBeanPostProcessor();
        postProcessor.setJobRegistry(jobRegistry);
        return postProcessor;
    }
}
Set the JobRegistry on the JobRegistryBeanPostProcessor; after that you can autowire the JobLauncher and the JobLocator:
Job job = jobLocator.getJob("importFileJob");
JobParametersBuilder jobBuilder = new JobParametersBuilder();
// set any parameters if required
jobLauncher.run(job, jobBuilder.toJobParameters());
