Spring-Boot with Quartz and multiple schedulers

I am working with a scenario where we have one database with multiple schemas, one schema for each customer. This allows each customer to set different schedules for their jobs. All schemas have the same set of jobs, only the schedules differ.
I need to write one Spring-Boot app to run all jobs from all schemas.
It seems like this would be done by defining different quartz.properties for each schema, and then configuring a different Scheduler for each one, like this:
@SpringBootApplication
@Configuration
public class MyApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }

    @Bean
    public Scheduler schedulerA(Trigger trigger, JobDetail job) throws Exception {
        StdSchedulerFactory factory = new StdSchedulerFactory();
        factory.initialize(new ClassPathResource("quartzA.properties").getInputStream());
        Scheduler scheduler = factory.getScheduler();
        scheduler.setJobFactory(springBeanJobFactory());
        scheduler.scheduleJob(job, trigger);
        scheduler.start();
        return scheduler;
    }

    @Bean
    public Scheduler schedulerB(Trigger trigger, JobDetail job) throws Exception {
        StdSchedulerFactory factory = new StdSchedulerFactory();
        factory.initialize(new ClassPathResource("quartzB.properties").getInputStream());
        Scheduler scheduler = factory.getScheduler();
        scheduler.setJobFactory(springBeanJobFactory());
        scheduler.scheduleJob(job, trigger);
        scheduler.start();
        return scheduler;
    }
}
My question is, is this correct? Can I just define these schedulers in my SpringBootApplication class annotated with #Configuration, and expect it to work (assuming the properties are correct)? Am I missing anything?
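For completeness: the snippet above calls a springBeanJobFactory() helper that is not shown in the question. A minimal sketch of such a bean (an assumption for illustration, not part of the original post) could return an org.springframework.scheduling.quartz.SpringBeanJobFactory:

    @Bean
    public SpringBeanJobFactory springBeanJobFactory() {
        // Lets Quartz-created job instances participate in the Spring context
        SpringBeanJobFactory jobFactory = new SpringBeanJobFactory();
        jobFactory.setApplicationContext(applicationContext); // assumes an injected ApplicationContext field
        return jobFactory;
    }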

My question is, is this correct? Can I just define these schedulers in my SpringBootApplication class annotated with @Configuration
This is correct. Alternatively, you can use Spring's @Scheduled annotation with a cron expression defined in a properties file.
@Scheduled(cron = "0 15 10 15 * ?")
public void scheduleTaskUsingCronExpression() {
    // ...
}
But if you want more control over the jobs, such as failover, retry policies, or tracking and running/rerunning jobs from a dashboard, consider Spring Batch.
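As a rough illustration of the "cron defined in properties files" suggestion, the expression can be externalized like this (the property key jobs.report.cron is made up for the example; also remember that @Scheduled only fires if @EnableScheduling is present on some configuration class):

import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ReportTask {

    // Cron expression resolved from application.properties,
    // e.g. jobs.report.cron=0 15 10 15 * ?   (property name is hypothetical)
    @Scheduled(cron = "${jobs.report.cron}")
    public void scheduleTaskUsingCronExpression() {
        // job logic here
    }
}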

Inspired by the above example, I found a way to use configuration managed in application properties, which seems easier and more consistent with the rest of the Spring Boot app. It is particularly useful for reusing the data source configuration. Any number of beans of the second kind is possible.
@Configuration
class MainQuartzConfiguration {

    /**
     * Main scheduler bean to which all jobDetail, calendar and trigger beans are attached.
     */
    @Primary
    @Bean
    public SchedulerFactoryBean mainScheduler(QuartzProperties properties,
            ObjectProvider<SchedulerFactoryBeanCustomizer> customizers,
            ObjectProvider<JobDetail[]> jobDetails, Map<String, Calendar> calendars,
            ObjectProvider<Trigger[]> triggers, ApplicationContext applicationContext) {
        SchedulerFactoryBean factory = new QuartzAutoConfiguration(properties, customizers,
                jobDetails, calendars, triggers, applicationContext).quartzScheduler();
        factory.setSchedulerName("mainScheduler");
        return factory;
    }
}
@Configuration
class AnotherConfiguration {

    /**
     * Second scheduler bean with the same configuration but a different thread count and thread priority.
     */
    @Bean
    SchedulerFactoryBean secondScheduler(
            QuartzProperties properties,
            ObjectProvider<SchedulerFactoryBeanCustomizer> customizers,
            @Value("${spring.quartz.properties.secondScheduler.org.quartz.threadPool.threadPriority:7}") int threadPriority,
            @Value("${spring.quartz.properties.secondScheduler.org.quartz.threadPool.threadCount:1}") int threadCount,
            ApplicationContext applicationContext) {

        SchedulerFactoryBean schedulerFactoryBean = new SchedulerFactoryBean();

        SpringBeanJobFactory jobFactory = new SpringBeanJobFactory();
        jobFactory.setApplicationContext(applicationContext);
        schedulerFactoryBean.setJobFactory(jobFactory);

        schedulerFactoryBean.setSchedulerName("secondScheduler");
        schedulerFactoryBean.setAutoStartup(properties.isAutoStartup());
        schedulerFactoryBean.setStartupDelay((int) properties.getStartupDelay().getSeconds());
        schedulerFactoryBean.setWaitForJobsToCompleteOnShutdown(
                properties.isWaitForJobsToCompleteOnShutdown());

        Properties propertiesVariant = new Properties();
        propertiesVariant.putAll(properties.getProperties());
        propertiesVariant.setProperty("org.quartz.threadPool.threadPriority", Integer.toString(threadPriority));
        propertiesVariant.setProperty("org.quartz.threadPool.threadCount", Integer.toString(threadCount));
        schedulerFactoryBean.setQuartzProperties(propertiesVariant);

        schedulerFactoryBean.setJobDetails(CatalogBenchmarkJob.createJob());

        customizers.orderedStream().forEach(
                (customizer) -> customizer.customize(schedulerFactoryBean));

        return schedulerFactoryBean;
    }
}
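For reference, the jobDetail and trigger beans that the main scheduler collects (via the ObjectProvider<JobDetail[]> and ObjectProvider<Trigger[]> parameters above) can be declared as plain beans. The following is a rough sketch; SampleJob, the bean names and the cron expression are placeholders, not part of the original answer:

import org.quartz.CronScheduleBuilder;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class SampleJobConfiguration {

    // SampleJob is assumed to be a class implementing org.quartz.Job

    @Bean
    JobDetail sampleJobDetail() {
        // Durable so the job definition survives without an attached trigger
        return JobBuilder.newJob(SampleJob.class)
                .withIdentity("sampleJob")
                .storeDurably()
                .build();
    }

    @Bean
    Trigger sampleJobTrigger(JobDetail sampleJobDetail) {
        // Fires every five minutes against the job detail above
        return TriggerBuilder.newTrigger()
                .forJob(sampleJobDetail)
                .withIdentity("sampleJobTrigger")
                .withSchedule(CronScheduleBuilder.cronSchedule("0 0/5 * * * ?"))
                .build();
    }
}

The second scheduler only runs the job details set on it explicitly (setJobDetails(...) above).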

Related

Spring @Scheduled annotation works on local WebSphere but does not work on WebSphere installed on servers

Spring @Scheduled(cron = "${cron expression}") works on WebSphere on the local machine but not on the non-prod servers.
You should check the thread pool settings in the different WebSphere instances to see what the difference is; I assume your problem might be there.
Otherwise, a more involved solution is to pass in a custom thread pool for your scheduled tasks.
To do this, create a new configuration class for the scheduler and add a custom thread pool:
@Configuration
@EnableScheduling
public class SchedulerConfig implements SchedulingConfigurer {

    private static final int POOL_SIZE = 10;

    @Override
    public void configureTasks(ScheduledTaskRegistrar scheduledTaskRegistrar) {
        ThreadPoolTaskScheduler threadPoolTaskScheduler = new ThreadPoolTaskScheduler();
        threadPoolTaskScheduler.setPoolSize(POOL_SIZE);
        threadPoolTaskScheduler.setThreadNamePrefix("my-scheduled-task-pool-");
        threadPoolTaskScheduler.initialize();
        scheduledTaskRegistrar.setTaskScheduler(threadPoolTaskScheduler);
    }
}
You can find more references here https://www.callicoder.com/spring-boot-task-scheduling-with-scheduled-annotation/
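As a quick way to confirm the custom pool is being used (a sketch, not from the original answer; the class name and timing are made up), a scheduled method can log its thread name, which should now carry the my-scheduled-task-pool- prefix:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class HeartbeatTask {

    private static final Logger log = LoggerFactory.getLogger(HeartbeatTask.class);

    // Runs every 30 seconds on the pool configured in SchedulerConfig
    @Scheduled(fixedRate = 30000)
    public void beat() {
        log.info("Heartbeat on thread {}", Thread.currentThread().getName());
    }
}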

How to create multiple instances of a scheduler class in spring boot?

I have a class containing a @Scheduled annotated method.
I want to create multiple instances of this class in a Spring Boot application so that I can run multiple jobs for the specified time period.
I have googled and tried creating a new object, but scheduling didn't work.
Note: I will pass what to execute at runtime for the respective instance.
You can create a class with multiple methods to schedule your jobs at the same time, doing the same work:
@Component
public class Job {

    @Scheduled(initialDelay = 1000, fixedDelay = 60000)
    public void job1() {
        jobWork();
    }

    @Scheduled(initialDelay = 1000, fixedDelay = 60000)
    public void job2() {
        jobWork();
    }

    private void jobWork() {
    }
}
@Scheduled is a repeatable annotation, so you can also add multiple @Scheduled annotations to the same method:

@Scheduled(initialDelay = 1000, fixedDelay = 60000)
@Scheduled(initialDelay = 1000, fixedDelay = 60000)
public void jobWork() {
    // job logic here
}

@Scheduled is itself declared with @Repeatable(value = Schedules.class).
See also Duplication on @Scheduled and Repeating Annotations, as @Scheduled allows multiple annotations.
Here is the answer: I implemented ApplicationContextAware:
@Override
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
    for (int i = 0; i < 4; i++) {
        ((ConfigurableApplicationContext) applicationContext).getBeanFactory()
                .registerSingleton("New Instance " + i, new SchedularJob());
    }
}
This creates 4 instances of the SchedularJob class, and the 4 schedulers run independently.
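As an alternative (not from the original answers, just a sketch under the assumption that a TaskScheduler bean such as ThreadPoolTaskScheduler is available), the same effect can be achieved by scheduling the instances programmatically instead of registering singletons:

import java.time.Duration;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.scheduling.TaskScheduler;
import org.springframework.stereotype.Component;

@Component
public class DynamicJobRegistrar implements InitializingBean {

    private final TaskScheduler taskScheduler;

    public DynamicJobRegistrar(TaskScheduler taskScheduler) {
        this.taskScheduler = taskScheduler;
    }

    @Override
    public void afterPropertiesSet() {
        for (int i = 0; i < 4; i++) {
            int instance = i;
            // Each instance gets its own fixed-delay schedule and can be handed
            // whatever it should execute at runtime
            taskScheduler.scheduleWithFixedDelay(
                    () -> System.out.println("Running job instance " + instance),
                    Duration.ofSeconds(60));
        }
    }
}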

Quartz JDBCJobStore with RoutingDataSource

For my application, we are using Spring's
org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource
The target data sources are configured and chosen based on the request's domain URL.
E.g.:
qa.example.com ==> target datasource = DB1
qa-test.example.com ==> target datasource = DB2
Following is the configuration for the same:
#Bean(name = "dataSource")
public DataSource dataSource() throws PropertyVetoException, ConfigurationException {
EERoutingDatabase routingDB = new EERoutingDatabase();
Map<Object, Object> targetDataSources = datasourceList();
routingDB.setTargetDataSources(targetDataSources);
return routingDB;
}
public class EERoutingDatabase extends AbstractRoutingDataSource {
#Override
protected Object determineCurrentLookupKey() {
// This is derived from the request's URL/Domain
return SessionUtil.getDataSourceHolder();
}
}
The task now is to use Quartz's JDBCJobStore to persist the Quartz jobs/triggers.
The preferred option is JobStoreCMT.
We used the following config:
@Configuration
public class QuartzConfig {

    private static final Logger LOG = LoggerFactory.getLogger(QuartzConfig.class);
    private static final String QUARTZ_CONFIG_FILE = "ee-quartz.properties";

    @Autowired
    private DataSource dataSource;

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Autowired
    private ApplicationContext applicationContext;

    /**
     * Spring wrapper over the Quartz Scheduler bean
     */
    @Bean(name = "quartzRealTimeScheduler")
    SchedulerFactoryBean schedulerFactoryBean() {
        LOG.info("Creating QUARTZ Scheduler for real time Job invocation");
        SchedulerFactoryBean factory = new SchedulerFactoryBean();
        factory.setConfigLocation(new ClassPathResource(QUARTZ_CONFIG_FILE));
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        factory.setJobFactory(springBeanJobFactory());
        factory.setWaitForJobsToCompleteOnShutdown(true);
        factory.setApplicationContextSchedulerContextKey("applicationContext");
        return factory;
    }

    @Bean
    public SpringBeanJobFactory springBeanJobFactory() {
        AutoWiringSpringBeanJobFactory jobFactory = new AutoWiringSpringBeanJobFactory();
        jobFactory.setApplicationContext(applicationContext);
        jobFactory.setIgnoredUnknownProperties("applicationContext");
        return jobFactory;
    }
}
and the following is the config in the Quartz properties file (ee-quartz.properties):

org.quartz.scheduler.instanceId=AUTO
org.quartz.jobStore.useProperties=false
org.quartz.jobStore.misfireThreshold=60000
org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate

On starting the application, the following exception occurs:
Caused by: java.lang.IllegalStateException: Cannot determine target DataSource for lookup key [null]
at org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource.determineTargetDataSource(AbstractRoutingDataSource.java:202) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at com.expertly.config.EERoutingDatabase.determineTargetDataSource(EERoutingDatabase.java:60) ~[EERoutingDatabase.class:na]
at org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource.getConnection(AbstractRoutingDataSource.java:164) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:111) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:77) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:289) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:329) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.scheduling.quartz.LocalDataSourceJobStore.initialize(LocalDataSourceJobStore.java:149) ~[spring-context-support-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.quartz.impl.StdSchedulerFactory.instantiate(StdSchedulerFactory.java:1321) ~[quartz-2.2.2.jar:na]
at org.quartz.impl.StdSchedulerFactory.getScheduler(StdSchedulerFactory.java:1525) ~[quartz-2.2.2.jar:na]
at org.springframework.scheduling.quartz.SchedulerFactoryBean.createScheduler(SchedulerFactoryBean.java:599) ~[spring-context-support-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.springframework.scheduling.quartz.SchedulerFactoryBean.afterPropertiesSet(SchedulerFactoryBean.java:482) ~[spring-context-support-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1612) ~[spring-beans-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1549) ~[spring-beans-4.0.1.RELEASE.jar:4.0.1.RELEASE]
It seems that Quartz is trying to create connections with my data source upfront.
Since my data source isn't a concrete one (it's a routing data source) and, in addition, doesn't know which target DB to connect to at configuration time, it fails.
Is there any provision for using Quartz with a RoutingDataSource? If not, what would be the next best thing?
Ideally you could try making the SchedulerFactoryBean @Lazy. But it seems lazy initialization will not work (see the linked bug); there is also a workaround listed in its comments:
Create the SchedulerFactoryBean dynamically after a
ContextRefreshedEvent is received on the root context.
Let us know if this works.
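A rough sketch of that workaround (an illustration only: the listener class is hypothetical, and SessionUtil.setDataSourceHolder(...) is an assumed setter counterpart to the SessionUtil.getDataSourceHolder() shown in the question):

import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationListener;
import org.springframework.context.event.ContextRefreshedEvent;
import org.springframework.scheduling.quartz.SchedulerFactoryBean;
import org.springframework.stereotype.Component;
import org.springframework.transaction.PlatformTransactionManager;

@Component
public class DeferredQuartzBootstrap implements ApplicationListener<ContextRefreshedEvent> {

    @Autowired
    private DataSource dataSource;

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Override
    public void onApplicationEvent(ContextRefreshedEvent event) {
        try {
            // Pin a routing key first so AbstractRoutingDataSource can resolve a target DB
            SessionUtil.setDataSourceHolder("DB1"); // placeholder key, assumed API

            // Build the scheduler only now, instead of declaring it as a @Bean at config time
            SchedulerFactoryBean factory = new SchedulerFactoryBean();
            factory.setDataSource(dataSource);
            factory.setTransactionManager(transactionManager);
            factory.setApplicationContext(event.getApplicationContext());
            factory.afterPropertiesSet();
            factory.start();
        } catch (Exception e) {
            throw new IllegalStateException("Deferred Quartz bootstrap failed", e);
        }
    }
}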

Spring Boot is throwing a null for JPA Repository if I use the quartz scheduler to start manually

I have a Quartz scheduler as part of a Spring Boot application. I'm loading the SchedulerFactory from a @Configuration class. This is the setup I have:
Config.java:
@Configuration
public class Config {

    @Bean(name = "stdSchedulerFactory_api")
    public StdSchedulerFactory getStdSchedulerFactory() throws SchedulerException {
        return new StdSchedulerFactory("/etc/quartz.properties");
    }
}
The quartz.properties file contains the location of a file listing the jobs to perform. This is how it is configured in the properties file:
org.quartz.plugin.jobInitializer.fileNames =/etc/quartz-config.xml
And the quartz-config.xml has the jobs listed as below:
<job>
    <name>MasterScheduleJob</name>
    <group>MasterScheduleJobGroup</group>
    <description>This is Master Job Scheduler</description>
    <job-class>com.MasterScheduler</job-class>
    <durability>true</durability>
    <recover>false</recover>
</job>
Master.java
@Component
@Scope("singleton")
public class Master {

    @Autowired
    private ApplicationContext context;

    private Scheduler scheduler;

    @PostConstruct
    public void init() throws SchedulerException {
        // Get the SchedulerFactory from the Config class and start the scheduler
        StdSchedulerFactory factory =
                (StdSchedulerFactory) context.getBean("stdSchedulerFactory_api");
        scheduler = factory.getScheduler();
        scheduler.start();
    }
}
When the Master starts, it reads the quartz-config.xml file, which in turn causes Quartz to invoke the MasterScheduler, shown below:
MasterScheduler.java
@Service
@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class MasterScheduler implements Job {

    @Autowired
    public JobsScheduleRepository jobsScheduleRepository;

    private List<JobsScheduleDAO> jobsScheduleDAO;

    @Override
    public void execute(JobExecutionContext jobContext) {
        jobsScheduleDAO = jobsScheduleRepository.findAll();
    }
}
When I run this code, I get a NullPointerException because jobsScheduleRepository is null. I suspect this is because the Quartz scheduler instantiates MasterScheduler itself (via new) instead of asking Spring for the bean. I do not want to use the Quartz scheduler provided by Spring. Is there any other alternative?
If there is a way to avoid using @Autowired and access the repository directly, I can use that as a temporary fix for now.
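For reference, a common fix (a sketch, not taken from this thread) is to give Quartz a job factory that runs Spring's autowiring over each job instance it creates, similar to the AutoWiringSpringBeanJobFactory referenced in the earlier question:

import org.quartz.spi.TriggerFiredBundle;
import org.springframework.beans.factory.config.AutowireCapableBeanFactory;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.scheduling.quartz.SpringBeanJobFactory;

public class AutowiringSpringBeanJobFactory extends SpringBeanJobFactory
        implements ApplicationContextAware {

    private AutowireCapableBeanFactory beanFactory;

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) {
        this.beanFactory = applicationContext.getAutowireCapableBeanFactory();
    }

    @Override
    protected Object createJobInstance(TriggerFiredBundle bundle) throws Exception {
        // Let the base factory create the instance, then satisfy its @Autowired fields
        Object job = super.createJobInstance(bundle);
        beanFactory.autowireBean(job);
        return job;
    }
}

It can be registered on the scheduler before it starts (scheduler.setJobFactory(...)), after handing it the ApplicationContext.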

how to select which spring batch job to run based on application argument - spring boot java config

I have two independent Spring Batch jobs in the same project because I want to use the same infrastructure-related beans. Everything is configured in Java. I would like to know if there's a proper way to start the jobs independently, based for example on the first Java application argument in the main method. If I run SpringApplication.run, only the second job gets executed, by magic.
The main method looks like:
@ComponentScan
@EnableAutoConfiguration
public class Application {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(Application.class);
        app.setWebEnvironment(false);
        ApplicationContext ctx = app.run(args);
    }
}
and the two jobs are configured as presented in the Spring Batch Getting Started tutorial on spring.io. Here is the configuration file of the first job, the second being configured in the same way:
@Configuration
@EnableBatchProcessing
@Import({StandaloneInfrastructureConfiguration.class, ServicesConfiguration.class})
public class AddPodcastJobConfiguration {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    // reader, writer, processor...
}
To enable modularization I created an AppConfig class, where I define factories for the two jobs:
@Configuration
@EnableBatchProcessing(modular = true)
public class AppConfig {

    @Bean
    public ApplicationContextFactory addNewPodcastJobs() {
        return new GenericApplicationContextFactory(AddPodcastJobConfiguration.class);
    }

    @Bean
    public ApplicationContextFactory newEpisodesNotificationJobs() {
        return new GenericApplicationContextFactory(NotifySubscribersJobConfiguration.class);
    }
}
P.S. I am new to Spring Java configuration, Spring Boot, and Spring Batch...
Just set the spring.batch.job.names=myJob property. You could set it as a system property when you launch your application (-Dspring.batch.job.names=myJob). If you have defined this property, the Spring Batch starter will only launch the jobs defined by it.
To run the jobs you like from the main method, you can load the required job configuration bean and the JobLauncher from the application context and then run it:
@ComponentScan
@EnableAutoConfiguration
public class ApplicationWithJobLauncher {

    public static void main(String[] args) throws BeansException, JobExecutionAlreadyRunningException,
            JobRestartException, JobInstanceAlreadyCompleteException, JobParametersInvalidException,
            InterruptedException {

        Log log = LogFactory.getLog(ApplicationWithJobLauncher.class);

        SpringApplication app = new SpringApplication(ApplicationWithJobLauncher.class);
        app.setWebEnvironment(false);
        ConfigurableApplicationContext ctx = app.run(args);

        JobLauncher jobLauncher = ctx.getBean(JobLauncher.class);

        JobParameters jobParameters = new JobParametersBuilder()
                .addDate("date", new Date())
                .toJobParameters();

        if ("1".equals(args[0])) {
            // addNewPodcastJob
            Job addNewPodcastJob = ctx.getBean("addNewPodcastJob", Job.class);
            JobExecution jobExecution = jobLauncher.run(addNewPodcastJob, jobParameters);
        } else {
            jobLauncher.run(ctx.getBean("newEpisodesNotificationJob", Job.class), jobParameters);
        }

        System.exit(0);
    }
}
What was causing a lot of my confusion was that the second job was executed even though the first job seemed to be "picked up" by the runner... Well, the problem was that in both jobs' configuration files I used the standard method names writer(), reader(), processor() and step(), and the beans from the second job silently "overwrote" the ones from the first job, without any warnings...
I did use an application config class with @EnableBatchProcessing(modular=true), which I thought Spring Boot would pick up magically:
@Configuration
@EnableBatchProcessing(modular = true)
public class AppConfig {

    @Bean
    public ApplicationContextFactory addNewPodcastJobs() {
        return new GenericApplicationContextFactory(AddPodcastJobConfiguration.class);
    }

    @Bean
    public ApplicationContextFactory newEpisodesNotificationJobs() {
        return new GenericApplicationContextFactory(NotifySubscribersJobConfiguration.class);
    }
}
I will write a blog post about it when it is ready, but until then the code is available at https://github.com/podcastpedia/podcastpedia-batch (work/learning in progress).
There is also the CommandLineJobRunner, which may be helpful. From its Javadoc:
Basic launcher for starting jobs from the command line.
Spring Batch auto configuration is enabled by adding @EnableBatchProcessing (from Spring Batch) somewhere in your context. By default it executes all Jobs in the application context on startup (see JobLauncherCommandLineRunner for details). You can narrow down to a specific job or jobs by specifying spring.batch.job.names (comma separated job name patterns).
-- Spring Boot Doc
Or disable the automatic execution and run the jobs programmatically from the context using a JobLauncher, based on the args passed to the main method.
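A minimal sketch of that last option, assuming the job bean names from the question (addNewPodcastJob, newEpisodesNotificationJob) and using Spring Boot's spring.batch.job.enabled property to switch off the automatic run:

import java.util.Date;
import java.util.Properties;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.ComponentScan;

@ComponentScan
@EnableAutoConfiguration
public class SelectiveJobApplication {

    public static void main(String[] args) throws Exception {
        SpringApplication app = new SpringApplication(SelectiveJobApplication.class);

        // Prevent the Spring Batch starter from running every Job bean at startup
        Properties defaults = new Properties();
        defaults.setProperty("spring.batch.job.enabled", "false");
        app.setDefaultProperties(defaults);

        ConfigurableApplicationContext ctx = app.run(args);

        // Pick the job bean by the first program argument, e.g. "addNewPodcastJob"
        Job job = ctx.getBean(args[0], Job.class);
        JobLauncher jobLauncher = ctx.getBean(JobLauncher.class);
        jobLauncher.run(job, new JobParametersBuilder()
                .addDate("date", new Date())
                .toJobParameters());
    }
}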
