Can Spring provide a concept like a job queue?

For example, I have 25 jobs and I want to execute 3 jobs concurrently; when one of the three completes, the next one should be picked up from the queue.

You can do this with classes in the standard Java library - you don't need Spring for this. Use an ExecutorService, for example:
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

class MyJob implements Runnable {
    private final String message;

    MyJob(String message) {
        this.message = message;
    }

    @Override
    public void run() {
        System.out.println(message);
    }
}

public class Example {
    public static void main(String[] args) {
        // Executor service with 3 threads
        ExecutorService executorService = Executors.newFixedThreadPool(3);
        // Submit jobs to be executed
        executorService.execute(new MyJob("testing"));
        executorService.execute(new MyJob("one"));
        executorService.execute(new MyJob("two"));
        executorService.execute(new MyJob("three"));
        // ...
        executorService.shutdown(); // no new jobs accepted; queued jobs still run
    }
}
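A minimal sketch of the exact scenario from the question (25 jobs, 3 running at once; the class and method names are illustrative, not from the original): a fixed pool of 3 threads queues the remaining jobs internally and starts the next one as soon as a thread frees up.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class QueueDemo {

    // Runs `jobs` trivial jobs on a pool of `threads` threads; returns how many completed.
    static int runJobs(int jobs, int threads) throws InterruptedException {
        AtomicInteger completed = new AtomicInteger();
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int i = 1; i <= jobs; i++) {
            final int jobId = i;
            pool.execute(() -> {
                System.out.println("Job " + jobId + " on " + Thread.currentThread().getName());
                completed.incrementAndGet();
            });
        }
        pool.shutdown();                            // stop accepting new jobs
        pool.awaitTermination(1, TimeUnit.MINUTES); // wait for queued jobs to finish
        return completed.get();
    }

    public static void main(String[] args) throws InterruptedException {
        // 25 jobs, at most 3 running concurrently; the other 22 wait in the pool's queue
        System.out.println("Completed: " + runJobs(25, 3));
    }
}
```

The thread names in the output show that only 3 distinct pool threads ever run the 25 jobs.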

Yes, Spring provides support for job scheduling through the Quartz Scheduler. For more information about how Spring uses Quartz, you can go through the official Spring documentation.
Apart from this, if you want some ready-made examples, you can go through Spring 3 + Quartz Scheduler and Spring 4 + Quartz Scheduler.

I suggest you use Spring Boot.
Here is a good start with annotation-based scheduling in Spring:
https://spring.io/guides/gs/scheduling-tasks/
Here is a nice introduction to Spring Boot with Quartz: http://de.slideshare.net/davidkiss/spring-boot-with-quartz
Good luck!


Implement Quartz for Cron Scheduling with Spring Scheduler

@Component
public class QuartzConfig implements Job {

    @Autowired
    private JobService jobService;

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        System.out.println("Check Status");
        jobService.checkQueueStatus();
    }
}
In the quartz.properties file I have added these details:
# thread-pool
org.quartz.threadPool.class=org.quartz.simpl.SimpleThreadPool
org.quartz.threadPool.threadCount=2
org.quartz.threadPool.threadsInheritContextClassLoaderOfInitializingThread=true
# job-store
# Enable this property for RAMJobStore
org.quartz.jobStore.class=org.quartz.simpl.RAMJobStore
How can I give the details about the job and trigger for Cron Scheduling?
Please help with the detailed flow.
There are plenty of examples on the web showing you how to integrate Quartz and Spring. You may want to check one of our test web apps available on GitHub:
https://github.com/quartzdesk/quartzdesk-test-webapps/blob/master/quartzdesk-test-quartz-v2-4-x/quartzdesk-test-quartz-v2-4-x-logback/src/main/webapp/WEB-INF/spring/applicationContext.xml
Search for "quartzScheduler" to find the Quartz scheduler bean, then scroll down to see the jobDetails and triggers attributes.
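If you prefer Java config over the linked XML, the job and trigger details can be wired roughly like this. This is only a sketch: the configuration class name, bean method names and the cron expression are illustrative; QuartzConfig is the Job implementation from the question above.

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.quartz.CronTriggerFactoryBean;
import org.springframework.scheduling.quartz.JobDetailFactoryBean;

@Configuration
public class QuartzSchedulingConfig {

    @Bean
    public JobDetailFactoryBean checkStatusJobDetail() {
        JobDetailFactoryBean jobDetail = new JobDetailFactoryBean();
        jobDetail.setJobClass(QuartzConfig.class); // the Job implementation from the question
        jobDetail.setDurability(true);             // keep the job even without an active trigger
        return jobDetail;
    }

    @Bean
    public CronTriggerFactoryBean checkStatusTrigger() {
        CronTriggerFactoryBean trigger = new CronTriggerFactoryBean();
        trigger.setJobDetail(checkStatusJobDetail().getObject());
        trigger.setCronExpression("0 0/5 * * * ?"); // every 5 minutes (illustrative)
        return trigger;
    }
}
```

The resulting JobDetail and Trigger beans would then be registered on the SchedulerFactoryBean, mirroring the jobDetails and triggers attributes in the XML example.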

Server Sent Event with SQL Database connection using Spring Boot

I want to implement Server-Sent Events in Spring Boot. The data is in a SQL database, which results in a blocking connection. WebFlux is an option, but it is supported only for some NoSQL databases.
Yes, you are right: the WebFlux framework doesn't support SQL databases in non-blocking mode, because reactive drivers do not exist for them.
But WebFlux provides some instruments to avoid blocking our main threads while we are making long blocking queries to a database.
1) Create a configuration with a Scheduler whose thread count equals the connection pool size:
@Configuration
public class SchedulerConfiguration {

    @Value("${spring.datasource.maximum-pool-size}")
    private Integer connectionPoolSize;

    @Bean
    @Qualifier("jdbcScheduler")
    public Scheduler jdbcScheduler() {
        return Schedulers.fromExecutor(Executors.newFixedThreadPool(connectionPoolSize));
    }
}
2) Inject your "jdbcScheduler" to the service class:
@Service
public class DataService {

    private final DataRepository jdbcRepository;
    private final Scheduler scheduler;

    @Autowired
    public DataService(DataRepository jdbcRepository,
                       @Qualifier("jdbcScheduler") Scheduler scheduler) {
        this.jdbcRepository = jdbcRepository;
        this.scheduler = scheduler;
    }

    public Mono<String> findById(long id) {
        return async(() -> jdbcRepository.findById(id));
    }

    private <T> Mono<T> async(Callable<T> callable) {
        return Mono.fromCallable(callable).subscribeOn(scheduler);
    }
}
Wrap your blocking call in Mono.fromCallable and move it off the main thread onto your "jdbcScheduler" via subscribeOn. You can read more about schedulers here: Threading and Schedulers
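To actually expose this as Server-Sent Events (the original goal), a WebFlux controller can stream from such a Mono. This is a sketch under the assumption that DataService is the service above; the controller class, path and polling interval are illustrative.

```java
import java.time.Duration;

import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

import reactor.core.publisher.Flux;

@RestController
public class DataController {

    private final DataService dataService; // the service sketched above

    public DataController(DataService dataService) {
        this.dataService = dataService;
    }

    // Re-query the (blocking) repository every 5 seconds and stream results as SSE;
    // the blocking call itself runs on the jdbcScheduler, not on an event-loop thread
    @GetMapping(value = "/data/{id}", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<String> stream(@PathVariable long id) {
        return Flux.interval(Duration.ofSeconds(5))
                   .flatMap(tick -> dataService.findById(id));
    }
}
```

The text/event-stream content type is what makes the browser's EventSource API treat the response as SSE.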
Yes, you can achieve asynchronous processing in Spring without WebFlux by using the built-in @Async support. Here is how.
Step 1: Enable async support and define a bean for the Executor. You can define a separate configuration class or put it directly in the main application class.
@SpringBootApplication
@EnableAsync
public class Application {

    public static void main(String[] args) {
        // close the application context to shut down the custom ExecutorService
        SpringApplication.run(Application.class, args).close();
    }

    @Bean
    public Executor asyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(2);
        executor.setMaxPoolSize(2);
        executor.setQueueCapacity(500);
        executor.setThreadNamePrefix("GithubLookup-");
        executor.initialize();
        return executor;
    }
}
Step 2:
The following is a simple way to configure a method with a void return type to run asynchronously. You can also retrieve the result of an asynchronous method through a Future object.
@Async
public void asyncMethodWithVoidReturnType() {
    System.out.println("Execute method asynchronously. "
        + Thread.currentThread().getName());
}
For more information, you can visit the official Spring guide: Spring Async
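For the Future-returning variant mentioned above, a sketch could look like this (the service class, method name and return value are illustrative, not from the original):

```java
import java.util.concurrent.CompletableFuture;

import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

@Service
public class LookupService {

    @Async
    public CompletableFuture<String> asyncMethodWithReturnType() {
        // runs on a thread from the custom asyncExecutor pool
        String result = "done on " + Thread.currentThread().getName();
        return CompletableFuture.completedFuture(result);
    }
}
```

The caller gets the CompletableFuture back immediately and can block on .get(), poll with .isDone(), or compose further asynchronous steps with .thenApply(...).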

Does a stand-by Quartz Scheduler require some kind of refresh for starting?

I'm using Quartz with Spring Boot, specifically using its conveniences through the SchedulerFactoryBean abstraction. See https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-quartz.html.
I need to start the Scheduler manually instead, for which I am thinking of using schedulerFactoryBean.setAutoStartup(false) and then Scheduler.start() in a class with a Scheduler instance autowired. See the example code below.
@Configuration
public class QuartzConfig implements SchedulerFactoryBeanCustomizer {

    public static final int STARTUP_DELAY = 5;

    @Autowired
    private DataSource dataSource;

    @Override
    public void customize(SchedulerFactoryBean schedulerFactoryBean) {
        schedulerFactoryBean.setDataSource(dataSource);
        schedulerFactoryBean.setStartupDelay(STARTUP_DELAY);
        schedulerFactoryBean.setAutoStartup(false);
        schedulerFactoryBean.setWaitForJobsToCompleteOnShutdown(true);
    }
}
@Service
public class SchedulerService {

    private final Scheduler scheduler;

    @Autowired
    public SchedulerService(Scheduler scheduler) {
        this.scheduler = scheduler;
    }

    public void startScheduler() throws SchedulerException {
        this.scheduler.start();
    }
}
However, there will be other instances on other nodes running against the same database. My question is: does this not-yet-started Scheduler instance refresh, or become aware of, all the work done by the other instances? Does a stand-by Quartz Scheduler require some kind of manually triggered refresh before starting?
For example, if a trigger, job or something else is added, changed or deleted, will the not-yet-started instance be aware of it (cache refresh?) after the code calls .start()?
I have read several pages of the Quartz documentation and I couldn't really find an answer to this. I couldn't find a place that explains if or how an instance synchronizes itself with external changes made to its database.

Spring Boot ActiveMQ: keep receiver running periodically

I have configured a Spring Boot application which, when run, reads messages from the queue and processes them accordingly.
I have also configured the concurrency flag to run multiple such readers.
However, in an ideal world I would like the receiver to keep running like a thread and keep checking for any messages.
My question is whether there is any way I can configure this in Spring Boot, or whether I have to fall back to a threading mechanism using an executor or something else.
Thanks,
- Vaibhav
I found a nice way with Spring Boot. The concurrency is of course taken care of by the concurrency attribute, e.g.
@JmsListener(destination = "myqueue", concurrency = "2-10")
However, for the thread part, the following is a neat way:
@SpringBootApplication
@EnableAutoConfiguration(exclude = {MongoAutoConfiguration.class, MongoDataAutoConfiguration.class})
@EnableJms
public class MyApplication implements CommandLineRunner {

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }

    @Override
    public void run(String... arg0) throws Exception {
        System.out.println("Joining thread, ctrl+c to bring down application");
        Thread.currentThread().join();
    }
}
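For completeness, the receiver itself is just an annotated method on a Spring bean; with the listener container running, it keeps polling the broker without any manual threading. A sketch, where the class name and payload type are illustrative and the destination and concurrency values are the ones from the answer above:

```java
import org.springframework.jms.annotation.JmsListener;
import org.springframework.stereotype.Component;

@Component
public class QueueReader {

    // 2 concurrent consumers, scaling up to 10 under load;
    // invoked by the JMS listener container whenever a message arrives
    @JmsListener(destination = "myqueue", concurrency = "2-10")
    public void receive(String message) {
        System.out.println("Received: " + message);
    }
}
```

The listener container (auto-configured by Spring Boot when a broker is on the classpath) owns the consumer threads, so no executor code is needed in the application itself.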

How to select which Spring Batch job to run based on an application argument - Spring Boot Java config

I have two independent Spring Batch jobs in the same project because I want to use the same infrastructure-related beans. Everything is configured in Java. I would like to know if there's a proper way to start the jobs independently, based for example on the first Java application argument in the main method. If I run SpringApplication.run, only the second job gets executed, by magic.
The main method looks like:
@ComponentScan
@EnableAutoConfiguration
public class Application {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(Application.class);
        app.setWebEnvironment(false);
        ApplicationContext ctx = app.run(args);
    }
}
and the two jobs are configured as presented in the Spring Batch Getting Started tutorial on Spring.io. Here is the configuration file of the first job, the second being configured in the same way.
@Configuration
@EnableBatchProcessing
@Import({StandaloneInfrastructureConfiguration.class, ServicesConfiguration.class})
public class AddPodcastJobConfiguration {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    //reader, writer, processor...
}
To enable modularization I created an AppConfig class, where I define factories for the two jobs:
@Configuration
@EnableBatchProcessing(modular = true)
public class AppConfig {

    @Bean
    public ApplicationContextFactory addNewPodcastJobs() {
        return new GenericApplicationContextFactory(AddPodcastJobConfiguration.class);
    }

    @Bean
    public ApplicationContextFactory newEpisodesNotificationJobs() {
        return new GenericApplicationContextFactory(NotifySubscribersJobConfiguration.class);
    }
}
P.S. I am new to Java-based Spring configuration, Spring Boot, and Spring Batch...
Just set the spring.batch.job.names=myJob property. You could set it as a system property when you launch your application (-Dspring.batch.job.names=myJob). If you have defined this property, the spring-batch starter will only launch the jobs that are named by it.
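For example, in application.properties (using one of the job bean names from this question; the same value works as a system property):

```properties
# Only launch the job(s) whose bean names match this pattern
spring.batch.job.names=addNewPodcastJob
```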
To run the jobs you like from the main method, you can load the required job configuration bean and the JobLauncher from the application context and then run them:
@ComponentScan
@EnableAutoConfiguration
public class ApplicationWithJobLauncher {

    public static void main(String[] args) throws BeansException, JobExecutionAlreadyRunningException,
            JobRestartException, JobInstanceAlreadyCompleteException, JobParametersInvalidException,
            InterruptedException {
        Log log = LogFactory.getLog(ApplicationWithJobLauncher.class);
        SpringApplication app = new SpringApplication(ApplicationWithJobLauncher.class);
        app.setWebEnvironment(false);
        ConfigurableApplicationContext ctx = app.run(args);
        JobLauncher jobLauncher = ctx.getBean(JobLauncher.class);
        JobParameters jobParameters = new JobParametersBuilder()
            .addDate("date", new Date())
            .toJobParameters();
        if ("1".equals(args[0])) {
            // addNewPodcastJob
            Job addNewPodcastJob = ctx.getBean("addNewPodcastJob", Job.class);
            JobExecution jobExecution = jobLauncher.run(addNewPodcastJob, jobParameters);
        } else {
            jobLauncher.run(ctx.getBean("newEpisodesNotificationJob", Job.class), jobParameters);
        }
        System.exit(0);
    }
}
What was causing lots of my confusion was that the second job was executed even though the first job seemed to be "picked up" by the runner... Well, the problem was that in both jobs' configuration files I used the standard method names writer(), reader(), processor() and step(), and the ones from the second job seemed to "overwrite" the ones from the first job without any warning...
I did use an application config class with @EnableBatchProcessing(modular=true), which I thought would be picked up magically by Spring Boot:
@Configuration
@EnableBatchProcessing(modular = true)
public class AppConfig {

    @Bean
    public ApplicationContextFactory addNewPodcastJobs() {
        return new GenericApplicationContextFactory(AddPodcastJobConfiguration.class);
    }

    @Bean
    public ApplicationContextFactory newEpisodesNotificationJobs() {
        return new GenericApplicationContextFactory(NotifySubscribersJobConfiguration.class);
    }
}
I will write a blog post about it when it is ready, but until then the code is available at https://github.com/podcastpedia/podcastpedia-batch (work/learning in progress)..
There is also the CommandLineJobRunner, which may be helpful.
From its Javadoc:
"Basic launcher for starting jobs from the command line."
Spring Batch auto-configuration is enabled by adding @EnableBatchProcessing (from Spring Batch) somewhere in your context. By default it executes all Jobs in the application context on startup (see JobLauncherCommandLineRunner for details). You can narrow down to a specific job or jobs by specifying spring.batch.job.names (comma-separated job name patterns).
-- Spring Boot Doc
Or disable the auto-execution and run the jobs programmatically from the context using a JobLauncher, based on the args passed to the main method.
