How to shut down a Spring Batch job gracefully? (currently running jobs should complete)

I have a Spring Batch application that contains multiple jobs. I need to shut down the application gracefully when a KILL signal is sent from the command line. It should wait until the currently running job completes (and in the meantime it should not accept new jobs from the scheduler).
This is my Spring Boot Application configuration.
@SpringBootApplication
@EnableScheduling
@EnableBatchProcessing
public class SpringBootBatchExampleApplication {

    public static void main(String[] args) {
        SpringApplicationBuilder app = new SpringApplicationBuilder(SpringBootBatchExampleApplication.class)
                .web(WebApplicationType.NONE);
        app.registerShutdownHook(true);
        app.build().addListeners(new ApplicationPidFileWriter("/shutdown.pid"));
        app.run(args);
    }

    @Bean
    TaskSchedulerCustomizer taskSchedulerCustomizer() {
        return taskScheduler -> taskScheduler.setWaitForTasksToCompleteOnShutdown(true);
    }

    @PreDestroy
    public void onDestroy() throws Exception {
        Thread.sleep(5000);
    }
}
Whenever I send the kill signal in the middle of a job execution, it calls the destroy method and waits for 5 seconds.
After that, the job execution resumes briefly and then the whole process suddenly terminates with an exception because the ApplicationContext is already closed. So the current job cannot be executed further.
Is there any way to hold off the application context close/destroy until the pending/currently running jobs have completed?
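One possible direction (an assumption, not verified against this exact setup, and requiring Spring Boot 2.1 or later): instead of the customizer above, Spring Boot exposes properties that make the auto-configured task scheduler wait for running tasks on shutdown, with a configurable upper bound on the wait:

```properties
# Wait for scheduled tasks that are still running when shutdown starts
spring.task.scheduling.shutdown.await-termination=true
# Maximum time to wait for running tasks to complete
spring.task.scheduling.shutdown.await-termination-period=5m
```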

Related

Spring @Scheduled should start immediately after application startup

I have a requirement to add schedulers which will run on a daily basis, but at the same time I want to run the scheduler on application startup. The problem is that the scheduler is not running immediately after application startup.
You can implement the ApplicationRunner interface and execute your business logic in its run method:
@Component
public class TaskRun implements ApplicationRunner {

    @Override
    public void run(ApplicationArguments args) throws Exception {
        // do something
    }
}
Finally, I solved this by using a listener in Application.java:
@EventListener(ApplicationReadyEvent.class)
public void doSomethingOnceAppIsReady() {
    // calling a scheduler method
    myScheduler();
}
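The same "run once at startup, then on a fixed schedule" behavior can be sketched with the plain JDK (no Spring): an initial delay of zero makes the first run fire immediately. Spring's @Scheduled(fixedRate = ..., initialDelay = 0) follows the same idea. The class and task below are illustrative, not from the original question:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class StartupScheduler {
    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        CountDownLatch firstRun = new CountDownLatch(1);

        // initialDelay = 0 means the task fires immediately at startup,
        // then repeats at the given rate (here: every hour)
        scheduler.scheduleAtFixedRate(() -> {
            System.out.println("task ran");
            firstRun.countDown();
        }, 0, 1, TimeUnit.HOURS);

        // wait until the startup run has happened, then shut down (for the demo)
        firstRun.await();
        scheduler.shutdown();
    }
}
```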

Server-Sent Events with a SQL database connection using Spring Boot

I want to implement Server-Sent Events in Spring Boot. The data is in a SQL database, which results in blocking connections. WebFlux is an option, but it is supported only for some NoSQL databases.
Yes, you are right: the WebFlux framework doesn't support SQL databases in non-blocking mode, because reactive drivers for them do not exist.
But WebFlux provides some instruments to avoid blocking our main threads while we are making long blocking queries to a database.
1) Create a configuration with a Scheduler whose thread count equals the connection pool size:
@Configuration
public class SchedulerConfiguration {

    @Value("${spring.datasource.maximum-pool-size}")
    private Integer connectionPoolSize;

    @Bean
    @Qualifier("jdbcScheduler")
    public Scheduler jdbcScheduler() {
        return Schedulers.fromExecutor(Executors.newFixedThreadPool(connectionPoolSize));
    }
}
2) Inject your "jdbcScheduler" into the service class:
@Service
public class DataService {

    private final DataRepository jdbcRepository;
    private final Scheduler scheduler;

    @Autowired
    public DataService(DataRepository jdbcRepository,
                       @Qualifier("jdbcScheduler") Scheduler scheduler) {
        this.jdbcRepository = jdbcRepository;
        this.scheduler = scheduler;
    }

    public Mono<String> findById(long id) {
        return async(() -> jdbcRepository.findById(id));
    }

    private <T> Mono<T> async(Callable<T> callable) {
        return Mono.fromCallable(callable).subscribeOn(scheduler);
    }
}
Wrap your blocking method in Mono.fromCallable and move it off the main thread onto your "jdbcScheduler" via Mono.subscribeOn (subscribeOn, rather than publishOn, is what moves the execution of the callable itself onto the scheduler).
You can read more about schedulers here: Threading and Schedulers
Yes, you can achieve asynchronous processing in Spring without WebFlux by using its built-in @Async processing. Here is how you can achieve it.
Step 1: Enable async and define a bean for the Executor. You can define a separate configuration class or put it directly in the main application class.
@SpringBootApplication
@EnableAsync
public class Application {

    public static void main(String[] args) {
        // close the application context to shut down the custom ExecutorService
        SpringApplication.run(Application.class, args).close();
    }

    @Bean
    public Executor asyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(2);
        executor.setMaxPoolSize(2);
        executor.setQueueCapacity(500);
        executor.setThreadNamePrefix("GithubLookup-");
        executor.initialize();
        return executor;
    }
}
Step 2:
The following is the simple way to configure a method with a void return type to run asynchronously. You can also retrieve the result of the asynchronous process by returning a Future object.
@Async
public void asyncMethodWithVoidReturnType() {
    System.out.println("Execute method asynchronously. " + Thread.currentThread().getName());
}
For more information, you can visit the official Spring guide: Spring Async
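The Future-based variant mentioned above can be sketched with the plain JDK's CompletableFuture, which is also the return type Spring's @Async supports for result-bearing methods (in Spring you would annotate the method with @Async and return CompletableFuture.completedFuture(...)). The class below is an illustrative stand-alone demo, not the Spring API itself:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncResultDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(2);

        // start the computation asynchronously on the pool
        CompletableFuture<String> future =
                CompletableFuture.supplyAsync(() -> "result from " + Thread.currentThread().getName(), pool);

        // the caller can keep working, then block only when it needs the value
        String value = future.join();
        System.out.println(value);

        pool.shutdown();
    }
}
```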

Can Spring provide a concept like a job queue?

For example: I have 25 jobs and I want to execute 3 jobs concurrently; after one of the three jobs completes, the next one is picked up from the queue.
You can do this with classes in the standard Java library; you don't need Spring for this. Use an ExecutorService, for example:
class MyJob implements Runnable {

    private final String message;

    MyJob(String message) {
        this.message = message;
    }

    @Override
    public void run() {
        System.out.println(message);
    }
}

public class Example {

    public static void main(String[] args) {
        // Executor service with 3 threads
        ExecutorService executorService = Executors.newFixedThreadPool(3);

        // Submit jobs to be executed
        executorService.execute(new MyJob("testing"));
        executorService.execute(new MyJob("one"));
        executorService.execute(new MyJob("two"));
        executorService.execute(new MyJob("three"));

        // ...
    }
}
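The "// ..." above would typically end with an orderly shutdown, so that jobs still waiting in the queue run to completion before the program exits. A minimal runnable sketch (with illustrative lambda jobs instead of the MyJob class):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class QueueShutdownExample {
    public static void main(String[] args) throws InterruptedException {
        // 3 worker threads, so at most 3 jobs run concurrently
        ExecutorService executorService = Executors.newFixedThreadPool(3);

        // submit more jobs than threads; the extra ones wait in the internal queue
        for (int i = 1; i <= 6; i++) {
            int jobNumber = i;
            executorService.execute(() -> System.out.println("job " + jobNumber + " done"));
        }

        // stop accepting new jobs, but let already-queued ones finish
        executorService.shutdown();
        executorService.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println("all jobs finished");
    }
}
```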
Yes, Spring provides support for job scheduling through the Quartz Scheduler. For more information about how Spring uses Quartz, you can go through the official Spring documentation.
Apart from this, if you want a ready-made example, you can go through Spring 3 + Quartz Scheduler and Spring 4 + Quartz Scheduler.
I suggest you use Spring Boot.
Here is a good start with scheduling using annotations with Spring:
https://spring.io/guides/gs/scheduling-tasks/
Here is a nice introduction to Spring Boot with Quartz: http://de.slideshare.net/davidkiss/spring-boot-with-quartz
Good luck!

Spring Boot ActiveMQ: keep receiver running periodically

I have configured a Spring Boot application which, when run, reads messages from the queue and processes them accordingly.
I have also configured the concurrency flag to run multiple such readers.
However, in an ideal world I would like the receiver to keep running like a thread and keep checking for any messages.
My question is whether there is any way I can configure this in Spring Boot, or whether I have to fall back to a threading mechanism using an executor or something else.
Thanks,
- Vaibhav
I found a nice way with Spring Boot. The concurrency was of course taken care of by the concurrency attribute, e.g.
@JmsListener(destination = "myqueue", concurrency = "2-10")
However, for the thread part, the following is a neat way:
@SpringBootApplication
@EnableAutoConfiguration(exclude={MongoAutoConfiguration.class, MongoDataAutoConfiguration.class})
@EnableJms
public class MyApplication implements CommandLineRunner {

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }

    @Override
    public void run(String... arg0) throws Exception {
        System.out.println("Joining Thread ctrl+c to bring down application");
        Thread.currentThread().join();
    }
}

How to select which Spring Batch job to run based on an application argument (Spring Boot Java config)

I have two independent Spring Batch jobs in the same project because I want to use the same infrastructure-related beans. Everything is configured in Java. I would like to know if there's a proper way to start the jobs independently, based for example on the first application argument in the main method. If I run SpringApplication.run, only the second job gets executed, as if by magic.
The main method looks like:
@ComponentScan
@EnableAutoConfiguration
public class Application {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(Application.class);
        app.setWebEnvironment(false);
        ApplicationContext ctx = app.run(args);
    }
}
and the two jobs are configured as presented in the Spring Batch Getting Started tutorial on spring.io. Here is the configuration file of the first job; the second is configured in the same way.
@Configuration
@EnableBatchProcessing
@Import({StandaloneInfrastructureConfiguration.class, ServicesConfiguration.class})
public class AddPodcastJobConfiguration {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    // reader, writer, processor...
}
To enable modularization I created an AppConfig class, where I define factories for the two jobs:
@Configuration
@EnableBatchProcessing(modular=true)
public class AppConfig {

    @Bean
    public ApplicationContextFactory addNewPodcastJobs() {
        return new GenericApplicationContextFactory(AddPodcastJobConfiguration.class);
    }

    @Bean
    public ApplicationContextFactory newEpisodesNotificationJobs() {
        return new GenericApplicationContextFactory(NotifySubscribersJobConfiguration.class);
    }
}
P.S. I am new to Java-based Spring configuration, Spring Boot, and Spring Batch...
Just set the spring.batch.job.names=myJob property. You could set it as a system property when you launch your application (-Dspring.batch.job.names=myJob). If you have defined this property, the spring-batch starter will only launch the jobs defined by it.
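As an application.properties sketch (the job name myJob is the placeholder from the answer):

```properties
# Only launch jobs whose names match this comma-separated list of patterns
spring.batch.job.names=myJob
```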
To run the jobs you like from the main method, you can load the required job configuration bean and the JobLauncher from the application context and then run it:
@ComponentScan
@EnableAutoConfiguration
public class ApplicationWithJobLauncher {

    public static void main(String[] args) throws BeansException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException, JobParametersInvalidException, InterruptedException {
        Log log = LogFactory.getLog(ApplicationWithJobLauncher.class);
        SpringApplication app = new SpringApplication(ApplicationWithJobLauncher.class);
        app.setWebEnvironment(false);
        ConfigurableApplicationContext ctx = app.run(args);
        JobLauncher jobLauncher = ctx.getBean(JobLauncher.class);
        JobParameters jobParameters = new JobParametersBuilder()
                .addDate("date", new Date())
                .toJobParameters();
        if ("1".equals(args[0])) {
            // addNewPodcastJob
            Job addNewPodcastJob = ctx.getBean("addNewPodcastJob", Job.class);
            JobExecution jobExecution = jobLauncher.run(addNewPodcastJob, jobParameters);
        } else {
            jobLauncher.run(ctx.getBean("newEpisodesNotificationJob", Job.class), jobParameters);
        }
        System.exit(0);
    }
}
What was causing me a lot of confusion was that the second job was executed even though the first job seemed to be "picked up" by the runner... Well, the problem was that in both jobs' configuration files I had used the standard method names writer(), reader(), processor() and step(), and Spring used the ones from the second job, which seemed to "overwrite" the ones from the first job without any warning...
I did use an application config class with @EnableBatchProcessing(modular=true), which I thought would be picked up magically by Spring Boot:
@Configuration
@EnableBatchProcessing(modular=true)
public class AppConfig {

    @Bean
    public ApplicationContextFactory addNewPodcastJobs() {
        return new GenericApplicationContextFactory(AddPodcastJobConfiguration.class);
    }

    @Bean
    public ApplicationContextFactory newEpisodesNotificationJobs() {
        return new GenericApplicationContextFactory(NotifySubscribersJobConfiguration.class);
    }
}
I will write a blog post about it when it is ready, but until then the code is available at https://github.com/podcastpedia/podcastpedia-batch (work/learning in progress)...
There is the CommandLineJobRunner, which may also be helpful.
From its Javadoc:
Basic launcher for starting jobs from the command line
Spring Batch auto-configuration is enabled by adding @EnableBatchProcessing (from Spring Batch) somewhere in your context. By default it executes all Jobs in the application context on startup (see JobLauncherCommandLineRunner for details). You can narrow down to a specific job or jobs by specifying spring.batch.job.names (comma-separated job name patterns).
-- Spring Boot Doc
Or disable the auto-execution and run the jobs programmatically from the context using a JobLauncher, based on the args passed to the main method.
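Disabling the auto-execution mentioned above is done with a single Spring Boot property; with it set, jobs only run when launched explicitly through a JobLauncher:

```properties
# Prevent the Batch starter from running all jobs automatically at startup
spring.batch.job.enabled=false
```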
