Shutting down TaskScheduler does not stop running @Scheduled method - Spring

I have a class with a method annotated with @Scheduled:
@Component
@Slf4j
public class MyScheduler {

    @Scheduled(cron = "${polling-job-cron}") // each minute
    public void pollingJob() {
        log.info("starting polling job...");
        // some work
        log.info("polling job finished.");
    }
}
and a configuration for taskScheduler:
@Bean
public ThreadPoolTaskScheduler taskScheduler() {
    ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
    scheduler.setPoolSize(5);
    scheduler.setThreadNamePrefix("mynameofscheduler");
    scheduler.setWaitForTasksToCompleteOnShutdown(true);
    scheduler.setAwaitTerminationSeconds(30);
    scheduler.setRejectedExecutionHandler(new ThreadPoolExecutor.AbortPolicy());
    return scheduler;
}
I'm trying to implement graceful shutdown using a class which waits for ContextClosedEvent:
@Component
@Slf4j
public class GracefulShutdown implements ApplicationListener<ContextClosedEvent> {

    private final ApplicationContext context;
    private final ThreadPoolTaskScheduler taskScheduler;

    public GracefulShutdown(ApplicationContext context,
                            ThreadPoolTaskScheduler taskScheduler) {
        this.context = context;
        this.taskScheduler = taskScheduler;
    }

    @Override
    public void onApplicationEvent(ContextClosedEvent event) {
        log.info("Graceful shutdown - start");
        log.info("Closing task scheduler");
        taskScheduler.shutdown(); //1
        taskScheduler.getScheduledThreadPoolExecutor().shutdown(); //2
        log.error("Closed task scheduler");
        // give k8s a chance to hit the readinessProbe and stop sending requests to this pod
        try {
            Thread.sleep(80000); //3
        } catch (InterruptedException error) {
            log.info("error while trying to sleep");
            error.printStackTrace();
        }
        log.info("Closing spring context with startup date, {}, parent: {}, id: {}, name: {}",
                context.getStartupDate(), context.getParent(), context.getId(), context.getDisplayName());
        ((ConfigurableApplicationContext) context).close();
        log.info("Graceful shutdown - end");
    }
}
Even though I'm closing the taskScheduler and the underlying executor, new tasks are still run by @Scheduled. The GracefulShutdown code runs when SIGTERM is sent, and other than closing the taskScheduler it works fine:
Graceful shutdown - start
Closing task scheduler
Closed task scheduler
starting polling job...
polling job finished
starting polling job...
polling job finished.
The thread name prefix is logged in front of those lines (I've cut that above as the lines were too long to read):
{"timeMillis":1534234560001,"thread":"mynameofscheduler","level":"INFO","loggerName":"myclassr","message":"starting polling job..."
I thought that maybe some other taskScheduler was being used and I was shutting down the wrong one, but it's all mynameofscheduler, which is configured in the @Bean above.

Thanks to M. Deinum: I was messing up Spring's shutdown flow. I fixed it by registering a shutdown hook:
public static void main(String[] args) {
    ConfigurableApplicationContext context = SpringApplication.run(AgileStreamApplication.class, args);
    Runtime.getRuntime().addShutdownHook(new Thread(new GracefulShutdownHook(context)));
}
Now I don't have to shut down the task schedulers explicitly; Spring does it.
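The GracefulShutdownHook class itself isn't shown in the post; a minimal sketch of what it might look like, assuming it only delays the context close so k8s stops routing traffic first:

// Hypothetical sketch; the original GracefulShutdownHook is not shown in the post.
public class GracefulShutdownHook implements Runnable {

    private final ConfigurableApplicationContext context;

    public GracefulShutdownHook(ConfigurableApplicationContext context) {
        this.context = context;
    }

    @Override
    public void run() {
        try {
            // Give k8s time to stop routing traffic to this pod before closing the context.
            Thread.sleep(30_000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        // Closing the context triggers Spring's own destruction callbacks,
        // including ThreadPoolTaskScheduler.destroy(), so no manual executor shutdown is needed.
        context.close();
    }
}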

This happens because, by default, the ScheduledThreadPoolExecutor waits on shutdown for all delayed scheduled tasks to finish executing, even if those tasks aren't running at that moment.
Try this below:
@Bean
public ThreadPoolTaskScheduler taskScheduler() {
    ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler() {
        private static final long serialVersionUID = -1L;

        @Override
        public void destroy() {
            this.getScheduledThreadPoolExecutor().setExecuteExistingDelayedTasksAfterShutdownPolicy(false);
            super.destroy();
        }
    };
    scheduler.setPoolSize(5);
    scheduler.setThreadNamePrefix("mynameofscheduler");
    scheduler.setWaitForTasksToCompleteOnShutdown(true);
    scheduler.setAwaitTerminationSeconds(30);
    scheduler.setRejectedExecutionHandler(new ThreadPoolExecutor.AbortPolicy());
    return scheduler;
}
Then the ScheduledThreadPoolExecutor will only wait for scheduled tasks which are currently running to finish executing.
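For reference, the flag used above is plain JDK API; a minimal stand-alone sketch of what it changes:

import java.util.concurrent.ScheduledThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ShutdownPolicyDemo {
    public static void main(String[] args) {
        ScheduledThreadPoolExecutor executor = new ScheduledThreadPoolExecutor(1);
        executor.schedule(() -> System.out.println("delayed task ran"), 10, TimeUnit.SECONDS);

        // With the default policy (true), shutdown() would keep the executor alive
        // until the 10-second delayed task has fired and finished.
        executor.setExecuteExistingDelayedTasksAfterShutdownPolicy(false);

        executor.shutdown(); // now the queued delayed task is discarded immediately
    }
}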

Related

Spring @Scheduled annotation does not work

I want a Spring scheduled task that runs every 10 seconds; however, for some reason the task runs only once and is never repeated.
Please do not suggest other types of tasks, because I specifically need to use Spring tasks.
@Scheduled(fixedRate = 10000, initialDelay = 1000)
public void myTask() {
    ...
}
In my main config class I have @EnableScheduling added as well.
Scheduling executes tasks at a specific time or interval, but you are looking to run them concurrently, so a couple of changes are needed.
Create a config class that manages the scheduler's thread pool, making use of ThreadPoolTaskScheduler:
@EnableScheduling
@Configuration
public class TaskConfig implements SchedulingConfigurer {

    @Override
    public void configureTasks(ScheduledTaskRegistrar scheduledTaskRegistrar) {
        ThreadPoolTaskScheduler threadPoolTaskScheduler = new ThreadPoolTaskScheduler();
        threadPoolTaskScheduler.setPoolSize(10);
        threadPoolTaskScheduler.setThreadNamePrefix("your-scheduler-");
        threadPoolTaskScheduler.initialize();
        scheduledTaskRegistrar.setTaskScheduler(threadPoolTaskScheduler);
    }
}
Then you can run jobs concurrently, as below:
@Component
public class HelloSender {

    @Scheduled(fixedRate = 10000)
    public void myTask() {
        System.out.println("im running asynchronous with Worker : " + Thread.currentThread().getName());
    }
}
For further information about ThreadPoolTaskScheduler, please have a look here: https://docs.spring.io/spring-framework/docs/3.0.x/spring-framework-reference/html/scheduling.html
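One caveat worth adding (my note, not from the answer above): a bigger pool lets different @Scheduled methods run in parallel, but a single fixedRate method still won't overlap with itself. If you need that, a common approach is to combine scheduling with @Async; a minimal sketch, reusing the HelloSender from the answer:

@Configuration
@EnableScheduling
@EnableAsync
public class AsyncSchedulingConfig {
}

@Component
public class HelloSender {

    // @Async hands each trigger to the async executor, so a slow run
    // no longer blocks or delays the next scheduled invocation.
    @Async
    @Scheduled(fixedRate = 10000)
    public void myTask() {
        System.out.println("im running asynchronous with Worker : " + Thread.currentThread().getName());
    }
}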

How to configure graceful shutdown using DelegatingSecurityContextScheduledExecutorService with Spring

I'm trying to use the new graceful shutdown options introduced in Spring Boot 2.3, but I'm struggling to make my scheduled tasks behave the same way.
As I need a valid user in the context during scheduled tasks execution, I am using DelegatingSecurityContextScheduledExecutorService to achieve this goal.
Here is a sample of my implementation of SchedulingConfigurer:
@Configuration
@EnableScheduling
public class ContextSchedulingConfiguration implements SchedulingConfigurer {

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(taskExecutor());
    }

    @Bean
    public TaskSchedulerCustomizer taskSchedulerCustomizer() {
        return taskScheduler -> {
            taskScheduler.setAwaitTerminationSeconds(120);
            taskScheduler.setWaitForTasksToCompleteOnShutdown(true);
            taskScheduler.setPoolSize(2);
        };
    }

    @Bean
    public Executor taskExecutor() {
        ThreadPoolTaskScheduler threadPool = new ThreadPoolTaskScheduler();
        taskSchedulerCustomizer().customize(threadPool);
        threadPool.initialize();
        threadPool.setThreadNamePrefix("XXXXXXXXX");

        SecurityContext schedulerContext = createSchedulerSecurityContext();
        return new DelegatingSecurityContextScheduledExecutorService(threadPool.getScheduledExecutor(), schedulerContext);
    }

    private SecurityContext createSchedulerSecurityContext() {
        // This is just an example, the actual code makes several changes to the context.
        return SecurityContextHolder.createEmptyContext();
    }

    @Scheduled(initialDelay = 5000, fixedDelay = 15000)
    public void run() throws InterruptedException {
        System.out.println("Started at: " + LocalDateTime.now().toString());
        long until = System.currentTimeMillis() + TimeUnit.SECONDS.toMillis(30);
        while (System.currentTimeMillis() < until) {}
        System.out.println("Ended at: " + LocalDateTime.now().toString());
    }
}
But when I send a termination signal while the scheduled task is running, the application does not wait for the task.
If in my taskExecutor bean I replace the last two lines, returning the ThreadPoolTaskScheduler without a security context, everything works as expected. It only fails when I return the DelegatingSecurityContextScheduledExecutorService.
How can I set the context for the taskExecutor and at the same time configure it to wait for tasks to complete on shutdown?
I have already tried several variations of this code, using other implementations of the TaskScheduler and TaskExecutor interfaces, but without success.
For starters, clean up your code and use the proper return types in the bean methods (be specific), and expose both as beans (marking one as @Primary!).
@Configuration
@EnableScheduling
public class ContextSchedulingConfiguration implements SchedulingConfigurer {

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(securitytaskScheduler());
    }

    @Bean
    public ThreadPoolTaskScheduler taskScheduler() {
        ThreadPoolTaskScheduler taskScheduler = new ThreadPoolTaskScheduler();
        taskScheduler.setAwaitTerminationSeconds(120);
        taskScheduler.setWaitForTasksToCompleteOnShutdown(true);
        taskScheduler.setPoolSize(2);
        taskScheduler.setThreadNamePrefix("XXXXXXXXX");
        return taskScheduler;
    }

    @Bean
    @Primary
    public DelegatingSecurityContextScheduledExecutorService securitytaskScheduler() {
        SecurityContext schedulerContext = createSchedulerSecurityContext();
        return new DelegatingSecurityContextScheduledExecutorService(taskScheduler().getScheduledExecutor(), schedulerContext);
    }

    private SecurityContext createSchedulerSecurityContext() {
        // This is just an example, the actual code makes several changes to the context.
        return SecurityContextHolder.createEmptyContext();
    }

    @Scheduled(initialDelay = 5000, fixedDelay = 15000)
    public void run() throws InterruptedException {
        System.out.println("Started at: " + LocalDateTime.now().toString());
        long until = System.currentTimeMillis() + TimeUnit.SECONDS.toMillis(30);
        while (System.currentTimeMillis() < until) {}
        System.out.println("Ended at: " + LocalDateTime.now().toString());
    }
}
The important thing is to be as specific in your return types as possible. Configuration classes are detected early on, and the return types are checked to determine the callbacks to be made. A ThreadPoolTaskScheduler is a DisposableBean; a plain Executor is not and will not receive destruction callbacks as such!
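To make that concrete, a sketch of the contrast the answer describes (the comments restate the answer's reasoning, not independently verified behavior):

// Per the answer: Spring inspects @Bean return types early on.

@Bean
public Executor plainExecutor() {
    // Declared as a plain Executor, this gives the container no hint that the
    // bean needs lifecycle handling: Executor is not a DisposableBean.
    return new ThreadPoolTaskScheduler();
}

@Bean
public ThreadPoolTaskScheduler specificScheduler() {
    // Declared with the specific type, which implements DisposableBean,
    // so destroy() and the graceful-shutdown settings take effect.
    return new ThreadPoolTaskScheduler();
}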

How to use WatchService (NIO package) & @Scheduled at the same time in a single Spring Boot project?

I have written an application in which I have a scheduler class that does something after every certain time interval, and another class in which I am watching a folder continuously for the occurrence of any new file. Both jobs (scheduler + WatchService) have to be endless.
But they are not getting called concurrently.
The scheduler class is called via @Scheduled and @ComponentScan(basePackages = "com.project.schedular").
The WatchService is called via @PostConstruct on the method.
I have already tried putting @PostConstruct on both and putting both packages in @ComponentScan({"com.project.schedular","com.project.watcher"}).
I also tried putting @Async on both methods.
Main Class:
@SpringBootApplication
@EnableScheduling
@ComponentScan(basePackages = "com.aprstc.schedular")
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}

@Component
public class SchedularClass {

    @PostConstruct
    @Scheduled(fixedRate = 30000)
    public void execute() {
        // logic of scheduling method
    }
}
Watcher Class:
@Component
public class WaybillReadScript {

    @PostConstruct
    public void watchFolder() throws InterruptedException, IOException {
        System.out.println("Into the watch Folder.");
        WatchService watchService = FileSystems.getDefault().newWatchService();
        System.out.println(2);
        Path path = Paths.get("/home/mypc-630/Work/abc");
        System.out.println(3);
        try {
            path.register(watchService, StandardWatchEventKinds.ENTRY_CREATE);
        } catch (IOException e) {
            e.printStackTrace();
        }
        WatchKey key;
        while ((key = watchService.take()) != null) {
            for (WatchEvent<?> event : key.pollEvents()) {
                if (event.context().toString().equalsIgnoreCase("wbill.txt"))
                    processWaybillFile();
            }
            key.reset();
        }
    }
}
I expect both classes to run concurrently: the watcher must keep watching continuously, and the scheduler must keep doing its scheduled job.
I think @PostConstruct is the wrong place. @PostConstruct is used to initialize your beans/components. If you make a blocking call like watchService.take(), the @PostConstruct method is never left; and since not all beans are completely created, your application, including the scheduler, will not start.
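One way to act on that advice (my sketch, not from the answer) is to start the blocking watch loop on its own thread once the application is fully started, for example from an ApplicationReadyEvent listener:

import java.io.IOException;
import java.nio.file.*;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;

@Component
public class FolderWatcher {

    // Runs once the context is fully started, so bean creation is never blocked.
    @EventListener(ApplicationReadyEvent.class)
    public void startWatching() {
        Thread watcher = new Thread(this::watchFolder, "folder-watcher");
        watcher.setDaemon(true);
        watcher.start();
    }

    private void watchFolder() {
        try {
            WatchService watchService = FileSystems.getDefault().newWatchService();
            Paths.get("/home/mypc-630/Work/abc")
                 .register(watchService, StandardWatchEventKinds.ENTRY_CREATE);
            WatchKey key;
            while ((key = watchService.take()) != null) {
                for (WatchEvent<?> event : key.pollEvents()) {
                    if (event.context().toString().equalsIgnoreCase("wbill.txt")) {
                        processWaybillFile();
                    }
                }
                key.reset();
            }
        } catch (IOException e) {
            e.printStackTrace();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // the watcher thread was asked to stop
        }
    }

    private void processWaybillFile() {
        // file-processing logic goes here (same as in the question)
    }
}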

Spring Cloud Stream - First Kafka messages get error "Dispatcher has no subscribers"

My app successfully sends Kafka messages, but only after Kafka is initialized. Before that I get the error "Dispatcher has no subscribers". How do I wait for subscribers to finish being registered for channels?
Here's a trace of the order of events (timing in seconds.ms):
17.165 SenderClass created
17.816 initialization class, #PostConstruct starts PollingTask
24.781 PollingTask sends first Kafka message
24.816 First error: "Dispatcher has no subscribers"
25.778 Registering MessageChannel my-channel
still seeing Dispatcher errors
27.067 Channel 'my-channel' has 1 subscriber
No more errors after this, messages send fine
I'm not sure how to approach this. Wild guesses have included:
Place sending code in @PostConstruct
Add @AutoConfigureBefore(BindingServiceConfiguration.class) to Sender
Add @AutoConfigureAfter(BindingServiceConfiguration.class) to SenderClass
Add @AutoConfigureBefore(BindingServiceConfiguration.class) to Main
Place @DependsOn({"EnableBindingClass"}) on Task
Place @DependsOn({"ApplicationLifeCycle"}) on SchedulerClass, where ApplicationLifeCycle is a class that does nothing but implements SmartLifecycle with getPhase returning MAX_INT
Making sure ComponentScan is on for the whole package (a suggestion from other SO threads)
Various combinations of the above
I created a new app and made it as simple as I could:
public interface Source {
    @Output(channelName)
    MessageChannel outboundChannel();
}

@EnableBinding(Source.class)
@Component
public class Sender {

    @Autowired
    private Source source;

    public boolean send(SomeObject object) {
        return source.outboundChannel().send(MessageBuilder.withPayload(object).build());
    }
}

@Service
public class Scheduler {

    @Autowired
    Sender sender;

    @Autowired
    ThreadPoolTaskScheduler taskScheduler;

    @PostConstruct
    public void initialize() {
        taskScheduler.schedule(new PollingTask(), nextTime);
    }

    private class PollingTask implements Runnable {
        @Override
        public void run() {
            List<SomeObject> objects = getDummyData();
            for (SomeObject object : objects) {
                sender.send(object);
            }
            Instant nextTime = Instant.now().plusMillis(1_000L);
            try {
                taskScheduler.schedule(new PollingTask(), nextTime);
            } catch (Exception e) {
                logger.error(e);
            }
        }
    }
}
Edit to add Solution
It works now! In my scheduler that starts the things that send the messages, I switched from starting things in @PostConstruct to SmartLifecycle::start().
@Service
public class Scheduler implements SmartLifecycle {

    @Autowired
    Sender sender;

    @Autowired
    ThreadPoolTaskScheduler taskScheduler;

    private volatile boolean running = false;

    @Override
    public void start() {
        running = true;
        taskScheduler.schedule(new PollingTask(), nextTime);
    }

    // stop() and isRunning() are required by the Lifecycle interface
    // (omitted in the original post); minimal implementations shown here.
    @Override
    public void stop() {
        running = false;
    }

    @Override
    public boolean isRunning() {
        return running;
    }

    private class PollingTask implements Runnable {
        @Override
        public void run() {
            List<SomeObject> objects = getDummyData();
            for (SomeObject object : objects) {
                sender.send(object);
            }
            Instant nextTime = Instant.now().plusMillis(1_000L);
            try {
                taskScheduler.schedule(new PollingTask(), nextTime);
            } catch (Exception e) {
                logger.error(e);
            }
        }
    }
}
@PostConstruct is too early to send messages; the context is still being built. Implement SmartLifecycle, put the bean in a high phase (Integer.MAX_VALUE), and do the sends in start().
Or do the sends in an ApplicationRunner.
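A minimal sketch of that alternative (my sketch; it assumes PollingTask is refactored into a standalone Runnable rather than the inner class from the question):

import java.time.Instant;
import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler;
import org.springframework.stereotype.Component;

@Component
public class PollingStarter implements ApplicationRunner {

    private final ThreadPoolTaskScheduler taskScheduler;

    public PollingStarter(ThreadPoolTaskScheduler taskScheduler) {
        this.taskScheduler = taskScheduler;
    }

    // run() is invoked only after the context is fully refreshed,
    // so all binding/dispatcher subscriptions are already registered.
    @Override
    public void run(ApplicationArguments args) {
        taskScheduler.schedule(new PollingTask(), Instant.now().plusMillis(1_000L));
    }
}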
I faced a similar problem in WebFlux + Spring Cloud Stream functional style. Spring Cloud Function in 2022 is the preferred way.
My hypothesis after a lot of debugging was that beans were not created in the right order. The bean was probably not registered in spring-cloud-stream's dispatchers before Kafka message processing started, similar to what @gary mentioned.
So I added @Order(1) to my consumer beans, hoping they would be created before dispatcher registration starts.
@Bean
@Order(1)
public Function<Flux<Message<Pojo>>, Mono<Void>> pojoConsumer() {
This seems to fix my issue for now.

Test only one job out of many spring batch

I have two jobs, and I am trying to test a single one. This is what I am trying:
@Autowired
private JobLauncherTestUtils jobLauncherTestUtils;

@Autowired
@Qualifier("jobNumber1")
private Job job;

@Test
public void test() {
    try {
        jobLauncherTestUtils
                .getJobLauncher()
                .run(job, new JobParametersBuilder()
                        .addString("--spring.batch.job.names", "jobNumber1")
                        .toJobParameters());
    } catch (Exception e) {
        e.printStackTrace();
    }
}
But when I look at the logs, it is running both jobs. How do I make it run only the one job? Thanks.
I have also tried to set the Job in JobLauncherTestUtils:
@Bean
public JobLauncherTestUtils jobLauncherTestUtils() throws Exception {
    return new JobLauncherTestUtils() {
        @Override
        @Autowired
        public void setJob(@Qualifier("jobNumber1") Job job) {
            super.setJob(job);
        }
    };
}
and calling jobLauncherTestUtils.launchJob(). Still, both jobs are running.
You are passing a Spring Boot parameter (--spring.batch.job.names) as a Spring Batch parameter. So Spring Boot is not aware of it and will still run both jobs. You need to either:
pass the --spring.batch.job.names=jobNumber1 to the command line you are using to test your job
or add the spring.batch.job.names=jobNumber1 in the application.properties file of your test resources
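For the test-resources route, the property can also be set directly on the test class; a hypothetical sketch (the property name is the real Spring Boot property, the class and job names follow the question):

@SpringBootTest(properties = "spring.batch.job.names=jobNumber1")
public class JobNumber1Test {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils; // configured with jobNumber1 as in the question

    @Test
    public void runsOnlyJobNumber1() throws Exception {
        // With the property above, Spring Boot's startup runner only considers jobNumber1,
        // so the second job is no longer triggered when the context starts.
        jobLauncherTestUtils.launchJob();
    }
}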
Hope this helps.
