Spring Batch Process Questions

Two questions on Spring Batch; can someone please shed more light on these?
1) I have implemented registerShutdownHook in my Spring Batch project, but when I kill my batch process it does not stop immediately; it waits until the entire batch process is completed. Is that how it works?
public static void main(String[] args) {
    // Scan both base packages; the original passed one escaped string,
    // which would be treated as a single (invalid) package name.
    final AbstractApplicationContext appContext =
            new AnnotationConfigApplicationContext("com.lexisnexis.batch", "com.lexisnexis.rules");
    appContext.registerShutdownHook();
    ...
}
Does it need to stop all running batches when we kill the process, given this registerShutdownHook code?
2) What is the best way to restart all stopped jobs?

Yes, that's how it works. The shutdown hook is needed to close the Spring application context gracefully; please check the Spring Framework documentation.
To restart a stopped job you need to invoke jobOperator.restart(executionId), or use jobLauncher.run with the same parameters you used to start the original job.
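For illustration, a minimal sketch of both options; the wiring, the job bean, and the parameter names are assumptions, not part of the original question:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.JobOperator;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class JobRestarter {

    @Autowired
    private JobOperator jobOperator;

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job; // the stopped job; wiring it by type is an assumption

    // Option 1: restart by the id of the stopped/failed execution.
    public Long restartExecution(long stoppedExecutionId) throws Exception {
        return jobOperator.restart(stoppedExecutionId);
    }

    // Option 2: re-launch with the same identifying parameters; Spring Batch
    // resumes the existing JobInstance because it has not completed yet.
    public void relaunch(String inputFile) throws Exception {
        jobLauncher.run(job, new JobParametersBuilder()
                .addString("input.file", inputFile) // same parameters as the original run
                .toJobParameters());
    }
}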

Related

How to restart a Spring Boot application from code

I have a Spring Boot application with embedded Tomcat, and in certain cases it should be restarted from code.
I have read several articles and SO posts regarding this, but have yet to find a clean solution.
I am aware that context.close() and SpringApplication.exit(context) exist and can be wrapped into something like this:
public static void restart() {
    ApplicationArguments args = context.getBean(ApplicationArguments.class);

    // Close the old context and start a new one on a fresh non-daemon thread
    // (context here is a static ConfigurableApplicationContext field).
    Thread thread = new Thread(() -> {
        context.close();
        context = SpringApplication.run(Application.class, args.getSourceArgs());
    });

    thread.setDaemon(false);
    thread.start();
}
source: https://www.baeldung.com/java-restart-spring-boot-app
The problem is that using context.close() just doesn't work in a clean way. The context itself is restarted, but a bunch of threads are left behind in the background (like Thread[pool-3-thread-1,5,main], Thread[Signal Dispatcher,9,system], Thread[OkHttp TaskRunner,5,main], etc.).
For every context restart these are recreated, so the number of threads grows with each restart, resulting in a huge thread mess as time passes.
Note 1: A simple application exit using context.close() also wouldn't work because of these leftover threads, so closing the context doesn't even close the application.
Note 2: If I use System.exit(SpringApplication.exit(context)) I can kill the app gracefully, but I can't restart it.
Note 3: I don't want to use either devtools or actuator.
So the question is how to perform a total restart for a springboot application?
You can use the RestartEndpoint from the spring-cloud-context dependency to restart the Spring Boot application programmatically:
@Autowired
private RestartEndpoint restartEndpoint;
...
Thread restartThread = new Thread(() -> restartEndpoint.restart());
restartThread.setDaemon(false);
restartThread.start();

Running a scheduler in Spring Boot is spawning a process external to the Spring Boot application context

I am scheduling a task that runs at a fixed rate in Spring Boot. The function I am using to schedule the task is below:
private void scheduleTask(Store store, int frequency) {
    // This executor is created outside of Spring's control, so the context
    // cannot shut it down when the application fails to start.
    final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
    Runnable task = store::scan;
    scheduler.scheduleAtFixedRate(task, 0, frequency, TimeUnit.MILLISECONDS);
}
This works fine, but if there is an exception at application startup, the application should exit. What happens instead is that I get the exception in the log and the message "Application failed to start", yet the scheduler shows as still running, although it looks like only the scheduled thread is still alive.
Any hints on how to properly schedule an asynchronous task in a Spring Boot application? I tried the @Scheduled annotation but it does not run at all.
@Scheduled should work. Have you added the @EnableScheduling annotation to a @Configuration class or to the @SpringBootApplication class? The Scheduling Getting Started guide explains it in detail.
Regarding the scheduleTask method: what calls it? Is it started outside the Spring context? If so, Spring won't stop it; you have to manage its lifecycle yourself.
You should try to use @Scheduled, as it manages the thread pools/executors for you and most people will find it easier to understand. A minimal sketch is shown below.
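A minimal sketch of letting Spring manage the schedule; StoreScanner and the 5-second rate are placeholders standing in for the Store from the question. Because Spring owns the executor, it is shut down together with the context:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@SpringBootApplication
@EnableScheduling // enables processing of @Scheduled methods
public class SchedulingApplication {
    public static void main(String[] args) {
        SpringApplication.run(SchedulingApplication.class, args);
    }
}

@Component
class StoreScanner { // hypothetical component wrapping the store scan

    // Runs on Spring's scheduler, so it stops when the context fails or closes.
    @Scheduled(fixedRate = 5000)
    public void scan() {
        // scan the store here
    }
}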

Trigger a Spring Batch job automatically

Is there any option to run a Spring Batch job automatically, without scheduling it or triggering it from another source?
We can run a Spring Batch job via a scheduling expression or from a main method (as below).
public static void runBatch() {
    JobLauncher jobLauncher = (JobLauncher) getApplicationContext().getBean("jobLauncher");
    Job job = (Job) getApplicationContext().getBean("myJob"); // look the job up here; "myJob" is a placeholder bean name
    JobParametersBuilder jobParameters = new JobParametersBuilder();
    // Set job parameters here, then launch:
    JobExecution jobExecution = jobLauncher.run(job, jobParameters.toJobParameters());
}

public static void main(String[] args) {
    runBatch();
}
That means we need to call the main method manually, or have some other scheduler trigger it. Without this main method or a scheduler, can we start this batch process automatically? Or is there a better option?
The Spring Batch job should trigger automatically, without being triggered from any entry point, like a daemon thread.
For example, suppose a job is processing data with Spring Batch: how can the next Spring Batch job be triggered automatically once the running job completes?
Many options exist to trigger the batch:
To call the batch from another scheduler, issue the run parameters there and trigger the job.
If it has to be triggered automatically, say at a certain interval, use fixedDelay scheduling (see the sketch after this list).
To trigger it manually, you can go for the MBean approach, where it can be triggered from JConsole.
Or expose an endpoint that calls runBatch.
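A sketch of the fixedDelay option; bean names and the delay are assumptions. Because the default JobLauncher runs synchronously, each scheduled invocation starts only after the previous execution has finished:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;

@Configuration
@EnableScheduling
public class BatchScheduler {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job;

    // fixedDelay is measured from the end of one invocation to the start of
    // the next, so a new run begins only after the previous one completes.
    @Scheduled(fixedDelay = 60_000)
    public void launch() throws Exception {
        jobLauncher.run(job, new JobParametersBuilder()
                .addLong("run.ts", System.currentTimeMillis()) // unique parameters per run
                .toJobParameters());
    }
}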

Where does Spring Batch save the batch execution state?

I have a Spring Batch job that takes a long time to execute. After it had been running for a while I decided I wanted to stop it, but whenever I restart the server the job continues executing after the server comes back up.
I want to know where Spring Batch saves its state so that I can possibly delete it and stop that from happening.
I found out there are properties I can configure to make the job non-restartable, and I will use that going forward, but for now I just need to make sure the job stops for good.
You can see the documentation here, which shows and describes the Spring Batch meta-data tables (BATCH_JOB_INSTANCE, BATCH_JOB_EXECUTION, BATCH_STEP_EXECUTION, and so on); the execution state lives in those tables in your job repository's database.
If you just want to prevent restart of the job, the config below should help: preventRestart() marks the job as non-restartable.
https://docs.spring.io/spring-batch/trunk/apidocs/org/springframework/batch/core/job/builder/JobBuilderHelper.html
@Bean
public Job myJob(JobBuilderFactory jobs) throws Exception {
    return jobs.get("myJob")
            .flow(step1())
            .build()
            .preventRestart()
            .build();
}
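If an execution is also still marked as running and you want it gone for good, a hedged sketch using JobOperator (the execution id is an assumption; stop() signals a running execution, and abandon() then marks it as non-restartable):

import org.springframework.batch.core.launch.JobOperator;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class JobTerminator {

    @Autowired
    private JobOperator jobOperator;

    // Request a graceful stop of a running execution.
    public void stop(long executionId) throws Exception {
        jobOperator.stop(executionId);
    }

    // Once it has stopped, mark it ABANDONED so it can never be restarted.
    public void abandon(long executionId) throws Exception {
        jobOperator.abandon(executionId);
    }
}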

Using Spring Integration with Spring Batch

I have a Spring Batch application which reads from a file, does some processing, and finally writes customized output. This all happens in one step. In the next step I have a tasklet which archives the input files (moves them to another folder). This application works fine.
But now I have a requirement to SFTP the output files to remote servers, where they will be processed further. I found a way to SFTP using Spring Integration, where I created an input channel which feeds an outbound channel adapter. I put my files as the payload in a message and send the messages to the channel. The only problem I see is that every time I need to get the context, I have to load the Spring config file again, which seems like a hackish way to do the task. Does anyone know of a proper way to integrate Spring Integration with Spring Batch?
Let me know if you want to see my config...
Thanks in Advance !!
Code to access the same application context without loading the Spring config again:
// Must be registered as a Spring bean so that setApplicationContext is called;
// the context is kept in a static field for access from non-Spring code.
public class AppContextProvider implements ApplicationContextAware {

    private static ApplicationContext ctx;

    public ApplicationContext getApplicationContext() {
        return ctx;
    }

    @Override
    public void setApplicationContext(ApplicationContext appContext) throws BeansException {
        ctx = appContext;
    }
}
Code to push the output file to the SFTP server:
log.info("Starting transfer of outputFile : " + absoluteOutputFileName);
final File file = new File(absoluteOutputFileName);
final Message<File> message = MessageBuilder.withPayload(file).build();
AppContextProvider context = new AppContextProvider();
final MessageChannel inputChannel = context.getApplicationContext().getBean("toChannel",MessageChannel.class);
inputChannel.send(message);
log.info("transfer complete for : " + absoluteOutputFileName);
Take a look at the spring-batch-integration module within the Spring Batch project. In there, we have components for launching jobs via messages. In your situation, you'd FTP the file down then have the JobLaunchingMessageHandler launch the job.
You can also watch this video of a talk I co-presented at SpringOne a couple years ago on this topic: https://www.youtube.com/watch?v=8tiqeV07XlI
As Michael said, you'll definitely want to look at and leverage spring-batch-integration. We actually use Spring Integration as a wrapper of sorts to launch 100% of our Spring Batch jobs.
One use case we've found particularly useful is leveraging the spring-integration-file Inbound Channel Adapters to poll staging directories to indicate when a new batch file has landed. As the poller finds a new file, we then launch a new batch job using the input filename as a parameter.
This has been a real help when it comes to restartability, because we now have one job instance per file as opposed to having a job kick off at arbitrary intervals and then partition across however many files happen to be in the staging folder. Now if an exception occurs during processing, you can target a specific job for restart immediately rather than waiting for 99 of the 100 "good" files to finish first.
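A sketch of that pattern with Spring Integration's Java DSL; the directory, file pattern, polling interval, and parameter name are all assumptions. It uses JobLaunchingGateway, the MessageHandler-style sibling of the JobLaunchingMessageHandler mentioned above:

import java.io.File;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.integration.launch.JobLaunchRequest;
import org.springframework.batch.integration.launch.JobLaunchingGateway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.file.dsl.Files;

@Configuration
public class FileToJobFlowConfig {

    @Bean
    public IntegrationFlow fileToJobFlow(Job job, JobLauncher jobLauncher) {
        return IntegrationFlows
                // Poll the staging directory for new batch files.
                .from(Files.inboundAdapter(new File("/staging")).patternFilter("*.csv"),
                        e -> e.poller(Pollers.fixedDelay(10_000)))
                // One JobLaunchRequest per file, with the filename as an
                // identifying parameter, giving one JobInstance per file.
                .transform(File.class, file -> new JobLaunchRequest(job,
                        new JobParametersBuilder()
                                .addString("input.file.name", file.getAbsolutePath())
                                .toJobParameters()))
                .handle(new JobLaunchingGateway(jobLauncher))
                // The gateway replies with the JobExecution; log and discard it.
                .handle(m -> System.out.println("Launched: " + m.getPayload()))
                .get();
    }
}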
