How can I stop my whole job from a parallel flow? (Spring Batch)

I have a job with 2 flows
Flow parallelFlow = new FlowBuilder<SimpleFlow>("parallelFlow")
        .split(new SimpleAsyncTaskExecutor())
        .add(flow1, flow2)
        .build();
flow1 contains step1, which stops the job using stepExecution.setTerminateOnly() after 5 seconds:
public class Step1 implements Tasklet {

    @Override
    public RepeatStatus execute(@NotNull StepContribution stepContribution, @NotNull ChunkContext chunkContext) throws InterruptedException, NoSuchJobExecutionException, JobExecutionNotRunningException {
        Thread.sleep(5000);
        stepContribution.getStepExecution().setTerminateOnly();
        return RepeatStatus.FINISHED;
    }
}
flow2 contains step2, which only waits 10 seconds, and step3 with some logic:
@AllArgsConstructor
public class Step2 implements Tasklet {

    @Override
    public RepeatStatus execute(@NotNull StepContribution stepContribution, @NotNull ChunkContext chunkContext) throws InterruptedException {
        Thread.sleep(10000);
        return RepeatStatus.FINISHED;
    }
}
I tried to stop the job, expecting the job to be stopped and step3 never to be executed.
In this case the job was stopped, but step3 was still executed.
However, if I change the flow order to flow2, flow1, it works like a charm!
Flow parallelFlow = new FlowBuilder<SimpleFlow>("parallelFlow")
        .split(new SimpleAsyncTaskExecutor())
        .add(flow2, flow1)
        .build();
I cannot understand what I'm doing wrong.
Java version: 11, Spring Batch version: 4.2.4.RELEASE

Related

How to stop a service in Spring Batch

As the title says, I have a service running in the backend using Spring Batch.
My service:
@Service
public class TestBatch {

    public void testDelay(String jobID) {
        // TODO Auto-generated method stub
        try {
            for (int i = 0; i < 1000; i++) {
                Thread.sleep(1000);
                System.out.println(jobID + " is running");
            }
        } catch (InterruptedException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
My tasklet:
public class TestTasklet implements Tasklet {

    @Resource
    private TestBatch testBatch;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        testBatch.testDelay("Test01"); // Param to show in the console log
        return RepeatStatus.FINISHED;
    }
}
I'm trying to stop the job:
@Service
public class JobService {

    @Autowired
    private SimpleJobOperator simpleJobOperator;

    @Autowired
    private JobExplorer jobs;

    public void stopJob(Long executionId) throws NoSuchJobExecutionException, JobExecutionNotRunningException {
        simpleJobOperator.stop(executionId); // Using the job operator
        JobExecution jobExecution = jobs.getJobExecution(executionId); // Using the job execution
        jobExecution.stop();
    }
}
The job is stopped, but my console still prints:
Test01 is running
Test01 is running
Test01 is running
...
I don't know how to stop TestBatch's testDelay() method when the job stops. How can I do it?
You need to use StoppableTasklet and not Tasklet. Spring Batch will call StoppableTasklet#stop when the job is requested to stop through the JobOperator.
However, it is up to you to make sure your code stops correctly; here is an excerpt from the Javadoc:
It is up to each implementation as to how the stop will behave.
The only guarantee provided by the framework is that a call to JobOperator.stop(long)
will attempt to call the stop method on any currently running StoppableTasklet.
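For illustration, here is a minimal sketch of how the tasklet above could be rewritten as a StoppableTasklet; the volatile flag and the inlined loop are assumptions, not part of the original code:

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.StoppableTasklet;
import org.springframework.batch.repeat.RepeatStatus;

public class TestTasklet implements StoppableTasklet {

    // Flipped when JobOperator.stop(executionId) is called.
    private volatile boolean stopped = false;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        // The long-running work checks the flag so it can exit early.
        for (int i = 0; i < 1000 && !stopped; i++) {
            Thread.sleep(1000);
            System.out.println("Test01 is running");
        }
        return RepeatStatus.FINISHED;
    }

    @Override
    public void stop() {
        // Called by Spring Batch on JobOperator.stop; signal the loop to finish.
        stopped = true;
    }
}

If the loop stays inside the TestBatch service, the same flag (or a thread interrupt) has to be propagated into testDelay(), because the framework only calls stop() on the tasklet itself.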

How to get the Spring Batch job instance id from the execute method in a tasklet

I am using Spring Batch version 3.0.
I create a Job and run the batch by executing the JobLauncher's run method; the job executes a tasklet.
From inside the tasklet, I want to record more reliably whether the Job was executed by inserting the corresponding job id into my own tables (other than the meta tables).
public class SampleScheduler {

    protected final Logger log = LoggerFactory.getLogger(this.getClass());

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job sampleJob;

    public void run() {
        try {
            String dateParam = new Date().toString();
            JobParameters param = new JobParametersBuilder().addString("date", dateParam).toJobParameters();
            JobExecution execution = jobLauncher.run(sampleJob, param);
            log.debug("###################################################################");
            log.debug("Exit Status : " + execution.getStatus());
            log.debug("###################################################################");
        } catch (Exception e) {
            // e.printStackTrace();
            log.error(e.toString());
        }
    }
}
Code for the tasklet that is called:
public class SampleTasklet implements Tasklet {

    @Autowired
    private SampleService sampleService;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        sampleService.query();
        return RepeatStatus.FINISHED;
    }
}
This is the code I tried inside my tasklet:
StepContext stepContext = chunkContext.getStepContext();
StepExecution stepExecution = stepContext.getStepExecution();
JobExecution jobExecution = stepExecution.getJobExecution();
long jobInstanceId = jobExecution.getJobId();
Is it right to try this in the TASKLET code above?
How to get the Spring Batch job instance id from the execute method in a tasklet
The org.springframework.batch.core.step.tasklet.Tasklet#execute method gives you access to the ChunkContext which in turn allows you to get the parent StepExecution and JobExecution. You can then get the job instance id from the job execution.
Is it right to try this in the TASKLET code above?
Yes, that's the way to go.
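For completeness, a minimal sketch of the tasklet above with that lookup inlined (the jobExecutionId variable and the comments are illustrative additions):

import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;

public class SampleTasklet implements Tasklet {

    @Autowired
    private SampleService sampleService;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        JobExecution jobExecution = chunkContext.getStepContext().getStepExecution().getJobExecution();
        long jobInstanceId = jobExecution.getJobId(); // id of the enclosing JobInstance
        long jobExecutionId = jobExecution.getId();   // id of this particular execution
        // Use jobInstanceId (and/or jobExecutionId) in your own insert logic here.
        sampleService.query();
        return RepeatStatus.FINISHED;
    }
}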

Spring Batch Job taking previous execution parameters

I am using Spring Cloud Data Flow and have created a Spring Cloud Task which contains a job. This job has an optional parameter called last_modified_date. In the code I have specified which date to use when last_modified_date is null, that is, when it has not been passed as a parameter. The issue is that if I pass last_modified_date for one instance of the job but not for the next one, the next instance picks up the value from the previous execution instead of treating it as null and deriving it in the code.
@Component
@StepScope
public class SalesforceAdvertiserLoadTasklet implements Tasklet {
@Value("#{jobParameters['last_modified_date']}")
protected Date lastModifiedDate;
private static final Logger logger =
LoggerFactory.getLogger(SalesforceAdvertiserLoadTasklet.class);
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext)
throws Exception {
if(lastModifiedDate == null) {
lastModifiedDate =
Date.from(LocalDate.now().minusDays(1).atStartOfDay(ZoneId.systemDefault()).toInstant());
}
logger.info("In Method: runSalesforceAdvertiserLoadJob launch started on last_modified_date {}",
lastModifiedDate);
logger.info("Getting advertisers from SalesForce");
try {
getAdvertisersFromSalesforceAndAddtoDb();
} catch (JsonSyntaxException | IOException | ParseException e) {
logger.error("ERROR--> {}", e.getMessage());
}
return RepeatStatus.FINISHED;
}
@Bean
public JobParametersIncrementer runIdIncrementor() {
return new RunIdIncrementer();
}
@Bean
public Job salesforceAdvertiserLoadJob() {
return jobBuilderFactory.get(SalesforceJobName.salesforceAdvertiserLoadJob.name())
.incrementer(runIdIncrementor())
.listener(batchJobsExecutionListener)
.start(stepsConfiguration.salesforceAdvertiserLoadStep()).build();
}
Is there a way I can stop the new job instance from taking parameters from the previous job instance?
I think you didn't provide a JobParametersIncrementer to your JobBuilder. Example:
Job job = jobBuilderFactory.get(jobName)
        .incrementer(new RunIdIncrementer())
        .start(step)
        .build();
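Note that the incrementer is only consulted when the next set of parameters is derived from the previous run, for example via JobOperator.startNextInstance or, since Spring Batch 4.0, via JobParametersBuilder.getNextJobParameters. A minimal sketch, assuming a jobExplorer and jobLauncher are injected next to the job bean (the method name launchNextInstance is illustrative):

// Hypothetical helper; jobExplorer and jobLauncher are assumed to be injected.
public void launchNextInstance(Job job) throws Exception {
    JobParameters nextParams = new JobParametersBuilder(jobExplorer)
            .getNextJobParameters(job) // applies the RunIdIncrementer on top of the previous run's parameters
            .toJobParameters();
    jobLauncher.run(job, nextParams);
}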

Method annotated with @BeforeStep not getting called

My goal is to pass some value from a tasklet to another step which consists of an ItemReader, processor, and writer. Based on the reference documentation, I should use ExecutionContextPromotionListener. However, for some reason @BeforeStep is not being called. This is what I have.
Tasklet
@Component
public class RequestTasklet implements Tasklet {

    @Autowired
    private HistoryRepository historyRepository;

    @Override
    public RepeatStatus execute(StepContribution sc, ChunkContext cc) throws Exception {
        List<History> requests = historyRepository.findHistory();
        ExecutionContext stepContext = cc.getStepContext().getStepExecution().getJobExecution().getExecutionContext();
        stepContext.put("someKey", requests);
        return RepeatStatus.FINISHED;
    }
}
ItemReader
@Component
public class RequestReader implements ItemReader<History> {

    private List<History> requests;

    @Override
    public History read() throws UnexpectedInputException,
            ParseException,
            NonTransientResourceException {
        System.out.println("requests====>" + requests);
        if (CollectionUtils.isNotEmpty(requests)) {
            History request = requests.remove(0);
            return request;
        }
        return null;
    }

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        System.out.println("====================here======================");
        JobExecution jobExecution = stepExecution.getJobExecution();
        ExecutionContext jobContext = jobExecution.getExecutionContext();
        this.requests = (List<History>) jobContext.get("someKey");
    }
}
Configuration
@Bean(name = "update-job")
public Job updateUserAttributes(
        JobBuilderFactory jbf,
        StepBuilderFactory sbf,
        ExecutionContextPromotionListener promoListener,
        HistoryProcessor processor,
        RequestReader reader,
        HistoryWriter writer,
        RequestTasklet loadRequestTasklet) {
    Step preStep = sbf.get("load-request-tasklet")
            .tasklet(loadRequestTasklet)
            .listener(promoListener)
            .build();
    Step step = sbf.get("update-step")
            .<History, History>chunk(2)
            .reader(reader)
            .processor(processor)
            .writer(writer)
            .taskExecutor(taskExecutor())
            .build();
    return jbf.get("update-job")
            .incrementer(new RunIdIncrementer())
            .start(preStep)
            .next(step)
            .build();
}

@Bean
public ExecutionContextPromotionListener promoListener() {
    ExecutionContextPromotionListener listener = new ExecutionContextPromotionListener();
    listener.setKeys(new String[] {
            "someKey",
    });
    return listener;
}
I also tried extending StepExecutionListenerSupport in the ItemReader, but got the same result.
I googled around and looked at this question, which was answered as being caused by a Spring proxy, which is not my case: beforestep issue.
A little more information as I am testing: I set listener.setStrict(Boolean.TRUE); to see whether the keys are set, but I am getting
java.lang.IllegalArgumentException: The key [someKey] was not found in the Step's ExecutionContext.
at org.springframework.batch.core.listener.ExecutionContextPromotionListener.afterStep(ExecutionContextPromotionListener.java:61) ~[spring-batch-core-4.0.0.M1.jar:4.0.0.M1]
I appreciate any help.

Spring batch execute dynamically generated steps in a tasklet

I have a Spring Batch job that does the following:
Step 1. Creates a list of objects that need to be processed.
Step 2. Creates a list of steps depending on how many items are in the list of objects created in step 1.
Step 3. Tries to execute the steps from the list of steps created in step 2.
The execution of those steps is done in executeDynamicStepsTasklet() below. While the code runs without any errors, it does not seem to be doing anything. Does what I have in that method look correct?
Thanks
/*
*
*/
@Configuration
public class ExportMasterListCsvJobConfig {
public static final String JOB_NAME = "exportMasterListCsv";
@Autowired
public JobBuilderFactory jobBuilderFactory;
@Autowired
public StepBuilderFactory stepBuilderFactory;
@Value("${exportMasterListCsv.generateMasterListRows.chunkSize}")
public int chunkSize;
@Value("${exportMasterListCsv.generateMasterListRows.masterListSql}")
public String masterListSql;
@Autowired
public DataSource onlineStagingDb;
@Value("${out.dir}")
public String outDir;
@Value("${exportMasterListCsv.generatePromoStartDateEndDateGroupings.promoStartDateEndDateSql}")
private String promoStartDateEndDateSql;
private List<DivisionIdPromoCompStartDtEndDtGrouping> divisionIdPromoCompStartDtEndDtGrouping;
private List<Step> dynamicSteps = Collections.synchronizedList(new ArrayList<Step>()) ;
@Bean
public Job exportMasterListCsvJob(
@Qualifier("createJobDatesStep") Step createJobDatesStep,
@Qualifier("createDynamicStepsStep") Step createDynamicStepsStep,
@Qualifier("executeDynamicStepsStep") Step executeDynamicStepsStep) {
return jobBuilderFactory.get(JOB_NAME)
.flow(createJobDatesStep)
.next(createDynamicStepsStep)
.next(executeDynamicStepsStep)
.end().build();
}
@Bean
public Step executeDynamicStepsStep(
@Qualifier("executeDynamicStepsTasklet") Tasklet executeDynamicStepsTasklet) {
return stepBuilderFactory
.get("executeDynamicStepsStep")
.tasklet(executeDynamicStepsTasklet)
.build();
}
@Bean
public Tasklet executeDynamicStepsTasklet() {
return new Tasklet() {
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
FlowStep flowStep = new FlowStep(createParallelFlow());
SimpleJobBuilder jobBuilder = jobBuilderFactory.get("myNewJob").start(flowStep);
return RepeatStatus.FINISHED;
}
};
}
public Flow createParallelFlow() {
SimpleAsyncTaskExecutor taskExecutor = new SimpleAsyncTaskExecutor();
taskExecutor.setConcurrencyLimit(1);
List<Flow> flows = dynamicSteps.stream()
.map(step -> new FlowBuilder<Flow>("flow_" + step.getName()).start(step).build())
.collect(Collectors.toList());
return new FlowBuilder<SimpleFlow>("parallelStepsFlow")
.split(taskExecutor)
.add(flows.toArray(new Flow[flows.size()]))
.build();
}
@Bean
public Step createDynamicStepsStep(
@Qualifier("createDynamicStepsTasklet") Tasklet createDynamicStepsTasklet) {
return stepBuilderFactory
.get("createDynamicStepsStep")
.tasklet(createDynamicStepsTasklet)
.build();
}
@Bean
@JobScope
public Tasklet createDynamicStepsTasklet() {
return new Tasklet() {
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
for (DivisionIdPromoCompStartDtEndDtGrouping grp: divisionIdPromoCompStartDtEndDtGrouping){
System.err.println("grp: " + grp);
String stepName = "stp_" + grp;
String fileName = grp + FlatFileConstants.EXTENSION_CSV;
Step dynamicStep =
stepBuilderFactory.get(stepName)
.<MasterList,MasterList> chunk(10)
.reader(queryStagingDbReader(
grp.getDivisionId(),
grp.getRpmPromoCompDetailStartDate(),
grp.getRpmPromoCompDetailEndDate()))
.writer(masterListFileWriter(fileName))
.build();
dynamicSteps.add(dynamicStep);
}
System.err.println("createDynamicStepsTasklet dynamicSteps: " + dynamicSteps);
return RepeatStatus.FINISHED;
}
};
}
public FlatFileItemWriter<MasterList> masterListFileWriter(String fileName) {
FlatFileItemWriter<MasterList> writer = new FlatFileItemWriter<>();
writer.setResource(new FileSystemResource(new File(outDir, fileName )));
writer.setHeaderCallback(masterListFlatFileHeaderCallback());
writer.setLineAggregator(masterListFormatterLineAggregator());
return writer;
}
So now I have a list of dynamic steps that need to be executed, and I believe they are in step scope. Can someone advise me on how to execute them?
This will not work. Your tasklet just creates a job with a FlowStep as its first step. Using the JobBuilderFactory only creates the job; it does not launch it. The method name "start" may be misleading, since it only defines the first step; it does not launch the job.
You cannot change the structure of a job (its steps and sub-steps) once it has started. Therefore, it is not possible to configure a FlowStep in step 2 based on things that are calculated in step 1. (Of course you could do some hacking deeper inside the Spring Batch internals and directly modify the beans, but you don't want to do that.)
I suggest that you use a kind of "SetupBean" with an appropriate @PostConstruct method, which is injected into the class that configures your job. This "SetupBean" is responsible for calculating the list of objects to be processed.
@Component
public class SetUpBean {

    private List<Object> myObjects;

    @PostConstruct
    public void afterPropertiesSet() {
        myObjects = ...;
    }

    public List<Object> getMyObjects() {
        return myObjects;
    }
}
@Configuration
public class JobConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private SetUpBean setup;

    ...
}
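As an illustrative continuation (not part of the original answer), a method like the following could live inside the JobConfiguration above and build one step per object at configuration time, assuming setup.getMyObjects() returns a non-empty list and the usual Spring Batch imports (Job, Step, RepeatStatus, SimpleJobBuilder, List, Collectors) are present:

@Bean
public Job dynamicJob() {
    // Build one tasklet step per object; replace the tasklet body with real reader/writer logic.
    List<Step> steps = setup.getMyObjects().stream()
            .map(obj -> stepBuilderFactory.get("step_" + obj)
                    .tasklet((contribution, chunkContext) -> {
                        System.out.println("processing " + obj);
                        return RepeatStatus.FINISHED;
                    })
                    .build())
            .collect(Collectors.toList());

    // Chain the steps sequentially; assumes the list is not empty.
    SimpleJobBuilder builder = jobBuilderFactory.get("dynamicJob").start(steps.get(0));
    steps.stream().skip(1).forEach(builder::next);
    return builder.build();
}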
