Spring Batch: how do I return a custom job exit status from a StepListener to decide the next step

The issue is this: I have a Spring Batch job with multiple steps, and the outcome of step one determines the next steps. Can I set the status in STEP1's passTasklet, based on a job parameter, to a custom exit status, and then reference that status in the job definition file to decide which step to go to next?
Example
<job id="conditionalStepLogicJob">
    <step id="step1">
        <tasklet ref="passTasklet"/>
        <next on="BABY" to="step2a"/>
        <stop on="KID" to="step2b"/>
        <next on="*" to="step3"/>
    </step>
    <step id="step2b">
        <tasklet ref="kidTasklet"/>
    </step>
    <step id="step2a">
        <tasklet ref="babyTasklet"/>
    </step>
    <step id="step3">
        <tasklet ref="babykidTasklet"/>
    </step>
</job>
Ideally I want my own exit status to be used between steps. Can I do that? Will it break any out-of-the-box flow? Is it valid to do?

There are several ways to do this.
You can use a StepExecutionListener and override the afterStep method:
@Override
public ExitStatus afterStep(StepExecution stepExecution) {
    // test your condition here
    return new ExitStatus("CUSTOM EXIT STATUS");
}
Or use a JobExecutionDecider to choose the next step based on a result:
public class CustomDecider implements JobExecutionDecider {
    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        if (/* your condition */) {
            return new FlowExecutionStatus("OK");
        }
        return new FlowExecutionStatus("OTHER CODE HERE");
    }
}
XML config (note the decision id must differ from the decider bean id):
<decision id="decision" decider="decider">
    <next on="OK" to="step1" />
    <next on="OTHER CODE HERE" to="step2" />
</decision>
<bean id="decider" class="com.xxx.CustomDecider"/>
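To make the routing concrete, here is a rough plain-Java sketch of how the transitions in the job definition above resolve a custom exit status (the `FlowRouter` class and its method are invented for illustration; the real resolution is done by Spring Batch's flow engine, not your code):

```java
import java.util.Map;

// Hypothetical sketch: custom exit statuses "BABY" and "KID" select their
// steps; any other status falls through to the "*" wildcard transition.
public class FlowRouter {

    private static final Map<String, String> TRANSITIONS = Map.of(
            "BABY", "step2a",
            "KID", "step2b");

    // Resolve the next step id for a given exit status; "step3" is the
    // "*" fallback from the job definition.
    public static String nextStep(String exitStatus) {
        return TRANSITIONS.getOrDefault(exitStatus, "step3");
    }
}
```

This also answers the "will it break OOTB flow" concern: a custom status that matches no explicit transition simply takes the `*` branch.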

Related

Get jobExecutionContext in xml config spring batch from before step

I am defining my MultiResourceItemReader this way:
<bean id="multiDataItemReader" class="org.springframework.batch.item.file.MultiResourceItemReader" scope="step">
<property name="resources" value="#{jobExecutionContext['filesResource']}"/>
<property name="delegate" ref="dataItemReader"/>
</bean>
As you can see, I want to read the "filesResource" value from the jobExecutionContext.
Note: I changed some names for privacy. The job executes; if somebody wants more info, please tell me.
I am saving this value in my first step and using the reader in the second step. Should I have access to it?
I am saving it in the final lines of my step1 tasklet:
ExecutionContext jobContext = context.getStepContext().getStepExecution().getJobExecution().getExecutionContext();
jobContext.put("filesResource", resourceString);
<batch:job id="myJob">
    <batch:step id="step1" next="step2">
        <batch:tasklet ref="moveFilesFromTasklet" />
    </batch:step>
    <batch:step id="step2">
        <batch:tasklet>
            <batch:chunk commit-interval="500"
                reader="multiDataItemReader"
                processor="dataItemProcessor"
                writer="dataItemWriter" />
        </batch:tasklet>
    </batch:step>
</batch:job>
I am not really sure what I am forgetting to get the value. The error that I am getting is:
20190714 19:49:08.120 WARN org.springframework.batch.item.file.MultiResourceItemReader [[ # ]] - No resources to read. Set strict=true if this should be an error condition.
I see nothing wrong with your config. The value of resourceString should be an array of org.springframework.core.io.Resource as this is the parameter type of the resources attribute of MultiResourceItemReader.
You can pass an array or a list of String with the absolute path to each resource and it should work. Here is a quick example:
class MyTasklet implements Tasklet {
    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        List<String> resources = Arrays.asList(
                "/full/path/to/resource1",
                "/full/path/to/resource2");
        // store the paths in the *job* execution context so a later step can read them
        chunkContext.getStepContext().getStepExecution().getJobExecution().getExecutionContext()
                .put("filesResource", resources);
        return RepeatStatus.FINISHED;
    }
}
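If the file set is not known up front, a common variant is to build that list of absolute paths by scanning a directory in the first step. Here is a minimal, self-contained sketch of just that path-collection part (the class and method names are mine, and the Spring Batch wiring is left out):

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Hypothetical helper: collect the absolute paths of all regular files in a
// directory, i.e. the value the tasklet would store under "filesResource".
public class ResourceLister {

    public static List<String> listAbsolutePaths(Path dir) throws IOException {
        List<String> paths = new ArrayList<>();
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
            for (Path p : stream) {
                if (Files.isRegularFile(p)) {
                    paths.add(p.toAbsolutePath().toString());
                }
            }
        }
        Collections.sort(paths); // deterministic order for the reader
        return paths;
    }
}
```

The resulting `List<String>` is what Spring's conversion service turns into a `Resource[]` when injected via `#{jobExecutionContext['filesResource']}`.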

How to make Execution context in Spring batch Partitioner to run in sequence

I have a requirement where I first have to select a number of master records from a table, and then for each master record fetch a number of child rows, processing and writing them chunk-wise.
To do this I used a Partitioner in Spring Batch and created master and slave steps. The code works fine as long as I don't need to run the slave step in the same sequence in which its execution contexts were added.
But my requirement is to run the slave step for each execution context in the same sequence in which it was added in the partitioner, because until I process a parent record I cannot process its child records.
With the partitioner, the slave step does not run in the same sequence. Please help me: how can I maintain the same sequence for the slave step runs?
Is there any other way to achieve this using Spring Batch? Any help is welcome.
<job id="EPICSDBJob" xmlns="http://www.springframework.org/schema/batch">
    <!-- Create Order Master Start -->
    <step id="populateNewOrdersMasterStep" allow-start-if-complete="false"
            next="populateLineItemMasterStep">
        <partition step="populateNewOrders" partitioner="pdcReadPartitioner">
            <handler grid-size="1" task-executor="taskExecutor" />
        </partition>
        <batch:listeners>
            <batch:listener ref="partitionerStepListner" />
        </batch:listeners>
    </step>
    <!-- Create Order Master End -->
    <listeners>
        <listener ref="epicsPimsJobListner" />
    </listeners>
</job>
<step id="populateNewOrders" xmlns="http://www.springframework.org/schema/batch">
    <tasklet allow-start-if-complete="true">
        <chunk reader="epicsDBReader" processor="epicsPimsProcessor"
                writer="pimsWriter" commit-interval="10">
        </chunk>
    </tasklet>
    <batch:listeners>
        <batch:listener ref="stepJobListner" />
    </batch:listeners>
</step>
<bean id="epicsDBReader" class="com.cat.epics.sf.batch.reader.EPICSDBReader" scope="step">
    <property name="sfObjName" value="#{stepExecutionContext[sfParentObjNm]}" />
    <property name="readChunkCount" value="10" />
    <property name="readerDao" ref="readerDao" />
    <property name="configDao" ref="configDao" />
    <property name="dBReaderService" ref="dBReaderService" />
</bean>
Partitioner method:
@Override
public Map<String, ExecutionContext> partition(int arg0) {
    Map<String, ExecutionContext> result = new LinkedHashMap<String, ExecutionContext>();
    List<String> sfMappingObjectNames = configDao.getSFMappingObjNames();
    int i = 1;
    for (String sfMappingObjectName : sfMappingObjectNames) {
        ExecutionContext value = new ExecutionContext();
        value.putString("sfParentObjNm", sfMappingObjectName);
        result.put("partition:" + i, value);
        i++;
    }
    return result;
}
There isn't a way to guarantee order within Spring Batch's partitioning model. The fact that the partitions are executed in parallel means that, by definition, there will be no ordering to the records processed. I think this is a case where restructuring the job a bit may help.
If your requirement is to execute the parent then execute the children, using a driving query pattern along with the partitioning would work. You'd partition along the parent records (which it looks like you're doing), then in the worker step, you'd use the parent record to drive queries and processing for the children records. That would guarantee that the child records are processed after the master one.
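The driving-query idea above can be sketched in plain Java like this (all names here are invented, and simple maps stand in for the parent/child queries): each partition hands the worker one parent key, and the worker processes the parent first and then that parent's children, so ordering within a parent is guaranteed even though the partitions themselves run in parallel.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical worker for the driving-query pattern: the parent record
// drives the fetch of its children, which are processed after it.
public class ParentChildWorker {

    private final Map<String, List<String>> childrenByParent;

    public ParentChildWorker(Map<String, List<String>> childrenByParent) {
        this.childrenByParent = childrenByParent;
    }

    // Returns the records in processing order: the parent, then its children.
    public List<String> process(String parentKey) {
        List<String> processed = new ArrayList<>();
        processed.add(parentKey);
        processed.addAll(childrenByParent.getOrDefault(parentKey, List.of()));
        return processed;
    }
}
```

In the real job, `process` would be the worker step's logic and `childrenByParent` would be a query keyed by the `sfParentObjNm` value from the step execution context.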

Spring JdbcCursorItemReader in spring batch application

My use case is as follows:
There is an Employee table with the following columns:
employee_id;
empoyee_dob;
employee_lastName;
employee_firstName;
employee_zipCode
Now there is a use case to build a list of employees present in dept 'A' with zipcode 11223, and also employees present in dept 'B' with zipcode 33445.
I have configured a spring job as follows:
<batch:job id="EmployeeDetailsJob" job-repository="EmpDaoRepository">
    <batch:step id="loadEmployeeDetails">
        <batch:tasklet transaction-manager="EmployeeDAOTranManager">
            <batch:chunk reader="EmpDaoJdbcCursorItemReader" writer="EmpDaoWriter" commit-interval="200" skip-limit="100">
                <batch:skippable-exception-classes>
                </batch:skippable-exception-classes>
            </batch:chunk>
            <batch:listeners>
                <batch:listener ref="EmpDaoStepListener" />
            </batch:listeners>
            <batch:transaction-attributes isolation="DEFAULT" propagation="REQUIRED" timeout="300" />
        </batch:tasklet>
    </batch:step>
</batch:job>
The configuration of the reader is as follows:
<bean id="EmpDaoJdbcCursorItemReader" class="EmpDaoJdbcCursorItemReader">
    <property name="dataSource" ref="EmpDataSource" />
    <property name="sql">
        <value><![CDATA[select * from Employee where employee_id=? and employee_zipCode=? ]]></value>
    </property>
    <property name="fetchSize" value="100"></property>
    <property name="rowMapper" ref="EmployeeMapper" />
</bean>
There is a class EmployeeQueryCriteria which has two fields: employee_id and employee_zipCode.
In one of the steps I will create an ArrayList of EmployeeQueryCriteria objects for which the data has to be fetched.
So my questions are:
1. Is there a way I can pass this list to the EmpDaoJdbcCursorItemReader so that it iterates through the objects and sets the parameter values from each EmployeeQueryCriteria object?
2. Can I loop through the step to read data for every item in the ArrayList of EmployeeQueryCriteria and fetch the data?
The class EmpDaoJdbcCursorItemReader:
public class EmpDaoJdbcCursorItemReader extends JdbcCursorItemReader {
    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        StringBuffer sqlQuerySB = new StringBuffer(super.getSql());
        sqlQuerySB.append(" (").append(/* I am adding a comma separated list of employee ids */).append(")");
        super.setSql(sqlQuerySB.toString());
    }
}
My Spring configurations are as follows:
Spring-batch-core 2.2.2
Spring-beans 3.2.3
Spring-context 3.2.3
Can someone please provide suggestions on how to solve this problem.
You can iterate through the steps by following this flow model:
<decision id="testLoop" decider="iterationDecider">
    <next on="CONTINUABLE" to="pqrStep" />
    <end on="FINISHED" />
</decision>
<step id="pqrStep" next="xyzStep">
    <tasklet ref="someTasklet" />
</step>
<step id="xyzStep" next="testLoop">
    <tasklet ref="someOtherTasklet" />
</step>
The configuration is:
<bean id="iterationDecider" class="com.xyz.StepFlowController" />
The following class will handle the flow based on the condition:
public class StepFlowController implements JobExecutionDecider {
    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        FlowExecutionStatus status = null;
        try {
            if (conditionIsTrue) {
                status = new FlowExecutionStatus("CONTINUABLE");
            } else {
                status = new FlowExecutionStatus("FINISHED");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return status;
    }
}
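The decider's looping condition boils down to "are there criteria left to process?". Here is a minimal, self-contained sketch of just that logic (class and method names are invented; in the real decider the criteria list would live in the job execution context rather than a field):

```java
import java.util.Iterator;
import java.util.List;

// Hypothetical sketch of the looping decision: while criteria remain,
// return "CONTINUABLE" so the flow loops back through the read step;
// once the list is exhausted, return "FINISHED" to end the loop.
public class CriteriaIterationDecider {

    private final Iterator<String> remaining;

    public CriteriaIterationDecider(List<String> criteria) {
        this.remaining = criteria.iterator();
    }

    // Consume one criterion per pass and report whether the loop continues.
    public String decide() {
        if (remaining.hasNext()) {
            remaining.next();
            return "CONTINUABLE";
        }
        return "FINISHED";
    }
}
```

Each pass through the loop would also stash the consumed criterion where the reader can pick it up (for example, the job execution context).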

Terminate Spring Batch if validation(violation count) exceed some predefined count in Item Processor

Is there any way in Spring Batch to stop the batch from proceeding further if the validation count exceeds a predefined count?
I am currently throwing a validation exception from the item processor, but it just skips the item (it is not sent to the item writer) instead of stopping the batch.
Thanks
If you are using the fluent API (StepBuilder), you can set it as follows:
@Bean
public Step step() {
    return stepBuilderFactory.get("step")
            .<..., ...>chunk(1)
            .reader(reader())
            .processor(processor())
            .writer(writer())
            .faultTolerant()
            .skipLimit(10)
            .build();
}
This will instantiate a LimitCheckingItemSkipPolicy. Once the skip limit is exceeded, the step fails, which stops the batch.
You can also instantiate or define your own SkipPolicy using the skipPolicy method:
...
.faultTolerant()
.skipPolicy(mySkipPolicy)
...
If you want to set the skip limit in an XML configuration:
<step id="step1">
    <tasklet>
        <chunk reader="flatFileItemReader" writer="itemWriter"
                commit-interval="10"
                skip-limit="10">
            <skippable-exception-classes>
                <include class="..."/>
            </skippable-exception-classes>
        </chunk>
    </tasklet>
</step>
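The behaviour behind a skip limit can be sketched in a few lines of plain Java (the class name is mine; this is only an illustration of the counting semantics, not Spring Batch's actual SkipPolicy implementation): each invalid item increments a counter, and once the counter exceeds the limit an exception propagates, which is what fails the step and stops the batch.

```java
// Hypothetical sketch of skip-limit counting: tolerate up to `limit`
// validation failures, then fail hard.
public class ValidationCounter {

    private final int limit;
    private int errors = 0;

    public ValidationCounter(int limit) {
        this.limit = limit;
    }

    // Record one invalid item; throws once more than `limit` items have failed.
    public void recordInvalid() {
        errors++;
        if (errors > limit) {
            throw new IllegalStateException("validation error count exceeded " + limit);
        }
    }
}
```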

how to control repeat and next step execution in spring batch

Hi, I have XML like the below for executing a job:
<batch:job id="Job1" restartable="false" xmlns="http://www.springframework.org/schema/batch">
    <step id="step1" next="step2">
        <tasklet ref="automate" />
    </step>
    <step id="step2">
        <tasklet ref="drive" />
        <next on="COMPLETED" to="step3"></next>
    </step>
    <step id="step3">
        <tasklet ref="generate_file" />
    </step>
</batch:job>
For this I have written a tasklet that executes a script. Now I want the next step not to execute if the script execution fails three times. But from a tasklet I can only return FINISHED, which moves the flow to the next step, or CONTINUABLE, which continues the process. What should I do?
You can write your own decider to decide whether to go to the next step or to end the job.
If you are able to detect the failures, you can also direct the flow of the job:
<decision id="validationDecision" decider="validationDecider">
    <next on="FAILED" to="abcStep" />
    <next on="COMPLETE" to="xyzstep" />
</decision>
The configuration is:
<bean id="validationDecider" class="com.xyz.StepFlowController" />
The class is:
public class StepFlowController implements JobExecutionDecider {
    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        FlowExecutionStatus status = null;
        try {
            if (failure) {
                status = new FlowExecutionStatus("FAILED");
            } else {
                status = new FlowExecutionStatus("COMPLETE");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return status;
    }
}
This can also be achieved by specifying a custom "chunk-completion-policy" on that step and counting the number of failures. Take a look at "Stopping a Job Manually for Business Reasons" and this example of a custom chunk completion policy. Hope this helps.
EDIT: you can put the number of failures into the execution context in your step logic and then retrieve it in your completion policy class:
stepExecution.getJobExecution().getExecutionContext().put("ERROR_COUNT", noOfErrors);
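The decider side of that counter can be sketched in plain Java (all names here are invented, and a plain map stands in for the job ExecutionContext): read the stored "ERROR_COUNT" and return "FAILED" after three failures, so the flow routes away from the next step.

```java
import java.util.Map;

// Hypothetical sketch: decide the flow status from the failure counter
// that the tasklet stored in the (here simulated) execution context.
public class RetryDecision {

    static final int MAX_FAILURES = 3;

    public static String decide(Map<String, Object> executionContext) {
        int failures = (Integer) executionContext.getOrDefault("ERROR_COUNT", 0);
        return failures >= MAX_FAILURES ? "FAILED" : "COMPLETE";
    }
}
```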
