Get jobExecutionContext in XML config Spring Batch from a previous step

I am defining my MultiResourceItemReader this way:
<bean id="multiDataItemReader" class="org.springframework.batch.item.file.MultiResourceItemReader" scope="step">
<property name="resources" value="#{jobExecutionContext['filesResource']}"/>
<property name="delegate" ref="dataItemReader"/>
</bean>
As you can see, I want to read the "filesResource" value from the jobExecutionContext.
Note: I changed some names to keep the code private. This does execute; if somebody wants more info, please tell me.
I am saving this value in my first step and using the reader in the second step. Should I have access to it?
I am saving it in the final lines of my step1 tasklet:
ExecutionContext jobContext = context.getStepContext().getStepExecution().getJobExecution().getExecutionContext();
jobContext.put("filesResource", resourceString);
<batch:job id="myJob">
<batch:step id="step1" next="step2">
<batch:tasklet ref="moveFilesFromTasklet" />
</batch:step>
<batch:step id="step2">
<tasklet>
<chunk commit-interval="500"
reader="multiDataItemReader"
processor="dataItemProcessor"
writer="dataItemWriter" />
</tasklet>
</batch:step>
</batch:job>
I am not really sure what I am missing to get the value. The error that I am getting is:
20190714 19:49:08.120 WARN org.springframework.batch.item.file.MultiResourceItemReader [[ # ]] - No resources to read. Set strict=true if this should be an error condition.

I see nothing wrong with your config. The value of resourceString should be an array of org.springframework.core.io.Resource as this is the parameter type of the resources attribute of MultiResourceItemReader.
You can pass an array or a list of String with the absolute path to each resource and it should work. Here is a quick example:
class MyTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        List<String> resources = Arrays.asList(
                "/full/path/to/resource1",
                "/full/path/to/resource2");
        chunkContext.getStepContext().getStepExecution().getJobExecution().getExecutionContext()
                .put("filesResource", resources);
        return RepeatStatus.FINISHED;
    }
}
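Since the first step is a tasklet that moves files (moveFilesFromTasklet in the job above), the list of paths can also be built from the target directory. A minimal sketch, where the directory path and the idea of listing it are assumptions, not code from the question:

import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

class MoveFilesTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // Assumption: the files the second step should read end up in this directory.
        File[] movedFiles = new File("/full/path/to/output/dir").listFiles();

        List<String> resources = new ArrayList<>();
        if (movedFiles != null) {
            for (File file : movedFiles) {
                resources.add(file.getAbsolutePath());
            }
        }

        // Same key the step-scoped reader resolves via #{jobExecutionContext['filesResource']}.
        chunkContext.getStepContext().getStepExecution().getJobExecution()
                .getExecutionContext().put("filesResource", resources);
        return RepeatStatus.FINISHED;
    }
}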

Related

How to skip reader, writer in spring batch

I have a requirement where I need to upload some files to a server. I am using Spring Batch to accomplish this. Here the "initializeFile" step will basically interact with the server to check if the files already exist there. If not, it should call the "uploadIndexFileStep" step to upload the files. If the files are already present on the server, the "uploadIndexFileStep" step SHOULD NOT be called.
How do I implement this case, where if "initializeFile" has no files to upload, Spring should not call the next step "uploadIndexFileStep"?
Is there a way to do this, do I need to follow some design, or is it just a Spring config change? Any pointers would be helpful.
Following is the batch configuration:
<batch:step id="initFileStep" next="uploadIndexFileStep">
<batch:tasklet ref="initializeFile"></batch:tasklet>
</batch:step>
<batch:step id="uploadIndexFileStep">
<batch:tasklet>
<batch:chunk reader="indexFileReader" processor="indexFileProcessor" writer="indexFileWriter" commit-interval="${app.chunk.commit.interval}"/>
</batch:tasklet>
</batch:step>
<batch:listeners>
<batch:listener ref="uploadIndexJobListener"/>
</batch:listeners>
</batch:job>
Spring Batch provides a nice way to handle conditional flow. You can implement this with transitions on the step's exit status.
You can have something like the below:
@Bean
public Job job() {
    return jobBuilderFactory().get("job")
            .flow(initializeFile()).on("FILELOADED").to(anyStep())
            .from(initializeFile()).on("FILENOTLOADED").to(uploadIndexFileStep())
            .next(step3()).next(step4())
            .end().build();
}
5.3.2 Conditional Flow
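For the on("FILELOADED") / on("FILENOTLOADED") transitions above to match anything, the initializeFile step has to finish with those custom exit statuses. A minimal sketch of one way to do that from the tasklet itself; the server check is a placeholder, not code from the question:

import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

public class InitializeFileTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // Placeholder for the real check against the server.
        boolean filesAlreadyOnServer = checkServerForFiles();

        // The custom exit code is what the flow's .on("...") transitions are matched against.
        contribution.setExitStatus(new ExitStatus(filesAlreadyOnServer ? "FILELOADED" : "FILENOTLOADED"));
        return RepeatStatus.FINISHED;
    }

    private boolean checkServerForFiles() {
        return false; // stand-in for the actual server interaction
    }
}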
I resolved this using JobExecutionDecider. I am maintaining the queue size in ExecutionContext and then reading this execution context in decider to manage the flow.
public class UploadIndexFlowDecider implements JobExecutionDecider {

    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        int queueSize = jobExecution.getExecutionContext().getInt("INDEX_UPLOAD_QUEUE_SIZE");
        if (queueSize > 0)
            return FlowExecutionStatus.COMPLETED;
        else
            return FlowExecutionStatus.STOPPED;
    }
}

@Component
public class InitializeFileStep implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        chunkContext.getStepContext().getStepExecution().getJobExecution().getExecutionContext()
                .putInt("INDEX_UPLOAD_QUEUE_SIZE", 1);
        return RepeatStatus.FINISHED;
    }
}
<batch:job id="uploadIndexFileJob">
<batch:step id="initFileStep" next="uploadDecision">
<batch:tasklet ref="initializeFile"></batch:tasklet>
</batch:step>
<batch:decision id="uploadDecision" decider="uploadIndexDecision">
<batch:next on="COMPLETED" to="uploadIndexFileStep"/>
<batch:end on="STOPPED"/>
</batch:decision>
<batch:step id="uploadIndexFileStep">
<batch:tasklet>
<batch:chunk reader="indexFileReader" processor="indexFileProcessor" writer="indexFileWriter" commit-interval="${app.chunk.commit.interval}"/>
</batch:tasklet>
</batch:step>
<batch:listeners>
<batch:listener ref="uploadIndexJobListener"/>
</batch:listeners>
</batch:job>
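For readers on Java config, roughly the same decision wiring can be sketched as below; the jobBuilderFactory field and the step/decider bean methods are assumed to exist, so this is an illustration rather than a drop-in replacement for the XML:

@Bean
public Job uploadIndexFileJob() {
    return jobBuilderFactory.get("uploadIndexFileJob")
            .start(initFileStep())
            .next(uploadIndexDecision())
            .on("COMPLETED").to(uploadIndexFileStep())
            .from(uploadIndexDecision()).on("STOPPED").end()
            .end()
            .build();
}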

Spring JdbcCursorItemReader in spring batch application

My use case is as follows:
There is an Employee table and columns are as follows:
employee_id;
employee_dob;
employee_lastName;
employee_firstName;
employee_zipCode
Now there is a use case to build a list of employees present in Dept 'A' and zipcode 11223 and also employees present in Dept 'B' and zipcode 33445.
I have configured a spring job as follows:
<batch:job id="EmployeeDetailsJob" job-repository="EmpDaoRepository">
<batch:step id="loadEmployeeDetails" >
<batch:tasklet transaction-manager="EmployeeDAOTranManager">
<batch:chunk reader="EmpDaoJdbcCursorItemReader" writer="EmpDaoWriter" commit-interval="200" skip-limit="100">
<batch:skippable-exception-classes>
</batch:skippable-exception-classes>
</batch:chunk>
<batch:listeners>
<batch:listener ref="EmpDaoStepListener" />
</batch:listeners>
<batch:transaction-attributes isolation="DEFAULT" propagation="REQUIRED" timeout="300" />
</batch:tasklet>
</batch:step>
</batch:job>
The configuration of reader is as follows:
<bean id="EmpDaoJdbcCursorItemReader" class="EmpDaoJdbcCursorItemReader">
<property name="dataSource" ref="EmpDataSource" />
<property name="sql">
<value><![CDATA[select * from Employee where employee_id=? and employee_zipCode=? ]]>
</value>
</property>
<property name="fetchSize" value="100"></property>
<property name="rowMapper" ref="EmployeeMapper" />
</bean>
There is a class EmployeeQueryCriteria which has two fields: employee_id and employee_zipCode.
In one of the steps I will create an ArrayList of EmployeeQueryCriteria objects for which the data has to be fetched.
So my questions are:
1. Is there a way I can pass this list to the EmpDaoJdbcCursorItemReader so that it iterates through the objects and sets the parameter values from each EmployeeQueryCriteria object?
2. Can I loop through the step to read the data for every item in the ArrayList of EmployeeQueryCriteria objects?
The class EmpDaoJdbcCursorItemReader:
public class EmpDaoJdbcCursorItemReader extends JdbcCursorItemReader {

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        StringBuffer sqlQuerySB = new StringBuffer(super.getSql());
        sqlQuerySB.append(" (").append(/* I am adding a comma separated list of employee ids */).append(")");
        super.setSql(sqlQuerySB.toString());
    }
}
My Spring configurations are as follows:
Spring-batch-core 2.2.2
Spring-beans 3.2.3
Spring-context 3.2.3
Can someone please provide suggestions on how to solve this problem?
You can iterate through the steps by following this code model:
<decision id="testLoop" decider="iterationDecider">
<next on="CONTINUABLE" to="pqrStep" />
<end on="FINISHED" />
</decision>
<step id="pqrStep" next="xyzStep">
<tasklet ref="someTasklet" />
</step>
<step id="xyzStep" next="testLoop">
<tasklet ref="someOtherTasklet" />
</step>
and the configuration is:
<bean id="iterationDecider" class="com.xyz.StepFlowController" />
The following class will handle the flow based on the condition:
public class StepFlowController implements JobExecutionDecider {

    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        FlowExecutionStatus status = null;
        try {
            if (conditionIsTrue) {
                status = new FlowExecutionStatus("CONTINUABLE");
            } else {
                status = new FlowExecutionStatus("FINISHED");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return status;
    }
}
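The conditionIsTrue flag above is whatever drives the loop. A minimal sketch, assuming the remaining count of EmployeeQueryCriteria entries is kept in the job ExecutionContext by an earlier step (the key name and class name here are made up for the example):

import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.job.flow.FlowExecutionStatus;
import org.springframework.batch.core.job.flow.JobExecutionDecider;

public class IterationDecider implements JobExecutionDecider {

    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        // Assumption: an earlier step stores how many EmployeeQueryCriteria entries are left to process.
        int remaining = jobExecution.getExecutionContext().getInt("REMAINING_CRITERIA", 0);
        return remaining > 0
                ? new FlowExecutionStatus("CONTINUABLE")
                : new FlowExecutionStatus("FINISHED");
    }
}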

How to call a stored procedure from a Spring Batch Tasklet?

It is mentioned that a TaskletStep in Spring Batch can be used to call a stored procedure. Could anyone provide an example of how to invoke a Stored Procedure from a TaskletStep? So far I have done this but it throws an exception saying "Configuration problem: The element [callStoredProcedure] is unreachable"
<job id="job1">
<step id="step1">
<tasklet ref="myTasklet"/>
</step>
</job>
<bean id="myTasklet" class="MyClass">
<property name="dataSource" ref="dataSource"/>
<property name="sql" value="call stored_procedure()"/>
</bean>
Java Class
class MyClass implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution,
            ChunkContext chunkContext) throws Exception {
        JdbcTemplate myJDBC = new JdbcTemplate(getDataSource());
        myJDBC.execute(sql);
        return RepeatStatus.FINISHED;
    }
}
How and where should the stored procedure be configured? I would be grateful for any pointers.
Instead of
value="call stored_procedure()"
just put
value="stored_procedure"
without the () on the end. That should resolve your issue.
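For reference, a minimal version of the tasklet with the dataSource and sql properties that the bean definition above injects; this is a sketch built from the question's own snippet, with the setters added so the XML property wiring has something to call:

import javax.sql.DataSource;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.jdbc.core.JdbcTemplate;

public class MyClass implements Tasklet {

    private DataSource dataSource;
    private String sql;

    // Setters used by the <property> elements in the bean definition.
    public void setDataSource(DataSource dataSource) { this.dataSource = dataSource; }
    public void setSql(String sql) { this.sql = sql; }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
        // sql comes from the bean definition; the exact call syntax depends on your database/driver.
        jdbcTemplate.execute(sql);
        return RepeatStatus.FINISHED;
    }
}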

How to write a spring batch step without an itemwriter

I am trying to configure a Spring Batch step without an item writer using the configuration below. However, I get an error saying that the chunk element has neither a 'writer' attribute nor a 'writer' element.
I went through the link spring batch : Tasklet without ItemWriter, but could not resolve the issue.
Could anyone tell me the specific changes to be made in the code snippet I mentioned?
<batch:job id="helloWorldJob">
<batch:step id="step1">
<batch:tasklet>
<batch:chunk reader="cvsFileItemReader"
commit-interval="10">
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
<bean id="cvsFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
<property name="resource" value="classpath:cvs/input/report.csv" />
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="lineTokenizer">
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<property name="names" value="id,sales,qty,staffName,date" />
</bean>
</property>
<property name="fieldSetMapper">
<bean class="com.mkyong.ReportFieldSetMapper" />
<!-- if no data type conversion, use BeanWrapperFieldSetMapper to map by name
<bean
class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
<property name="prototypeBeanName" value="report" />
</bean>
-->
</property>
</bean>
</property>
</bean>
For a chunk-based step, reader and writer are mandatory.
If you don't want a writer, use a no-operation ItemWriter that does nothing.
EDIT:
A no-op implementation is an empty implementation of the interface that does... nothing!
Just let your class implement the desired interface(s) with empty methods.
No-op ItemWriter:
public class NoOpItemWriter<T> implements ItemWriter<T> {

    @Override
    public void write(java.util.List<? extends T> items) throws Exception {
        // no-op
    }
}
I hope you already got your answer, but I want to explain it for other readers. When we use a chunk, we usually declare a reader, processor and writer. In a chunk, the reader and writer are mandatory and the processor is optional. In your case, if you don't need a writer, you need to make a class which implements ItemWriter, override the write method and keep it blank, then create a bean of the writer class and pass it as the writer reference.
<batch:step id="recordProcessingStep" >
<batch:tasklet>
<batch:chunk reader="fileReader" processor="recordProcessor"
writer="recordWriter" commit-interval="1" />
</batch:tasklet>
</batch:step>
Your writer class will look like this:
public class RecordWriter<T> implements ItemWriter<T> {

    @Override
    public void write(List<? extends T> items) throws Exception {
        // intentionally left empty: nothing is written
    }
}
In the Maven repository you can find the "spring-batch-samples" project.
In it you will find this writer:
org.springframework.batch.sample.support.DummyItemWriter
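If you are on Java 8+ with Java-based configuration, the same idea can be expressed inline as a lambda, since ItemWriter has a single abstract method. A sketch, independent of the XML configuration above:

import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class NoOpWriterConfiguration {

    @Bean
    public ItemWriter<Object> noOpItemWriter() {
        // Discards each chunk without writing anything.
        return items -> { };
    }
}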

Mockito Spy SystemCommandTasklet in a Spring Batch Step

I am re-factoring a troublesome test I have inherited for a Spring Batch job that calls an external script to perform a task. The original version of the test substituted the real script with a simple move file script and started the job, then called that script and tested that a file was moved correctly (which served to verify that the script was called with the correct parameters).
My goal for the new version is to eliminate the need to call a real script for this test, instead stubbing and verifying the execution method of the tasklet being used to call the script, relying on Mockito's verify to ensure that the correct parameters are used.
The configuration of the job is as follows:
<flow id="job-flow">
<step id="preprocess" parent="preprocess-base" next="process">
<tasklet>
<beans:bean class="com.company.project.main.package.PreprocessTasklet">
<beans:property name="doMove" value="false" />
</beans:bean>
</tasklet>
</step>
<step id="process" next="postprocess">
<tasklet ref="commandTasklet" />
</step>
<step id="postprocess" parent="postprocess-base" />
</flow>
<bean id="commandTasklet" class="org.springframework.batch.core.step.tasklet.SystemCommandTasklet" scope="step">
<property name="command" value="${a.dir}/job_script.sh" />
<property name="environmentParams" value="working_dir=#{jobExecutionContext['job.dir']}" />
<property name="workingDirectory" value="${b.dir}" />
<property name="timeout" value="3600000"/>
</bean>
<batch:job id="run-my-script" parent="base-job" incrementer="defaultIncrementer">
<batch:flow id="script-job" parent="job-flow" />
</batch:job>
In order to prevent commandTasklet from invoking the actual shell script, I use a BeanPostProcessor to replace it with a spy and stub the execute method.
public class SpiedCommandTaskletPostProcessor implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        if (SystemCommandTasklet.class.isAssignableFrom(bean.getClass())) {
            SystemCommandTasklet spiedTasklet = Mockito.spy((SystemCommandTasklet) bean);
            try {
                Mockito.doReturn(RepeatStatus.FINISHED).when(spiedTasklet)
                        .execute(Mockito.any(StepContribution.class), Mockito.any(ChunkContext.class));
            } catch (Exception e) {
                e.printStackTrace();
            }
            bean = spiedTasklet;
        }
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        return bean;
    }
}
Configuration for the test:
<bean class="com.company.test.package.postprocessor.SpiedCommandTaskletPostProcessor" />
This part works like a charm. However, this leaves me with two questions. Firstly, by the time the PostProcessor runs, the properties (i.e. the parameters of the SystemCommandTasklet I'm trying to verify) have presumably already been set; would I even be able to use Mockito.verify(...) to check for a desired value?
Secondly, since the SystemCommandTasklet is step-scoped, I can't use autowire to get a hold of it in my test, and trying to look it up in the ApplicationContext just creates a new (and real) instance of the Tasklet outside of the step. How can I access a step scoped tasklet like this in the context of a JUnit test?
@Test
public void testLaunch() throws Exception {
    JobExecution jobExecution = getJobLauncher().launchJob(jobParms);
    // Mockito.verify(spiedTasklet).execute(Mockito.any(StepContribution.class), Mockito.any(ChunkContext.class)); spiedTasklet not wired to anything. :(
    assertEquals(ExitStatus.COMPLETED, jobExecution.getExitStatus());
}
Is there a better approach that accomplishes testing that the script is executed with the correct parameters without actually running a script?
