Spring JdbcCursorItemReader in a Spring Batch application

My use case is as follows:
There is an Employee table and columns are as follows:
employee_id;
employee_dob;
employee_lastName;
employee_firstName;
employee_zipCode
Now there is a use case to build a list of employees in Dept 'A' with zip code 11223, and also employees in Dept 'B' with zip code 33445.
I have configured a Spring job as follows:
<batch:job id="EmployeeDetailsJob" job-repository="EmpDaoRepository">
<batch:step id="loadEmployeeDetails" >
<batch:tasklet transaction-manager="EmployeeDAOTranManager">
<batch:chunk reader="EmpDaoJdbcCursorItemReader" writer="EmpDaoWriter" commit-interval="200" skip-limit="100">
<batch:skippable-exception-classes>
</batch:skippable-exception-classes>
</batch:chunk>
<batch:listeners>
<batch:listener ref="EmpDaoStepListener" />
</batch:listeners>
<batch:transaction-attributes isolation="DEFAULT" propagation="REQUIRED" timeout="300" />
</batch:tasklet>
</batch:step>
</batch:job>
The configuration of the reader is as follows:
<bean id="EmpDaoJdbcCursorItemReader" class="EmpDaoJdbcCursorItemReader">
<property name="dataSource" ref="EmpDataSource" />
<property name="sql">
<value><![CDATA[select * from Employee where employee_id=? and employee_zipCode=? ]]>
</value>
</property>
<property name="fetchSize" value="100"></property>
<property name="rowMapper" ref="EmployeeMapper" />
</bean>
There is a class EmployeeQueryCriteria which has two fields: employee_id and employee_zipCode.
In one of the steps I will create an ArrayList of EmployeeQueryCriteria objects for which the data has to be fetched.
So my questions are:
1. Is there a way I can pass this list to the EmpDaoJdbcCursorItemReader so that it iterates through the objects and sets the parameter values from each EmployeeQueryCriteria object?
2. Can I loop through the step to read data for every item in the ArrayList of EmployeeQueryCriteria objects?
The class EmpDaoJdbcCursorItemReader:
public class EmpDaoJdbcCursorItemReader extends JdbcCursorItemReader {
    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        StringBuffer sqlQuerySB = new StringBuffer(super.getSql());
        sqlQuerySB.append(" (")
                  .append(employeeIdCsv) // placeholder: a comma-separated list of employee ids
                  .append(")");
        super.setSql(sqlQuerySB.toString());
    }
}
My Spring configurations are as follows:
Spring-batch-core 2.2.2
Spring-beans 3.2.3
Spring-context 3.2.3
Can someone please provide suggestions on how to solve this problem?
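One option for question 1: instead of rebuilding the SQL string, JdbcCursorItemReader can bind the two ? placeholders through its preparedStatementSetter property. A minimal sketch, assuming one EmployeeQueryCriteria is resolved per step execution (the getter names and column types are assumptions, not taken from the original code):
import java.sql.PreparedStatement;
import java.sql.SQLException;
import org.springframework.jdbc.core.PreparedStatementSetter;

public class EmployeeCriteriaStatementSetter implements PreparedStatementSetter {

    private EmployeeQueryCriteria criteria; // the criteria for the current execution

    public void setCriteria(EmployeeQueryCriteria criteria) {
        this.criteria = criteria;
    }

    @Override
    public void setValues(PreparedStatement ps) throws SQLException {
        ps.setString(1, criteria.getEmployeeId()); // binds employee_id = ?
        ps.setString(2, criteria.getZipCode());    // binds employee_zipCode = ?
    }
}
The setter is then wired into the reader with <property name="preparedStatementSetter" ref="employeeCriteriaStatementSetter" />.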

You can iterate through the steps with the following flow model:
<decision id="testLoop" decider="iterationDecider">
<next on="CONTINUABLE" to="pqrStep" />
<end on="FINISHED" />
</decision>
<step id="pqrStep" next="xyzStep">
<tasklet ref="someTasklet" />
</step>
<step id="xyzStep" next="testLoop">
<tasklet ref="someOtherTasklet" />
</step>
and the configuration is:
<bean id="iterationDecider" class="com.xyz.StepFlowController" />
The following class will handle the flow based on the condition:
public class StepFlowController implements JobExecutionDecider {
    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        FlowExecutionStatus status = null;
        try {
            if (conditionIsTrue) {
                status = new FlowExecutionStatus("CONTINUABLE");
            } else {
                status = new FlowExecutionStatus("FINISHED");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return status;
    }
}
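To make this loop drive the reader for question 2, one approach is to keep the criteria list and a cursor in the job execution context: the decider advances the cursor on each pass, and the reader's @BeforeStep picks up the current EmployeeQueryCriteria. A sketch under that assumption (the context keys are illustrative, not from the original code):
import java.util.List;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.job.flow.FlowExecutionStatus;
import org.springframework.batch.core.job.flow.JobExecutionDecider;
import org.springframework.batch.item.ExecutionContext;

public class CriteriaIterationDecider implements JobExecutionDecider {

    @Override
    @SuppressWarnings("unchecked")
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        ExecutionContext ctx = jobExecution.getExecutionContext();
        // assumed to have been stored in the job execution context by an earlier step
        List<EmployeeQueryCriteria> criteriaList =
                (List<EmployeeQueryCriteria>) ctx.get("criteriaList");
        int index = ctx.getInt("criteriaIndex", 0);
        if (criteriaList != null && index < criteriaList.size()) {
            ctx.put("currentCriteria", criteriaList.get(index)); // consumed by the reader
            ctx.putInt("criteriaIndex", index + 1);
            return new FlowExecutionStatus("CONTINUABLE");
        }
        return new FlowExecutionStatus("FINISHED");
    }
}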

Related

Get jobExecutionContext in Spring Batch XML config from a previous step

I am defining my MultiResourceItemReader this way:
<bean id="multiDataItemReader" class="org.springframework.batch.item.file.MultiResourceItemReader" scope="step">
<property name="resources" value="#{jobExecutionContext['filesResource']}"/>
<property name="delegate" ref="dataItemReader"/>
</bean>
As you can see, I want to read the "filesResource" value from the jobExecutionContext.
Note: I changed some names to keep "code privacy". This is executing; if somebody wants more info, please tell me.
I am saving this value in my first step and using the reader in the second step. Should I have access to it?
I am saving it in the final lines of my step1 tasklet:
ExecutionContext jobContext = context.getStepContext().getStepExecution().getJobExecution().getExecutionContext();
jobContext.put("filesResource", resourceString);
<batch:job id="myJob">
<batch:step id="step1" next="step2">
<batch:tasklet ref="moveFilesFromTasklet" />
</batch:step>
<batch:step id="step2">
<tasklet>
<chunk commit-interval="500"
reader="multiDataItemReader"
processor="dataItemProcessor"
writer="dataItemWriter" />
</tasklet>
</batch:step>
</batch:job>
I am not really sure what I am missing to get the value. The error that I am getting is:
20190714 19:49:08.120 WARN org.springframework.batch.item.file.MultiResourceItemReader [[ # ]] - No resources to read. Set strict=true if this should be an error condition.
I see nothing wrong with your config. The value of resourceString should be an array of org.springframework.core.io.Resource as this is the parameter type of the resources attribute of MultiResourceItemReader.
You can pass an array or a list of String with the absolute path to each resource and it should work. Here is a quick example:
class MyTasklet implements Tasklet {
    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        List<String> resources = Arrays.asList(
                "/full/path/to/resource1",
                "/full/path/to/resource2");
        chunkContext.getStepContext().getStepExecution().getJobExecution().getExecutionContext()
                .put("filesResource", resources);
        return RepeatStatus.FINISHED;
    }
}
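Another option is Spring Batch's ExecutionContextPromotionListener: the tasklet writes the value into the step execution context, and the listener promotes the listed keys to the job execution context when the step ends. A sketch against the config above (the listener bean id is arbitrary):
<bean id="promotionListener" class="org.springframework.batch.core.listener.ExecutionContextPromotionListener">
<property name="keys" value="filesResource"/>
</bean>
<batch:step id="step1" next="step2">
<batch:tasklet ref="moveFilesFromTasklet">
<batch:listeners>
<batch:listener ref="promotionListener"/>
</batch:listeners>
</batch:tasklet>
</batch:step>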

Spring Batch - How to process multiple records at the same time in the processor?

I have a file to parse and process records from. It is working fine line by line (parsing one record at a time). My requirement is to parse through multiple lines, fetch the required information from each record, and then, after combining the fetched info from all the records, call a service to perform business logic. I have to perform this logic inside my Processor class. The data looks like the example below:
001 123456 987654321551580 Wayne DR 1
001 123456 987654321552APT 786 1
001 123456 987654321553LOS ANGELES 1
001 123456 987654321554CA 1
001 123456 98765432155590001 1
The data element at columns 30-32 is what I am interested in fetching from each record; in the example above these are the values 551, 552, 553, 554 and 555. They all come in together in the file. So basically, when the current item in my processor parses the first line and finds that it is '551' (which means Address Line 1 in business terms), I want to fetch the rest of the address that follows this line and save it all as one complete address. At the end I want to pass this address from the processor to the service class and then move on to parse the next record in the file. My problem is that the processor works line by line on each record, so I am not able to keep track of the association between these related lines.
Sorry if I am not able to explain my problem clearly; I am new to Spring Batch and still learning.
If you know the associated data records will be next to one another in the file (as opposed to spread out randomly), you can leverage the SingleItemPeekableItemReader to associate multiple lines to create one complete object. This older answer has a bit more info.
Example Context File:
<bean id="peekingReader" class="com.package.whatever.YourPeekingReader">
<property name="delegate" ref="flatFileItemReader"/>
</bean>
<bean id="flatFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
<property name="resource" value="file://temp/file.txt" />
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="lineTokenizer" ref="yourTokenizer"/>
<property name="fieldSetMapper" ref="yourMapper"/>
</bean>
</property>
</bean>
Example Peeking Reader:
public class YourPeekingReader extends SingleItemPeekableItemReader<YourObject> {
    @Override
    public YourObject read() throws Exception {
        YourObject item = super.read();
        if (item == null) {
            return null;
        }
        while (true) {
            YourObject possibleRelatedObject = peek();
            if (possibleRelatedObject == null) {
                return item;
            }
            // logic to determine if next line in file relates to same object
            boolean matches = false;
            if (matches) {
                item.addRelatedInfo(super.read());
            } else {
                return item;
            }
        }
    }
}
@Dean, thanks again. To be more precise with my code, here it is:
Customer-record-reader.xml
<batch:job id="myFileReaderJob">
<batch:step id="stepA" next="stepSuccess">
<batch:tasklet>
<batch:chunk reader="myInputReader" processor="myProcessor" writer="myWriter" commit-interval="1"/>
</batch:tasklet>
</batch:step>
<batch:step id="stepSuccess">
<batch:tasklet ref="successTasklet" />
</batch:step>
</batch:job>
<bean id="myInputReader" scope="step" class="org.springframework.batch.item.file.FlatFileItemReader">
<property name="lineMapper" ref="myLineMapper" />
</bean>
<bean id="myLineMapper" class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="lineTokenizer">
<bean id="fixedLengthLineTokenizer" class="org.springframework.batch.item.file.transform.FixedLengthTokenizer">
<property name="names" value="custRecord,tranId,partyId,uniquePartyId,deNum,deVal" />
<property name="columns" value="1-75,1-3,6-11,21-29,30-32,33-62" />
<property name="strict" value="false" />
</bean>
</property>
<property name="fieldSetMapper">
<bean class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
<property name="prototypeBeanName" value="myInputData" />
</bean>
</property>
</bean>
As you can see, I am not using a custom implementation of ItemReader to wrap the FlatFileItemReader. Can you elaborate in more detail on how to change the code above to implement the SingleItemPeekableItemReader?
Thanks
@Dean
I tried implementing it as per your suggestion.
Config1.xml
<import resource="classpath*:/META-INF/java-batchlauncher/mainConfig.xml" />
<batch:job id="prT813FileReaderJob">
<batch:step id="stepA" next="stepB">
<batch:tasklet ref="aTasklet" />
</batch:step>
<batch:step id="stepB" next="stepSuccess">
<batch:tasklet>
<batch:chunk reader="prT813MultiReader" processor="participantRecordT813Processor" writer="prT813ItemWriter" commit-interval="1"/>
<batch:listeners>
<batch:listener ref="enabledFeaturesStepListener"/>
</batch:listeners>
<batch:transaction-attributes propagation="NEVER"/>
</batch:tasklet>
</batch:step>
<batch:step id="stepSuccess">
<batch:tasklet ref="successTasklet" />
</batch:step>
</batch:job>
My mainConfig.xml file changes:
<bean id="prT813MultiReader" scope="step" class="org.springframework.batch.item.file.MultiResourceItemReader">
<property name="resources" value="#{jobParameters[INPUT_FILES]}" />
<property name="delegate" ref="prT813InputReader" />
</bean>
<bean id="prT813MultiThreadedReader" scope="step" class="org.springframework.batch.item.file.MultiResourceItemReader">
<property name="resources" value="#{stepExecutionContext[fileName]}" />
<property name="delegate" ref="prT813InputReader" />
</bean>
<bean id="prT813InputReader" scope="step" class="com.fileprocessing.ParticipantRecordT813ItemReader">
<property name="delegate" ref="prT813CustomPeekableItemReader" />
</bean>
<bean id="prT813CustomPeekableItemReader" scope="step" class="org.springframework.batch.item.support.SingleItemPeekableItemReader">
<property name="delegate" ref="participantRecordT813ItemReader" />
</bean>
<bean id="participantRecordT813ItemReader" scope="step" class="org.springframework.batch.item.file.FlatFileItemReader">
<property name="lineMapper" ref="prT813LineMapper" />
</bean>
Created a new Reader class:
public class ParticipantRecordT813ItemReader extends SingleItemPeekableItemReader<ParticipantRecordT813InputData> {
    private static final String CLASS = "ParticipantRecordT813ItemReader";

    @Override
    public ParticipantRecordT813InputData read() throws UnexpectedInputException, ParseException, Exception {
        ParticipantRecordT813InputData item = super.read();
        Log.report(CLASS, "I am in the reader ::::");
        if (item != null) {
            while ("551".equals(item.getDeNum())) {
                Log.report(CLASS, "I am in the reader at DE551::::" + item.getDeNum());
                ParticipantRecordT813InputData possibleRelatedObject = peek();
                if (possibleRelatedObject == null) {
                    return item;
                }
                // logic to determine if next line in file relates to same object
                boolean matches = "552".equals(possibleRelatedObject.getDeNum());
                if (matches) {
                    Log.report(CLASS, "I am in the reader at DE552::::" + possibleRelatedObject.getDeNum());
                } else {
                    return item;
                }
            }
        }
        return item;
    }
}
I am getting the below exception:
ERROR [main] (AbstractStep.java:225)- Encountered an error executing step stepB in job prT813FileReaderJob
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'scopedTarget.prT813MultiReader' defined in URL []: Initialization of bean failed; nested exception is org.springframework.beans.ConversionNotSupportedException: Failed to convert property value of type 'com.sun.proxy.$Proxy10 implementing org.springframework.batch.item.ItemStreamReader,org.springframework.batch.item.PeekableItemReader,java.io.Serializable,org.springframework.aop.scope.ScopedObject,org.springframework.aop.framework.AopInfrastructureBean,org.springframework.aop.SpringProxy,org.springframework.aop.framework.Advised' to required type 'org.springframework.batch.item.file.ResourceAwareItemReaderItemStream' for property 'delegate'; nested exception is java.lang.IllegalStateException: Cannot convert value of type [com.sun.proxy.$Proxy10 implementing org.springframework.batch.item.ItemStreamReader,org.springframework.batch.item.PeekableItemReader,java.io.Serializable,org.springframework.aop.scope.ScopedObject,org.springframework.aop.framework.AopInfrastructureBean,org.springframework.aop.SpringProxy,org.springframework.aop.framework.Advised] to required type [org.springframework.batch.item.file.ResourceAwareItemReaderItemStream] for property 'delegate': no matching editors or conversion strategy found
As you can see, prT813MultiReader and prT813MultiThreadedReader are of type MultiResourceItemReader, and they delegate to prT813InputReader, which is of type SingleItemPeekableItemReader.
I tried implementing ResourceAwareItemReaderItemStream in my reader class, which got rid of the above exception, but then it fails with a NullPointerException on ParticipantRecordT813InputData item = super.read();
public class ParticipantRecordT813ItemReader extends SingleItemPeekableItemReader<ParticipantRecordT813InputData> implements ResourceAwareItemReaderItemStream<ParticipantRecordT813InputData> {
    private static final String CLASS = "ParticipantRecordT813ItemReader";
    SingleItemPeekableItemReader<ParticipantRecordT813InputData> delegate = new SingleItemPeekableItemReader<ParticipantRecordT813InputData>();

    @Override
    public ParticipantRecordT813InputData read() throws UnexpectedInputException, ParseException, Exception {
        ParticipantRecordT813InputData item = super.read();
        Log.report(CLASS, "I am in the reader ::::");
        if (item != null) {
            while ("551".equals(item.getDeNum())) {
                Log.report(CLASS, "I am in the reader at DE551::::" + item.getDeNum());
                ParticipantRecordT813InputData possibleRelatedObject = peek();
                if (possibleRelatedObject == null) {
                    return item;
                }
                // logic to determine if next line in file relates to same object
                boolean matches = "552".equals(possibleRelatedObject.getDeNum());
                if (matches) {
                    Log.report(CLASS, "I am in the reader at DE552::::" + possibleRelatedObject.getDeNum());
                } else {
                    return item;
                }
            }
        }
        return item;
    }

    @Override
    public void close() throws ItemStreamException {
        // TODO Auto-generated method stub
        super.close();
    }

    @Override
    public void open(ExecutionContext arg0) throws ItemStreamException {
        // TODO Auto-generated method stub
        super.open(arg0);
    }

    @Override
    public void update(ExecutionContext arg0) throws ItemStreamException {
        // TODO Auto-generated method stub
        super.update(arg0);
    }

    @Override
    public void setResource(Resource arg0) {
        // TODO Auto-generated method stub
        super.setDelegate(delegate);
    }
}
Any idea where I am wrong?
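For what it's worth, the first exception says that MultiResourceItemReader's delegate must implement ResourceAwareItemReaderItemStream, which SingleItemPeekableItemReader does not. One wiring that satisfies the types, sketched against the bean names above (a suggestion, not a confirmed fix from this thread): keep the FlatFileItemReader as the MultiResourceItemReader's delegate, put the peekable reader on the outside, and point the chunk's reader attribute at prT813InputReader:
<bean id="prT813MultiReader" scope="step" class="org.springframework.batch.item.file.MultiResourceItemReader">
<property name="resources" value="#{jobParameters[INPUT_FILES]}" />
<!-- the delegate is the FlatFileItemReader, which is resource-aware -->
<property name="delegate" ref="participantRecordT813ItemReader" />
</bean>
<bean id="prT813InputReader" scope="step" class="com.fileprocessing.ParticipantRecordT813ItemReader">
<!-- the peekable reader wraps the multi-resource reader -->
<property name="delegate" ref="prT813MultiReader" />
</bean>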

Why is Spring Batch executing as a singleton instead of multithreaded?

I am invoking a Spring Batch job through the Quartz scheduler; it should run every minute.
When the job runs the first time, the ItemReader is opened successfully and the job runs. However, when the job attempts to run a second time, it uses the same instance it did the first time, which is already initialized, and receives "java.lang.IllegalStateException: Stream is already initialized. Close before re-opening." I have set scope="step" for both the ItemReader and the ItemWriter.
Please let me know if I am doing anything wrong in the configuration.
<?xml version="1.0" encoding="UTF-8"?>
<import resource="context.xml"/>
<import resource="database.xml"/>
<bean id="MyPartitioner" class="com.MyPartitioner" />
<bean id="itemProcessor" class="com.MyProcessor" scope="step" />
<bean id="itemReader" class="com.MyItemReader" scope="step">
<property name="dataSource" ref="dataSource"/>
<property name="sql" value="query...."/>
<property name="rowMapper">
<bean class="com.MyRowMapper" scope="step"/>
</property>
</bean>
<job id="MyJOB" xmlns="http://www.springframework.org/schema/batch">
<step id="masterStep">
<partition step="slave" partitioner="MyPartitioner">
<handler grid-size="10" task-executor="taskExecutor"/>
</partition>
</step>
</job>
<step id="slave" xmlns="http://www.springframework.org/schema/batch">
<tasklet>
<chunk reader="itemReader" writer="mysqlItemWriter" processor="itemProcessor" commit-interval="100"/>
</tasklet>
</step>
<bean id="taskExecutor" class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="20"/>
<property name="maxPoolSize" value="20"/>
<property name="allowCoreThreadTimeOut" value="true"/>
</bean>
<bean id="mysqlItemWriter" class="com.MyItemWriter" scope="step">
<property name="dataSource" ref="dataSource"/>
<property name="sql">
<value>
<![CDATA[
query.....
]]>
</value>
</property>
<property name="itemPreparedStatementSetter">
<bean class="com.MyPreparedStatementSetter" scope="step"/>
</property>
</bean>
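For reference, the MyPartitioner bean referenced above must implement Spring Batch's Partitioner interface and return one ExecutionContext per partition; #{stepExecutionContext[...]} lookups in step-scoped slave beans resolve against these per-partition contexts. A minimal hypothetical sketch (the key names are illustrative):
import java.util.HashMap;
import java.util.Map;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

public class MyPartitioner implements Partitioner {

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<String, ExecutionContext>();
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext ctx = new ExecutionContext();
            ctx.putInt("partitionNumber", i); // each slave step gets its own context
            partitions.put("partition" + i, ctx);
        }
        return partitions;
    }
}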
Quartz job invoker:
// requires static imports:
//   import static org.quartz.TriggerBuilder.newTrigger;
//   import static org.quartz.SimpleScheduleBuilder.simpleSchedule;
Scheduler scheduler = new StdSchedulerFactory("quartz.properties").getScheduler();
JobKey jobKey = new JobKey("QUARTZJOB", "QUARTZJOB");
JobDetail jobDetail = JobBuilder.newJob(MySpringJobInvoker.class).withIdentity(jobKey).build();
jobDetail.getJobDataMap().put("jobName", "SpringBatchJob");
SimpleTrigger smplTrg = newTrigger().withIdentity("QUARTZJOB", "QUARTZJOB").startAt(new Date(startTime))
        .withSchedule(simpleSchedule().withIntervalInSeconds(frequency).withRepeatCount(repeatCnt))
        .forJob(jobDetail).withPriority(5).build();
scheduler.scheduleJob(jobDetail, smplTrg);
Quartz job:
public class MySpringJobInvoker implements Job {

    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        JobDataMap data = jobExecutionContext.getJobDetail().getJobDataMap();
        ApplicationContext applicationContext = ApplicationContextUtil.getInstance();
        JobLauncher jobLauncher = (JobLauncher) applicationContext.getBean("jobLauncher");
        org.springframework.batch.core.Job job =
                (org.springframework.batch.core.Job) applicationContext.getBean(data.getString("jobName"));
        JobParameters param = new JobParametersBuilder()
                .addString("myparam", "myparam")
                .addString(Long.toString(System.currentTimeMillis()), Long.toString(System.currentTimeMillis()))
                .toJobParameters();
        try {
            JobExecution execution = jobLauncher.run(job, param);
        } catch (Exception e) {
            throw new JobExecutionException(e);
        }
    }
}
Singleton class:
public class ApplicationContextUtil {

    private static ApplicationContext applicationContext;

    public static synchronized ApplicationContext getInstance() {
        if (applicationContext == null) {
            applicationContext = new ClassPathXmlApplicationContext("myjob.xml");
        }
        return applicationContext;
    }
}
What are the parameters that you are passing to the Spring Batch job from Quartz? Could you post the exception stack trace?
If you're trying to execute a second instance of the batch with the same parameters, it won't work. Spring Batch identifies a unique instance of the job based on the parameters passed, so every new instance of the job requires different parameters.
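One common way to guarantee that, sketched here for illustration, is to add a timestamp parameter on every launch:
JobParameters params = new JobParametersBuilder()
        .addLong("run.time", System.currentTimeMillis()) // differs on every run, so each launch gets a new JobInstance
        .toJobParameters();
JobExecution execution = jobLauncher.run(job, params);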

Spring Batch - reading multiple PDF files and passing them to ItemProcessor

I would like to read multiple PDF files and process them one by one.
I use MultiResourceItemReader and a custom delegate:
public class MyItemReader implements ResourceAwareItemReaderItemStream<MyItem> {

    private Resource resource;

    @Override
    public MyItem read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        return null; // create MyItem
    }

    @Override
    public void setResource(Resource resource) {
        this.resource = resource;
    }

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException {
    }

    @Override
    public void close() throws ItemStreamException {
    }
}
The problem I have is that the read method is invoked infinitely and my ItemProcessor is never invoked.
The resources property is correctly set - the files are found.
Could anyone explain this to me? Thanks in advance.
I finally decided to use ResourcesItemReader instead of MultiResourceItemReader with a custom delegate. This solution is simpler.
<!--suppress SpringBatchModel -->
<batch:job id="my-import">
<batch:step id="myFileStep">
<batch:tasklet>
<batch:chunk reader="resourcesItemReader"
processor="sddeImportProcessor"
writer="sddeImportJpaItemWriter"
commit-interval="${commit.interval:500}"/>
</batch:tasklet>
</batch:step>
<batch:listeners>
<batch:listener ref="sftpImportExecutionListener"/>
<batch:listener ref="longRunningJobExecutionNotificator"/>
<batch:listener ref="exitStatusJobExecutionListener"/>
<batch:listener ref="afterJobExecutionMailSender"/>
</batch:listeners>
</batch:job>
<bean id="sftpImportExecutionListener"
class="my.batches.shared.listener.SftpImportJobListener">
<constructor-arg name="ftsReadService" ref="ftsReadService"/>
<constructor-arg name="ftsWriterService" ref="ftsWriterService"/>
<constructor-arg name="localDir" value="${voe.batch.sdde.unterschriftenblatt.import.local.folder}"/>
<constructor-arg name="remoteDir" value="${voe.batch.sdde.unterschriftenblatt.import.remote.folder}"/>
<constructor-arg name="multipleFilesImport" value="true" />
</bean>
<bean id="resourcesItemReader" class="org.springframework.batch.item.file.ResourcesItemReader" scope="step">
<property name="resources" value="#{jobExecutionContext['import.input.file.path']}"/>
</bean>
<bean id="myImportProcessor" class="my.MyProcessor">
<property name="myUpdateService" ref="defaultUpdateService" />
</bean>
<bean id="myImportJpaItemWriter" class="org.springframework.batch.item.database.JpaItemWriter">
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>

How to write a Spring Batch step without an ItemWriter

I am trying to configure a Spring Batch step without an item writer using the configuration below. However, I get an error saying that the chunk element has neither a 'writer' attribute nor a <writer/> element.
I went through the link spring batch : Tasklet without ItemWriter, but could not resolve the issue.
Could anyone tell me the specific changes to be made in the code snippet I mentioned?
<batch:job id="helloWorldJob">
<batch:step id="step1">
<batch:tasklet>
<batch:chunk reader="cvsFileItemReader"
commit-interval="10">
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
<bean id="cvsFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
<property name="resource" value="classpath:cvs/input/report.csv" />
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="lineTokenizer">
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<property name="names" value="id,sales,qty,staffName,date" />
</bean>
</property>
<property name="fieldSetMapper">
<bean class="com.mkyong.ReportFieldSetMapper" />
<!-- if no data type conversion, use BeanWrapperFieldSetMapper to map by name
<bean
class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
<property name="prototypeBeanName" value="report" />
</bean>
-->
</property>
</bean>
</property>
</bean>
For a chunk-based step, reader and writer are mandatory.
If you don't want a writer, use a no-operation ItemWriter that does nothing.
EDIT:
A no-op implementation is an empty implementation of the interface that does... nothing!
Just let your class implement the desired interface(s) with empty methods.
No-op ItemWriter:
public class NoOpItemWriter<T> implements ItemWriter<T> {

    public void write(java.util.List<? extends T> items) throws java.lang.Exception {
        // no-op
    }
}
I hope you got the answer, but I want to explain it for other readers: when we use a chunk, we usually declare a reader, a processor and a writer. In a chunk, the reader and writer are mandatory and the processor is optional. In your case, if you don't need the writer, you need to make a class which implements ItemWriter, override the write method and leave it blank. Then create a bean of the writer class and pass it as the writer reference.
<batch:step id="recordProcessingStep" >
<batch:tasklet>
<batch:chunk reader="fileReader" processor="recordProcessor"
writer="rocordWriter" commit-interval="1" />
</batch:tasklet>
</batch:step>
Your writer class will look like:
public class RecordWriter<T> implements ItemWriter<T> {

    @Override
    public void write(List<? extends T> items) throws Exception {
        // intentionally left blank
    }
}
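And a matching bean definition so the chunk's writer attribute resolves (the package name here is hypothetical):
<bean id="recordWriter" class="com.example.RecordWriter" />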
In the Maven repository you can find the "spring-batch-samples" project.
In it you will find this writer:
org.springframework.batch.sample.support.DummyItemWriter
