How to call a stored procedure from a Spring Batch Tasklet?

It is mentioned that a TaskletStep in Spring Batch can be used to call a stored procedure. Could anyone provide an example of how to invoke a stored procedure from a TaskletStep? So far I have done the following, but it throws an exception saying "Configuration problem: The element [callStoredProcedure] is unreachable".
<job id="job1">
<step id="step1">
<tasklet ref="myTasklet"/>
</step>
</job>
<bean id="myTasklet" class="MyClass">
<property name="dataSource" ref="dataSource"/>
<property name="sql" value="call stored_procedure()"/>
</bean>
Java Class
class MyClass implements Tasklet {
    private DataSource dataSource;
    private String sql;

    public void setDataSource(DataSource dataSource) { this.dataSource = dataSource; }
    public void setSql(String sql) { this.sql = sql; }

    @Override
    public RepeatStatus execute(StepContribution contribution,
            ChunkContext chunkContext) throws Exception {
        // Execute the configured SQL against the injected DataSource
        JdbcTemplate myJDBC = new JdbcTemplate(dataSource);
        myJDBC.execute(sql);
        return RepeatStatus.FINISHED;
    }
}
How and where should the stored procedure be configured? I would be grateful for any pointers.

Instead of
value="call stored_procedure()"
just put
value="stored_procedure"
without the () at the end. That should resolve your issue.
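As an alternative sketch (not from the original answer), the procedure can also be invoked programmatically inside the tasklet with Spring's SimpleJdbcCall, which builds the CALL statement from the procedure name. The class and property names below are illustrative:
import javax.sql.DataSource;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;

public class StoredProcedureTasklet implements Tasklet {
    private DataSource dataSource;
    private String procedureName;

    public void setDataSource(DataSource dataSource) { this.dataSource = dataSource; }
    public void setProcedureName(String procedureName) { this.procedureName = procedureName; }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // SimpleJdbcCall derives the CALL statement from the procedure name,
        // so no "call ..." SQL string is needed here
        new SimpleJdbcCall(dataSource).withProcedureName(procedureName).execute();
        return RepeatStatus.FINISHED;
    }
}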

Related

Get jobExecutionContext in xml config spring batch from before step

I am defining my MultiResourceItemReader on this way:
<bean id="multiDataItemReader" class="org.springframework.batch.item.file.MultiResourceItemReader" scope="step">
<property name="resources" value="#{jobExecutionContext['filesResource']}"/>
<property name="delegate" ref="dataItemReader"/>
</bean>
As you can see, I want to read the "filesResource" value from the jobExecutionContext.
Note: I changed some names for privacy. The job executes; if anybody wants more info, please tell me.
I am saving this value in my first step and using the reader in the second step. Should I have access to it?
I am saving it in the final lines from my step1 tasklet:
ExecutionContext jobContext = context.getStepContext().getStepExecution().getJobExecution().getExecutionContext();
jobContext.put("filesResource", resourceString);
<batch:job id="myJob">
<batch:step id="step1" next="step2">
<batch:tasklet ref="moveFilesFromTasklet" />
</batch:step>
<batch:step id="step2">
<tasklet>
<chunk commit-interval="500"
reader="multiDataItemReader"
processor="dataItemProcessor"
writer="dataItemWriter" />
</tasklet>
</batch:step>
</batch:job>
I am not really sure what I am missing in order to get the value. The error that I am getting is:
20190714 19:49:08.120 WARN org.springframework.batch.item.file.MultiResourceItemReader [[ # ]] - No resources to read. Set strict=true if this should be an error condition.
I see nothing wrong with your config. The value of resourceString should be an array of org.springframework.core.io.Resource as this is the parameter type of the resources attribute of MultiResourceItemReader.
You can pass an array or a list of String with the absolute path to each resource and it should work. Here is a quick example:
class MyTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // Absolute paths passed as Strings are converted to Resources when
        // injected into the "resources" property of the MultiResourceItemReader
        List<String> resources = Arrays.asList(
                "/full/path/to/resource1",
                "/full/path/to/resource2");
        chunkContext.getStepContext().getStepExecution().getJobExecution().getExecutionContext()
                .put("filesResource", resources);
        return RepeatStatus.FINISHED;
    }
}

Spring Batch - reading multiple PDF files and passing them to ItemProcessor

I would like to read multiple PDF files and process them one by one.
I use MultiResourceItemReader and a custom delegate:
public class MyItemReader implements ResourceAwareItemReaderItemStream<MyItem> {
    private Resource resource;

    @Override
    public MyItem read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        return null; //create MyItem
    }

    @Override
    public void setResource(Resource resource) {
        this.resource = resource;
    }

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException {
    }

    @Override
    public void close() throws ItemStreamException {
    }
}
The problem I have is that the read method is invoked infinitely and my ItemProcessor is never invoked.
The resources property is correctly set; the files are there.
Could anyone explain this to me? Thanks in advance.
I finally decided to use ResourcesItemReader instead of MultiResourceItemReader with a custom delegate. This solution is simpler.
<!--suppress SpringBatchModel -->
<batch:job id="my-import">
<batch:step id="myFileStep">
<batch:tasklet>
<batch:chunk reader="resourcesItemReader"
processor="sddeImportProcessor"
writer="sddeImportJpaItemWriter"
commit-interval="${commit.interval:500}"/>
</batch:tasklet>
</batch:step>
<batch:listeners>
<batch:listener ref="sftpImportExecutionListener"/>
<batch:listener ref="longRunningJobExecutionNotificator"/>
<batch:listener ref="exitStatusJobExecutionListener"/>
<batch:listener ref="afterJobExecutionMailSender"/>
</batch:listeners>
</batch:job>
<bean id="sftpImportExecutionListener"
class="my.batches.shared.listener.SftpImportJobListener">
<constructor-arg name="ftsReadService" ref="ftsReadService"/>
<constructor-arg name="ftsWriterService" ref="ftsWriterService"/>
<constructor-arg name="localDir" value="${voe.batch.sdde.unterschriftenblatt.import.local.folder}"/>
<constructor-arg name="remoteDir" value="${voe.batch.sdde.unterschriftenblatt.import.remote.folder}"/>
<constructor-arg name="multipleFilesImport" value="true" />
</bean>
<bean id="resourcesItemReader" class="org.springframework.batch.item.file.ResourcesItemReader" scope="step">
<property name="resources" value="#{jobExecutionContext['import.input.file.path']}"/>
</bean>
<bean id="myImportProcessor" class="my.MyProcessor">
<property name="myUpdateService" ref="defaultUpdateService" />
</bean>
<bean id="myImportJpaItemWriter" class="org.springframework.batch.item.database.JpaItemWriter">
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
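For context, ResourcesItemReader emits org.springframework.core.io.Resource items one at a time, so the processor receives each file directly. The following is only a rough sketch of what such a processor could look like; MyItem, setSourceFileName and the PDF parsing step are placeholders, not the poster's actual classes:
import java.io.InputStream;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.core.io.Resource;

public class MyProcessor implements ItemProcessor<Resource, MyItem> {

    @Override
    public MyItem process(Resource resource) throws Exception {
        // Each item is one of the configured resources; open its stream
        // and build the domain object from the PDF content here
        try (InputStream in = resource.getInputStream()) {
            MyItem item = new MyItem();
            item.setSourceFileName(resource.getFilename());
            // ... parse the PDF stream into the item ...
            return item;
        }
    }
}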

Mockito Spy SystemCommandTasklet in a Spring Batch Step

I am refactoring a troublesome test I inherited for a Spring Batch job that calls an external script to perform a task. The original version of the test substituted the real script with a simple file-move script, started the job, and then checked that a file was moved correctly (which served to verify that the script was called with the correct parameters).
My goal for the new version is to eliminate the need to call a real script for this test, instead stubbing and verifying the execution method of the tasklet being used to call the script, relying on Mockito's verify to ensure that the correct parameters are used.
The configuration of the job is as follows:
<flow id="job-flow">
<step id="preprocess" parent="preprocess-base" next="process">
<tasklet>
<beans:bean class="com.company.project.main.package.PreprocessTasklet">
<beans:property name="doMove" value="false" />
</beans:bean>
</tasklet>
</step>
<step id="process" next="postprocess">
<tasklet ref="commandTasklet" />
</step>
<step id="postprocess" parent="postprocess-base" />
</flow>
<bean id="commandTasklet" class="org.springframework.batch.core.step.tasklet.SystemCommandTasklet" scope="step">
<property name="command" value="${a.dir}/job_script.sh" />
<property name="environmentParams" value="working_dir=#{jobExecutionContext['job.dir']}" />
<property name="workingDirectory" value="${b.dir}" />
<property name="timeout" value="3600000"/>
</bean>
<batch:job id="run-my-script" parent="base-job" incrementer="defaultIncrementer">
<batch:flow id="script-job" parent="job-flow" />
</batch:job>
In order to prevent commandTasklet from invoking the actual shell script, I use a BeanPostProcessor to replace it with a spy and stub the execute method.
public class SpiedCommandTaskletPostProcessor implements BeanPostProcessor {
@Override
public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
if (SystemCommandTasklet.class.isAssignableFrom(bean.getClass())) {
SystemCommandTasklet spiedTasklet = Mockito.spy((SystemCommandTasklet) bean);
try {
Mockito.doReturn(RepeatStatus.FINISHED).when(spiedTasklet)
.execute(Mockito.any(StepContribution.class), Mockito.any(ChunkContext.class));
} catch (Exception e) {
e.printStackTrace();
}
bean = spiedTasklet;
}
return bean;
}
@Override
public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
return bean;
}
}
Configuration for the test:
<bean class="com.company.test.package.postprocessor.SpiedCommandTaskletPostProcessor" />
This part works like a charm. However, it leaves me with two questions. Firstly, by the time the BeanPostProcessor runs, the properties (i.e. the parameters of the SystemCommandTasklet I'm trying to verify) have presumably already been set; would I even be able to use Mockito.verify(...) to check for a desired value?
Secondly, since the SystemCommandTasklet is step-scoped, I can't use autowiring to get hold of it in my test, and trying to look it up in the ApplicationContext just creates a new (and real) instance of the tasklet outside of the step. How can I access a step-scoped tasklet like this in the context of a JUnit test?
@Test
public void testLaunch() throws Exception {
JobExecution jobExecution = getJobLauncher().launchJob(jobParms);
//Mockito.verify(spiedTasklet).execute(Mockito.any(StepContribution.class), Mockito.any(ChunkContext.class)); spiedTasklet not wired to anything. :(
assertEquals(ExitStatus.COMPLETED, jobExecution.getExitStatus());
}
Is there a better approach that accomplishes testing that the script is executed with the correct parameters without actually running a script?
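One possible way around the wiring problem, sketched under the assumption that a static holder is acceptable in test code, is to let the post-processor keep a reference to the spy it creates so the test can verify it after the job has run. The getSpiedTasklet() accessor below is an illustrative addition, not part of the original code:
import org.mockito.Mockito;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.SystemCommandTasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanPostProcessor;

public class SpiedCommandTaskletPostProcessor implements BeanPostProcessor {

    // Holds the spy created for the step-scoped tasklet so the test can reach it
    private static SystemCommandTasklet spiedTasklet;

    public static SystemCommandTasklet getSpiedTasklet() {
        return spiedTasklet;
    }

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        if (SystemCommandTasklet.class.isAssignableFrom(bean.getClass())) {
            SystemCommandTasklet spy = Mockito.spy((SystemCommandTasklet) bean);
            try {
                Mockito.doReturn(RepeatStatus.FINISHED).when(spy)
                        .execute(Mockito.any(StepContribution.class), Mockito.any(ChunkContext.class));
            } catch (Exception e) {
                throw new IllegalStateException(e);
            }
            spiedTasklet = spy;
            return spy;
        }
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        return bean;
    }
}
The test could then call Mockito.verify(SpiedCommandTaskletPostProcessor.getSpiedTasklet()).execute(...) after launching the job. This only addresses the second question; whether the command and environment parameters can still be asserted depends on what the spy exposes at that point.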

spring batch ItemReader FlatFileItemReader set cursor to start reading from a particular line or set linestoskip dynamically

In my Spring Batch + Quartz setup, I am reading a CSV file using FlatFileItemReader. I want to set the cursor for the reader so that the next job instance starts reading with the given parameters. Is that possible?
<bean id="cvsFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
<!-- Read a csv file -->
<property name="resource" value="classpath:cvs/input/report.csv" />
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="lineTokenizer">
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<property name="names" value="id,impressions" />
</bean>
</property>
<property name="fieldSetMapper">
<bean
class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
<property name="prototypeBeanName" value="report" />
</bean>
</property>
</bean>
</property>
</bean>
The idea is to continue reading the file in the next execution from where the last failure occurred. I am putting an integer 'writecursor' into the context for each line written in my custom writer.
public void write(List<? extends Report> items) throws Exception {
System.out.println("writer..." + items.size() + " > ");
for(Report item : items){
System.out.println("writing item id: " + item.getId());
System.out.println(item);
}
//getting stepExecution by implementing StepExecutionListener
this.stepExecution.getExecutionContext().putInt("writecursor", ++writecursor);
}
Now, in the CustomItemReaderListener, I want to get the updated writecursor value and then skip lines from the top so that reading starts from writecursor:
public class CustomItemReaderListener implements ItemReadListener<Report>, StepExecutionListener {
ApplicationContext context = ApplicationContextUtils.getApplicationContext();
private StepExecution stepExecution;
@Override
public void beforeRead() {
//Skip lines somehow
}
Another thing I saw as a possible solution is to set linesToSkip dynamically on the ItemReader. There is a thread here, http://thisisurl.com/dynamic-value-for-linestoskip-in-itemreader, but it is not answered yet. And here:
http://forum.spring.io/forum/spring-projects/batch/100950-accessing-job-execution-context-from-itemwriter
Set the FlatFileItemReader.linesToSkip property by injecting a job parameter value:
<bean id="myReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
<property name="linesToSkip" value="file:#{jobParameters['cursor']}" />
</bean>
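To connect this to a launch, here is a sketch of how the cursor job parameter could be supplied; CursorJobStarter, the run.ts parameter name, and the wiring of jobLauncher and job are assumptions for illustration:
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

public class CursorJobStarter {
    private final JobLauncher jobLauncher;
    private final Job job;

    public CursorJobStarter(JobLauncher jobLauncher, Job job) {
        this.jobLauncher = jobLauncher;
        this.job = job;
    }

    public JobExecution startFrom(long cursor) throws Exception {
        // 'cursor' feeds #{jobParameters['cursor']} in the reader definition;
        // the timestamp keeps each launch a new JobInstance
        JobParameters params = new JobParametersBuilder()
                .addLong("cursor", cursor)
                .addLong("run.ts", System.currentTimeMillis())
                .toJobParameters();
        return jobLauncher.run(job, params);
    }
}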
An easier way to implement the lines to skip is the following:
define the flat file reader in the XML and autowire it into the beforeStep method of a StepExecutionListener, as shown below
public class CustomStepListener implements StepExecutionListener {
@Autowired
@Qualifier("cvsFileItemReader")
FlatFileItemReader cvsFileItmRdr;
@Override
public void beforeStep(StepExecution stepExecution) {
System.out.println("StepExecutionListener -------- beforeStep");
cvsFileItmRdr.setLinesToSkip(4);
}
@Override
public ExitStatus afterStep(StepExecution stepExecution) {
System.out.println("StepExecutionListener - afterStep");
return null;
}
}
It is working fine for me.

I want the current resource processed by MultiResourceItemReader available in beforeStep method

Is there any way to make the current resource processed by MultiResourceItemReader available in the beforeStep method? Kindly provide a working code sample. I tried injecting the MultiResourceItemReader reference into a StepExecutionListener, but Spring's CGLIB only accepts an interface type to be injected, and I don't know whether to use the ItemReader or ItemStream interface.
The MultiResourceItemReader now has a method getCurrentResource() that returns the current Resource.
Retrieving the currentResource from MultiResourceItemReader is not possible at the moment. If you need this API enhancement, create an issue in the Spring Batch JIRA.
Even if there is a getter for currentResource, its value is not valid in beforeStep(). It is only valid between open() and close().
It is possible if you use a partition step and the "Binding Input Data to Steps" concept.
Here is a simple code example, with a concurrency limit of 1 to imitate "serial" processing:
<bean name="businessStep:master" class="org.springframework.batch.core.partition.support.PartitionStep">
<property name="jobRepository" ref="jobRepository"/>
<property name="stepExecutionSplitter">
<bean class="org.springframework.batch.core.partition.support.SimpleStepExecutionSplitter">
<constructor-arg ref="jobRepository"/>
<constructor-arg ref="concreteBusinessStep"/>
<constructor-arg>
<bean class="org.spring...MultiResourcePartitioner" scope="step">
<property name="resources" value="#{jobParameters['input.file.pattern']}"/>
</bean>
</constructor-arg>
</bean>
</property>
<property name="partitionHandler">
<bean class="org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler">
<property name="taskExecutor">
<bean class="org.springframework.core.task.SimpleAsyncTaskExecutor">
<property name="concurrencyLimit" value="1" />
</bean>
</property>
<property name="step" ref="concreteBusinessStep"/>
</bean>
</property>
</bean>
<bean id="whateverClass" class="..." scope="step">
<property name="resource" value="#{stepExecutionContext['fileName']}" />
</bean>
example step configuration:
<job id="renameFilesPartitionJob">
<step id="businessStep"
parent="businessStep:master" />
</job>
<step id="concreteBusinessStep">
<tasklet>
<chunk reader="itemReader"
writer="itemWriter"
commit-interval="5" />
</tasklet>
</step>
potential drawbacks:
distinct steps for each file instead of one step
more complicated configuration
Using the getCurrentResource() method of MultiResourceItemReader, update the stepExecution. Example code is below:
private StepExecution stepExecution;
@Override
public Resource getCurrentResource() {
this.stepExecution.getJobExecution().getExecutionContext().put("FILE_NAME", super.getCurrentResource().getFilename());
return super.getCurrentResource();
}
@BeforeStep
public void beforeStep(StepExecution stepExecution) {
this.stepExecution = stepExecution;
}
If you make the item you are reading implement ResourceAware, the current resource is set on it as it is read:
public class MyItem implements ResourceAware {
private Resource resource;
//others
public void setResource(Resource resource) {
this.resource = resource;
}
public Resource getResource() {
return resource;
}
}
and in your reader, processor or writer
myItem.getResource()
will return the resource it was loaded from
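As a small illustrative sketch (not from the original answer), a processor could then pick up the originating file from the item:
import org.springframework.batch.item.ItemProcessor;

public class MyItemProcessor implements ItemProcessor<MyItem, MyItem> {

    @Override
    public MyItem process(MyItem item) {
        // The MultiResourceItemReader set the resource via ResourceAware during read
        System.out.println("Item was read from " + item.getResource().getFilename());
        return item;
    }
}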
