Mockito Spy SystemCommandTasklet in a Spring Batch Step

I am refactoring a troublesome test I have inherited for a Spring Batch job that calls an external script to perform a task. The original version of the test substituted the real script with a simple move-file script, started the job (which invoked that script), and then checked that a file was moved correctly, which served to verify that the script was called with the correct parameters.
My goal for the new version is to eliminate the need to call a real script for this test, instead stubbing and verifying the execution method of the tasklet being used to call the script, relying on Mockito's verify to ensure that the correct parameters are used.
The configuration of the job is as follows:
<flow id="job-flow">
<step id="preprocess" parent="preprocess-base" next="process">
<tasklet>
<beans:bean class="com.company.project.main.package.PreprocessTasklet">
<beans:property name="doMove" value="false" />
</beans:bean>
</tasklet>
</step>
<step id="process" next="postprocess">
<tasklet ref="commandTasklet" />
</step>
<step id="postprocess" parent="postprocess-base" />
</flow>
<bean id="commandTasklet" class="org.springframework.batch.core.step.tasklet.SystemCommandTasklet" scope="step">
<property name="command" value="${a.dir}/job_script.sh" />
<property name="environmentParams" value="working_dir=#{jobExecutionContext['job.dir']}" />
<property name="workingDirectory" value="${b.dir}" />
<property name="timeout" value="3600000"/>
</bean>
<batch:job id="run-my-script" parent="base-job" incrementer="defaultIncrementer">
<batch:flow id="script-job" parent="job-flow" />
</batch:job>
In order to prevent commandTasklet from invoking the actual shell script, I use a BeanPostProcessor to replace it with a spy and stub the execute method.
public class SpiedCommandTaskletPostProcessor implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        if (SystemCommandTasklet.class.isAssignableFrom(bean.getClass())) {
            SystemCommandTasklet spiedTasklet = Mockito.spy((SystemCommandTasklet) bean);
            try {
                Mockito.doReturn(RepeatStatus.FINISHED).when(spiedTasklet)
                        .execute(Mockito.any(StepContribution.class), Mockito.any(ChunkContext.class));
            } catch (Exception e) {
                e.printStackTrace();
            }
            bean = spiedTasklet;
        }
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        return bean;
    }
}
Configuration for the test:
<bean class="com.company.test.package.postprocessor.SpiedCommandTaskletPostProcessor" />
This part works like a charm. However, it leaves me with two questions: firstly, by the time the BeanPostProcessor runs, the properties (i.e. the parameters of the SystemCommandTasklet that I want to verify) have presumably already been set; would I even be able to use Mockito.verify(...) to check for a desired value?
Secondly, since the SystemCommandTasklet is step-scoped, I can't use autowiring to get hold of it in my test, and trying to look it up in the ApplicationContext just creates a new (and real) instance of the tasklet outside of the step. How can I access a step-scoped tasklet like this in the context of a JUnit test?
@Test
public void testLaunch() throws Exception {
    JobExecution jobExecution = getJobLauncher().launchJob(jobParms);
    // Mockito.verify(spiedTasklet).execute(Mockito.any(StepContribution.class), Mockito.any(ChunkContext.class)); spiedTasklet not wired to anything. :(
    assertEquals(ExitStatus.COMPLETED, jobExecution.getExitStatus());
}
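One possible way around the wiring problem (a sketch, not from the original post: the static holder field is an assumption) is to let the post-processor keep a reference to the spy it creates, so the test can reach it after the job has run. Note that Mockito can only verify interactions made against the spy itself, such as execute; setter calls that happened before the spy was created went to the original bean and cannot be verified, although the spy does carry over the configured property values.
import org.mockito.Mockito;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.SystemCommandTasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanPostProcessor;

public class SpiedCommandTaskletPostProcessor implements BeanPostProcessor {

    // Hypothetical holder: lets the test reach the spy created for the step-scoped bean.
    public static SystemCommandTasklet lastSpiedTasklet;

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        if (SystemCommandTasklet.class.isAssignableFrom(bean.getClass())) {
            SystemCommandTasklet spiedTasklet = Mockito.spy((SystemCommandTasklet) bean);
            try {
                // Stub execute so the real script is never run.
                Mockito.doReturn(RepeatStatus.FINISHED).when(spiedTasklet)
                        .execute(Mockito.any(StepContribution.class), Mockito.any(ChunkContext.class));
            } catch (Exception e) {
                throw new IllegalStateException(e);
            }
            lastSpiedTasklet = spiedTasklet;
            return spiedTasklet;
        }
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        return bean;
    }
}
With that in place, the commented-out verification above could become Mockito.verify(SpiedCommandTaskletPostProcessor.lastSpiedTasklet).execute(Mockito.any(StepContribution.class), Mockito.any(ChunkContext.class)) after the job is launched.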
Is there a better approach that accomplishes testing that the script is executed with the correct parameters without actually running a script?

Related

Get jobExecutionContext in xml config spring batch from before step

I am defining my MultiResourceItemReader this way:
<bean id="multiDataItemReader" class="org.springframework.batch.item.file.MultiResourceItemReader" scope="step">
<property name="resources" value="#{jobExecutionContext['filesResource']}"/>
<property name="delegate" ref="dataItemReader"/>
</bean>
As you can see, I want to read the "filesResource" value from the jobExecutionContext.
Note: I changed some names to keep the "code privacy". This is executing; if somebody wants more info, please tell me.
I am saving this value in my first step and using the reader in the second step. Should I have access to it?
I am saving it in the final lines of my step1 tasklet:
ExecutionContext jobContext = context.getStepContext().getStepExecution().getJobExecution().getExecutionContext();
jobContext.put("filesResource", resourceString);
<batch:job id="myJob">
<batch:step id="step1" next="step2">
<batch:tasklet ref="moveFilesFromTasklet" />
</batch:step>
<batch:step id="step2">
<tasklet>
<chunk commit-interval="500"
reader="multiDataItemReader"
processor="dataItemProcessor"
writer="dataItemWriter" />
</tasklet>
</batch:step>
</batch:job>
I am not really sure what I am forgetting to get the value. The error that I am getting is:
20190714 19:49:08.120 WARN org.springframework.batch.item.file.MultiResourceItemReader [[ # ]] - No resources to read. Set strict=true if this should be an error condition.
I see nothing wrong with your config. The value of resourceString should be an array of org.springframework.core.io.Resource as this is the parameter type of the resources attribute of MultiResourceItemReader.
You can pass an array or a list of String with the absolute path to each resource and it should work. Here is a quick example:
class MyTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        List<String> resources = Arrays.asList(
                "/full/path/to/resource1",
                "/full/path/to/resource2");
        chunkContext.getStepContext().getStepExecution().getJobExecution().getExecutionContext()
                .put("filesResource", resources);
        return RepeatStatus.FINISHED;
    }
}
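A hedged variant of that tasklet (the class name and input directory below are assumptions, not from the original answer) could build the list of absolute paths at runtime instead of hard-coding them, storing it under the same "filesResource" key that the step-scoped reader reads:
import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

public class CollectFilesTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        List<String> resources = new ArrayList<>();
        // Hypothetical input directory; list every file and keep its absolute path.
        File[] files = new File("/full/path/to/input").listFiles();
        if (files != null) {
            for (File file : files) {
                resources.add(file.getAbsolutePath());
            }
        }
        // Same key that the MultiResourceItemReader expression reads in the next step.
        chunkContext.getStepContext().getStepExecution().getJobExecution().getExecutionContext()
                .put("filesResource", resources);
        return RepeatStatus.FINISHED;
    }
}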

How to call a stored procedure from a Spring Batch Tasklet?

It is mentioned that a TaskletStep in Spring Batch can be used to call a stored procedure. Could anyone provide an example of how to invoke a Stored Procedure from a TaskletStep? So far I have done this but it throws an exception saying "Configuration problem: The element [callStoredProcedure] is unreachable"
<job id="job1">
<step id="step1">
<tasklet ref="myTasklet"/>
</step>
</job>
<bean id="myTasklet" class="MyClass">
<property name="dataSource" ref="dataSource"/>
<property name="sql" value="call stored_procedure()"/>
</bean>
Java Class
class MyClass implements Tasklet {

    // Properties injected from the XML bean definition
    private DataSource dataSource;
    private String sql;

    public void setDataSource(DataSource dataSource) { this.dataSource = dataSource; }
    public void setSql(String sql) { this.sql = sql; }

    @Override
    public RepeatStatus execute(StepContribution contribution,
            ChunkContext chunkContext) throws Exception {
        JdbcTemplate myJDBC = new JdbcTemplate(dataSource);
        myJDBC.execute(sql);
        return RepeatStatus.FINISHED;
    }
}
How and where should the stored procedure be configured? I would be grateful for any pointers.
Instead of
value="call stored_procedure()"
just put
value="stored_procedure"
without the () at the end. That should resolve your issue.
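For completeness, a sketch of an alternative (the class name below is made up; the procedure name is the one from the question): the tasklet can call the procedure through Spring's SimpleJdbcCall, which handles the underlying CallableStatement for you:
import javax.sql.DataSource;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;

public class StoredProcedureTasklet implements Tasklet {

    private DataSource dataSource;

    public void setDataSource(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // Calls the procedure named in the question; declare parameters here if it takes any.
        new SimpleJdbcCall(dataSource)
                .withProcedureName("stored_procedure")
                .execute();
        return RepeatStatus.FINISHED;
    }
}
The tasklet bean would still be given the dataSource property, as in the configuration above.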

How to write a spring batch step without an itemwriter

I am trying to configure a Spring Batch step without an item writer using the configuration below. However, I get an error saying that the element has neither a 'writer' attribute nor a 'writer' element.
I went through the link spring batch : Tasklet without ItemWriter, but could not resolve the issue.
Could anyone tell me the specific changes to be made in the code snippet I mentioned?
<batch:job id="helloWorldJob">
<batch:step id="step1">
<batch:tasklet>
<batch:chunk reader="cvsFileItemReader"
commit-interval="10">
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
<bean id="cvsFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
<property name="resource" value="classpath:cvs/input/report.csv" />
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="lineTokenizer">
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<property name="names" value="id,sales,qty,staffName,date" />
</bean>
</property>
<property name="fieldSetMapper">
<bean class="com.mkyong.ReportFieldSetMapper" />
<!-- if no data type conversion, use BeanWrapperFieldSetMapper to map by name
<bean
class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
<property name="prototypeBeanName" value="report" />
</bean>
-->
</property>
</bean>
</property>
</bean>
For a chunk-based step, reader and writer are mandatory.
If you don't want a writer, use a no-operation ItemWriter that does nothing.
EDIT:
A no-op implementation is an empty implementation of the interface that does... nothing!
Just let your class implement the desired interface(s) with empty methods.
No-op ItemWriter:
public class NoOpItemWriter<T> implements ItemWriter<T> {

    @Override
    public void write(java.util.List<? extends T> items) throws Exception {
        // no-op
    }
}
I hope you got your answer, but I want to explain it for other readers. When we use a chunk, we usually declare a reader, a processor and a writer. In a chunk, the reader and writer are mandatory and the processor is optional. In your case, if you don't need a writer, you need to create a class that implements ItemWriter, override the write method and leave it empty. Then create a bean of that writer class and pass it as the writer reference.
<batch:step id="recordProcessingStep" >
<batch:tasklet>
<batch:chunk reader="fileReader" processor="recordProcessor"
writer="rocordWriter" commit-interval="1" />
</batch:tasklet>
</batch:step>
Your writer class will look like this:
public class RecordWriter<T> implements ItemWriter<T> {

    @Override
    public void write(List<? extends T> items) throws Exception {
        // intentionally left empty: items are discarded
    }
}
In the Maven repository you can find the "spring-batch-samples" project.
In it you will find this writer:
org.springframework.batch.sample.support.DummyItemWriter

How to execute SQL script only once at startup in Spring?

I have a web application based on Spring JDBC and a Jersey RESTful web service. I'm using the following Spring JDBC template class to initialize the dataSource and execute an SQL script (update_condition_table.sql):
public class CustomerJDBCTemplate implements CustomerDAO {

    private DataSource dataSource;
    private JdbcTemplate jdbcTemplateObject;

    public void setDataSource(DataSource dataSource) {
        this.dataSource = dataSource;
        this.jdbcTemplateObject = new JdbcTemplate(dataSource);
        Resource rc = new ClassPathResource("update_condition_table.sql");
        JdbcTestUtils.executeSqlScript(jdbcTemplateObject, rc, false);
    }

    // ......other methods
}
The bean configuration file is beans.xml:
<!-- Initialization for data source -->
<bean id="dataSource"
class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value="com.mysql.jdbc.Driver" />
<property name="url" value="jdbc:mysql://localhost:3306/customer" />
<property name="username" value="root" />
<property name="password" value="mypassword" />
</bean>
<!-- Definition for customerJDBCTemplate bean -->
<bean id="customerJDBCTemplate" class="com.example.db.CustomerJDBCTemplate">
<property name="dataSource" ref="dataSource" />
</bean>
The Jersey controller class contains the instantiation of class CustomerJDBCTemplate and serves as the REST web service:
#Path("/customer")
public class CustomerService {
ApplicationContext context = new ClassPathXmlApplicationContext("beans.xml");
CustomerJDBCTemplate dbController = (CustomerJDBCTemplate) context.getBean("customerJDBCTemplate");
// ... some GET/POST methods
}
When I launch my web app by entering the index URL in the browser, the SQL script gets executed by the customerJDBCTemplate bean. However, when I click through to other pages, it crashes and reports that the SQL script cannot be executed again. So the SQL script is obviously executed again after the initialization of the dataSource and the initial load of the index page. How can I avoid this and run the SQL script only once, at the initial startup of the web app?
It looks like I need to move the bean instantiation code out of the CustomerService class, but where should I put it?
I figured out that I should make the bean application context static within the CustomerService class and initialize it in a static initialization block as follows:
#Path("/customer")
public class CustomerService {
private static ApplicationContext context;
private static CustomerJDBCTemplate dbController;
static {
context = new ClassPathXmlApplicationContext("beans.xml");
dbController = (CustomerJDBCTemplate) context.getBean("customerJDBCTemplate");
}
//... other methods
}
I guess the reason is that Jersey creates a different instance of CustomerService for each HTTP session (correct me if I'm wrong), so if I keep the bean context as an instance variable, the initialization is done for every HTTP request.
Have your CustomerJDBCTemplate implement InitializingBean. afterPropertiesSet will get called once, right after all properties have been set by Spring's BeanFactory.
For example:
public class CustomerJDBCTemplate implements CustomerDAO, InitializingBean {
    ...
    // ......other methods

    public void afterPropertiesSet() throws Exception {
        // do your initializing, or call your initializing methods
    }
}
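A minimal sketch of how that might look here, assuming the script execution is simply moved out of setDataSource and into afterPropertiesSet (the class, resource and helper call are taken from the question):
import javax.sql.DataSource;

import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.io.ClassPathResource;
import org.springframework.core.io.Resource;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.test.jdbc.JdbcTestUtils;

public class CustomerJDBCTemplate implements CustomerDAO, InitializingBean {

    private DataSource dataSource;
    private JdbcTemplate jdbcTemplateObject;

    public void setDataSource(DataSource dataSource) {
        this.dataSource = dataSource;
        this.jdbcTemplateObject = new JdbcTemplate(dataSource);
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        // Runs once, after the dataSource property has been injected.
        Resource rc = new ClassPathResource("update_condition_table.sql");
        JdbcTestUtils.executeSqlScript(jdbcTemplateObject, rc, false);
    }

    // ......other methods
}
The script then runs once per application context, so together with the single static context from the workaround above it executes only once at startup.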

Autowiring in Spring 3 MDP

Firstly, I've checked some of the possible answers that come up when posting a new question, and none that I have come across deals with my issue.
I have a Spring MDP which works nicely, i.e. it can receive messages. The problem is that when I try to autowire a dependency, the autowiring doesn't seem to work. I'm using NetBeans and Glassfish 3.1.2, so I'm able to step through the code and confirm that the dependencies are null. Autowiring in other parts of the application is working fine. The MDP is picked up in the component-scan.
I used the example from springsource to create my MDP:
http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/jms.html
And I've autowired the dependencies by setter methods.
I cannot figure out why this won't work. I've checked around and I don't think anyone else has had this issue.
Any ideas, pointers in the right direction, examples I can reference will be much appreciated.
Thanks.
KSS
MDP Class:
public class ExampleListener implements MessageListener {

    private Transformer transformer;
    private MurexService murexService;

    @Autowired
    public void setTransformer(Transformer transformer) {
        this.transformer = transformer;
    }

    @Autowired
    public void setMurexService(MurexService murexService) {
        this.murexService = murexService;
    }

    @Override
    public void onMessage(Message message) {
        if (message instanceof TextMessage) {
            try {
                System.out.println(((TextMessage) message).getText());
            } catch (JMSException ex) {
                throw new RuntimeException(ex);
            }
        }
    }
}
ApplicationContext:
<jee:jndi-lookup id="connectionFactory" jndi-name="jms/QueueConnectionFactory" />
<jee:jndi-lookup id="testQueueOne" jndi-name="jms/ITFS_RECEIVE" />
<!-- this is the Message Driven POJO (MDP) -->
<bean id="messageListener" class="com.scm.service.ExampleListener" />
<!-- and this is the message listener container -->
<bean id="jmsContainer" class="org.springframework.jms.listener.DefaultMessageListenerContainer">
<property name="connectionFactory" ref="connectionFactory"/>
<property name="destination" ref="testQueueOne"/>
<property name="messageListener" ref="messageListener" />
</bean>
An AutowiredAnnotationBeanPostProcessor needs to be registered for the @Autowired members to be wired. The javadoc has more details. See here for the solution to a similar issue.
Essentially adding this should get the autowiring to work:
<context:annotation-config/>
