Test only one job out of many in Spring Batch

I have two jobs and I am trying to test a single one of them.
This is what I am trying:
@Autowired
private JobLauncherTestUtils jobLauncherTestUtils;

@Autowired
@Qualifier("jobNumber1")
private Job job;

@Test
public void test() {
    try {
        jobLauncherTestUtils
                .getJobLauncher()
                .run(job, new JobParametersBuilder()
                        .addString("--spring.batch.job.names", "jobNumber1")
                        .toJobParameters());
    } catch (Exception e) {
        e.printStackTrace();
    }
}
But when I look at the logs, it is running both jobs. How do I make it run only this one job? Thanks.
I have also tried to set the Job on the JobLauncherTestUtils:
@Bean
public JobLauncherTestUtils jobLauncherTestUtils() throws Exception {
    return new JobLauncherTestUtils() {
        @Override
        @Autowired
        public void setJob(@Qualifier("jobNumber1") Job job) {
            super.setJob(job);
        }
    };
}
and then call jobLauncherTestUtils.launchJob(). Still both jobs are running.

You are passing a Spring Boot property (--spring.batch.job.names) as a Spring Batch job parameter, so Spring Boot is not aware of it and will still run both jobs. You need to either:
pass --spring.batch.job.names=jobNumber1 on the command line you use to test your job,
or add spring.batch.job.names=jobNumber1 to the application.properties file in your test resources.
Hope this helps.
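As an illustrative sketch of the second option (assuming a Spring Boot test setup and a Job bean named jobNumber1; the class and method names here are hypothetical), the property can also be set directly on the test class so the auto-configuration only considers that one job:

```java
// Sketch only: assumes spring-boot-starter-test and spring-batch-test on the classpath,
// and a Job bean named "jobNumber1" in the application context.
@RunWith(SpringRunner.class)
@SpringBootTest
// Restrict Spring Boot's automatic job execution to the job under test.
@TestPropertySource(properties = "spring.batch.job.names=jobNumber1")
public class JobNumber1Test {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testJobNumber1() throws Exception {
        JobExecution jobExecution = jobLauncherTestUtils.launchJob();
        assertEquals(BatchStatus.COMPLETED, jobExecution.getStatus());
    }
}
```

This keeps the restriction local to the test instead of affecting every test that shares the same application.properties.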

Related

Spring Quartz configuration

I need your help please:
I have a Spring Batch app that runs perfectly with the main job and step shown below:
@Bean
public Job JobFinal(Step step1) {
    return jobBuilderFactory
            .get("JobFinal")
            .incrementer(new RunIdIncrementer())
            .start(step1)
            .build();
}

@Bean
public Step step1() {
    return stepBuilderFactory.get("step1").<A, B>chunk(2)
            .reader(readerDB())
            .processor(process())
            .writer(writerCS())
            .build();
}
This job is configured in a class "BatchConfig". And here is my main:
public class DemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
I want to add a Quartz configuration to run the job every day at midnight.
I couldn't find a helpful tutorial explaining how to configure Quartz in my case, or in which class exactly!
Thank you for your help :)
You would need to use a SchedulerFactory to trigger the job automatically with Quartz, along these lines. Note that Quartz cron expressions have six or seven fields, starting with seconds, so "midnight every day" is 0 0 0 * * ? rather than the five-field Unix form:

SchedulerFactory sf = new StdSchedulerFactory();
Scheduler sche = sf.getScheduler();
JobDetail job = newJob(myclass.class).withIdentity("myid", "myname").build();
CronTrigger trigger = newTrigger().withIdentity("mytriggerid", "myname")
        .withSchedule(cronSchedule("0 0 0 * * ?"))
        .build();
sche.scheduleJob(job, trigger);
sche.start();

Here newJob, newTrigger and cronSchedule are static imports from JobBuilder, TriggerBuilder and CronScheduleBuilder, and myclass must implement org.quartz.Job.
Official documentation:
http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html
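To connect this back to the Spring Batch job from the question, one common pattern (a sketch only, assuming the Quartz job instances are wired with Spring beans, for example through a SpringBeanJobFactory with autowiring enabled; the class name LaunchBatchJob is hypothetical) is a Quartz Job that simply delegates to the Spring Batch JobLauncher:

```java
// Sketch: assumes Quartz jobs get their Spring dependencies injected
// (e.g. via a SpringBeanJobFactory registered with the SchedulerFactoryBean).
public class LaunchBatchJob implements org.quartz.Job {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    @Qualifier("JobFinal")
    private org.springframework.batch.core.Job batchJob;

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        try {
            // A fresh timestamp parameter gives each nightly run a new JobInstance,
            // otherwise the second run would fail as "already complete".
            JobParameters params = new JobParametersBuilder()
                    .addLong("time", System.currentTimeMillis())
                    .toJobParameters();
            jobLauncher.run(batchJob, params);
        } catch (Exception e) {
            throw new JobExecutionException(e);
        }
    }
}
```

The Quartz trigger then schedules LaunchBatchJob at midnight, and the batch job itself stays untouched.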

Annotated Step Configuration in Spring Batch, how to test

I use Spring Batch with annotations only. I want to test a step which is configured like so:
@Bean("findMe")
@Qualifier("findMe")
public Step findMe() {
    return stepBuilderFactory.get("findMe"). ... some step configuration
}
Test:
@Test
public void shouldRunTheJob() {
    JobLauncherTestUtils.launchJob("findMe");
}
I was not able to address the job this way; apart from that, I was able to test all the other levels. How can I address a job annotated like this?
From what I understand of your question, you want to test a step and not a job.
Try the following sample test class for your step:
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = YourClassToTest.class)
public class StepTest {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testStep() throws Exception {
        JobExecution jobExecution = jobLauncherTestUtils.launchStep("findMe");
        // your test case, e.g. assert something on the jobExecution
    }
}
For more information, please refer to the Spring Batch docs.

Spring Batch Tomcat memory leak

I use
Tomcat 8.0.26
Spring Boot 1.2.6.RELEASE
Spring 4.2.1.RELEASE
Spring Batch 3.0.5.RELEASE
In my application I have a following Spring Batch config:
@Configuration
@EnableBatchProcessing
public class ReportJobConfig {

    public static final String REPORT_JOB_NAME = "reportJob";

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private ReportService reportService;

    @Bean(name = REPORT_JOB_NAME)
    public Job reportJob() {
        //@formatter:off
        return jobBuilderFactory
                .get(REPORT_JOB_NAME)
                .flow(createRequestStep())
                .on("*").to(retriveInfoStep())
                .on("*").to(notifyAdminStep())
                .end().build();
        //@formatter:on
    }

    @Bean
    public Step createRequestStep() {
        return stepBuilderFactory.get("createRequest").tasklet(new CreateRequestTasklet(reportService)).build();
    }

    @Bean
    public Step retriveInfoStep() {
        return stepBuilderFactory.get("retriveInfo").tasklet(new RetriveInfoTasklet(reportService)).build();
    }

    @Bean
    public Step notifyAdminStep() {
        return stepBuilderFactory.get("notifyAdmin").tasklet(new NotifyAdminTasklet()).build();
    }
}
This is how I run the job:
#Service
public class ReportJobServiceImpl implements ReportJobService {
final static Logger logger = LoggerFactory.getLogger(ReportJobServiceImpl.class);
#Autowired
#Qualifier(ReportJobConfig.REPORT_JOB_NAME)
private Job reportJob;
#Autowired
private JobLauncher jobLauncher;
#Override
public void runReportJob(String messageContent) throws JobExecutionAlreadyRunningException, JobRestartException,
JobInstanceAlreadyCompleteException, JobParametersInvalidException {
Map<String, JobParameter> parameters = new HashMap<>();
JobParameter reportIdParameter = new JobParameter(messageContent);
parameters.put(REPORT_ID, reportIdParameter);
jobLauncher.run(reportJob, new JobParameters(parameters));
}
}
Batch properties:
batch.jdbc.driver=com.mysql.jdbc.Driver
batch.jdbc.url=jdbc:mysql://localhost/database
batch.jdbc.user=user
batch.jdbc.password=password
batch.jdbc.testWhileIdle=true
batch.jdbc.validationQuery=SELECT 1
batch.drop.script=classpath:/org/springframework/batch/core/schema-drop-mysql.sql
batch.schema.script=classpath:/org/springframework/batch/core/schema-mysql.sql
batch.business.schema.script=classpath:/business-schema-mysql.sql
batch.database.incrementer.class=org.springframework.jdbc.support.incrementer.MySQLMaxValueIncrementer
batch.database.incrementer.parent=columnIncrementerParent
batch.lob.handler.class=org.springframework.jdbc.support.lob.DefaultLobHandler
batch.grid.size=50
batch.jdbc.pool.size=6
batch.verify.cursor.position=true
batch.isolationlevel=ISOLATION_SERIALIZABLE
batch.table.prefix=BATCH_
I deploy this application to Tomcat 8, run some jobs, and then undeploy the application via the Tomcat Web Application Manager.
Comparing memory snapshots before and after with the Java VisualVM tool, I can see that a lot of Spring Batch (org.springframework.batch.*) related objects still exist in memory.
Also, after running the reportJob 1000 times I saw huge memory consumption on my machine. I have no idea what could be wrong right now.
What could be causing this issue?
UPDATED
I have consumed ~1000 messages from an AWS SQS queue; my JMS listener is configured to consume one message at a time. During the execution, the heap histogram showed, for example, 7932 instances of StepExecution and 5285 JobExecution objects. I really don't understand why these need to stay in memory. Where is my mistake?

run spring batch job from the controller

I am trying to run my batch job from a controller. It will be fired either by a cron job or by accessing a specific link.
I am using Spring Boot, no XML, just annotations.
In my current setup I have a service that contains the following beans:
@EnableBatchProcessing
@PersistenceContext
public class batchService {

    @Bean
    public ItemReader<Somemodel> reader() {
        ...
    }

    @Bean
    public ItemProcessor<Somemodel, Somemodel> processor() {
        return new SomemodelProcessor();
    }

    @Bean
    public ItemWriter writer() {
        return new CustomItemWriter();
    }

    @Bean
    public Job importUserJob(JobBuilderFactory jobs, Step step1) {
        return jobs.get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .flow(step1)
                .end()
                .build();
    }

    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory,
                      ItemReader<Somemodel> reader,
                      ItemWriter<Somemodel> writer,
                      ItemProcessor<Somemodel, Somemodel> processor) {
        return stepBuilderFactory.get("step1")
                .<Somemodel, Somemodel>chunk(100)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }
}
As soon as I put the @Configuration annotation on top of my batchService class, the job starts as soon as I run the application. It finishes successfully; everything is fine. Now I am trying to remove the @Configuration annotation and run the job whenever I want. Is there a way to fire it from a controller?
Thanks!
You need to create an application.yml file in src/main/resources and add the following configuration:
spring.batch.job.enabled: false
With this change, the batch job will no longer execute automatically when Spring Boot starts; it will only be triggered when the specific link is accessed.
Check out my sample code here:
https://github.com/pauldeng/aws-elastic-beanstalk-worker-spring-boot-spring-batch-template
You can launch a batch job programmatically using JobLauncher which can be injected into your controller. See the Spring Batch documentation for more details, including this example controller:
@Controller
public class JobLauncherController {

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    Job job;

    @RequestMapping("/jobLauncher.html")
    public void handle() throws Exception {
        jobLauncher.run(job, new JobParameters());
    }
}
Since you're using Spring Boot, you should leave the #Configuration annotation in there and instead configure your application.properties to not launch the jobs on startup. You can read more about the autoconfiguration options for running jobs at startup (or not) in the Spring Boot documentation here: http://docs.spring.io/spring-boot/docs/current-SNAPSHOT/reference/htmlsingle/#howto-execute-spring-batch-jobs-on-startup
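One caveat worth adding as a sketch (not part of the original answers): launching with an empty JobParameters means every request targets the same JobInstance, so once the job completes, a second request fails with JobInstanceAlreadyCompleteException. A common workaround, assuming the same injected jobLauncher and job as in the controller above (the mapping path here is hypothetical), is to add a unique parameter per request:

```java
// Sketch: same injected jobLauncher and job as the controller example above.
@RequestMapping("/runJob")
public void runJob() throws Exception {
    JobParameters params = new JobParametersBuilder()
            // A unique value per request, so each launch creates a new JobInstance.
            .addLong("startedAt", System.currentTimeMillis())
            .toJobParameters();
    jobLauncher.run(job, params);
}
```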

how to select which spring batch job to run based on application argument - spring boot java config

I have two independent Spring Batch jobs in the same project because I want them to share the same infrastructure-related beans. Everything is configured in Java. I would like to know if there is a proper way to start the jobs independently, based for example on the first application argument passed to the main method. If I just call SpringApplication.run, only the second job gets executed, as if by magic.
The main method looks like:
@ComponentScan
@EnableAutoConfiguration
public class Application {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(Application.class);
        app.setWebEnvironment(false);
        ApplicationContext ctx = app.run(args);
    }
}
and the two jobs are configured as presented in the Spring Batch Getting Started tutorial on Spring.io. Here is the configuration file of the first job, the second being configured in the same way.
@Configuration
@EnableBatchProcessing
@Import({StandaloneInfrastructureConfiguration.class, ServicesConfiguration.class})
public class AddPodcastJobConfiguration {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    //reader, writer, processor...
}
To enable modularization I created an AppConfig class, where I define factories for the two jobs:
@Configuration
@EnableBatchProcessing(modular=true)
public class AppConfig {

    @Bean
    public ApplicationContextFactory addNewPodcastJobs() {
        return new GenericApplicationContextFactory(AddPodcastJobConfiguration.class);
    }

    @Bean
    public ApplicationContextFactory newEpisodesNotificationJobs() {
        return new GenericApplicationContextFactory(NotifySubscribersJobConfiguration.class);
    }
}
P.S. I am new to Spring configuration in Java configuration Spring Boot and Spring Batch...
Just set the spring.batch.job.names=myJob property. You can set it as a system property when you launch your application (-Dspring.batch.job.names=myJob). If this property is defined, the spring-batch starter will only launch the jobs it names.
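For instance (a hypothetical invocation, assuming an executable Boot jar named app.jar and a job bean named addNewPodcastJob), either of these would restrict startup to one job:

```shell
# As a JVM system property:
java -Dspring.batch.job.names=addNewPodcastJob -jar app.jar

# Or as a command-line argument, which Spring Boot maps to the same property:
java -jar app.jar --spring.batch.job.names=addNewPodcastJob
```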
To run whichever job you like from the main method, you can load the required job configuration bean and the JobLauncher from the application context and then run it:
@ComponentScan
@EnableAutoConfiguration
public class ApplicationWithJobLauncher {

    public static void main(String[] args) throws BeansException, JobExecutionAlreadyRunningException,
            JobRestartException, JobInstanceAlreadyCompleteException, JobParametersInvalidException, InterruptedException {
        SpringApplication app = new SpringApplication(ApplicationWithJobLauncher.class);
        app.setWebEnvironment(false);
        ConfigurableApplicationContext ctx = app.run(args);

        JobLauncher jobLauncher = ctx.getBean(JobLauncher.class);
        JobParameters jobParameters = new JobParametersBuilder()
                .addDate("date", new Date())
                .toJobParameters();

        if ("1".equals(args[0])) {
            // addNewPodcastJob
            Job addNewPodcastJob = ctx.getBean("addNewPodcastJob", Job.class);
            jobLauncher.run(addNewPodcastJob, jobParameters);
        } else {
            jobLauncher.run(ctx.getBean("newEpisodesNotificationJob", Job.class), jobParameters);
        }
        System.exit(0);
    }
}
What was causing me a lot of confusion was that the second job was executed even though the first job seemed to be "picked up" by the runner. The problem was that in both jobs' configuration files I had used the standard method names writer(), reader(), processor() and step(), and the beans from the second job silently "overwrote" the ones from the first job without any warning.
I had also used an application config class with @EnableBatchProcessing(modular=true) (the same AppConfig shown above), which I thought would be picked up magically by Spring Boot.
I will write a blog post about it when it is ready, but until then the code is available at https://github.com/podcastpedia/podcastpedia-batch (work/learning in progress).
There is also CommandLineJobRunner, which may be helpful. From its javadoc:
Basic launcher for starting jobs from the command line
Spring Batch auto configuration is enabled by adding #EnableBatchProcessing (from Spring Batch) somewhere in your context. By default it executes all Jobs in the application context on startup (see JobLauncherCommandLineRunner for details). You can narrow down to a specific job or jobs by specifying spring.batch.job.names (comma separated job name patterns).
-- Spring Boot Doc
Or disable the automatic execution and run the jobs programmatically from the context using a JobLauncher, based on the args passed to the main method.