I have a Job running on startup. I want to run this job programmatically at a particular point of my application, not when I start my app.
When running on startup I have no problem, but I got a "NoSuchJobException" (No job configuration with the name [importCityFileJob] was registered) when I try to run it programmatically.
After looking on the web, I think it's a problem related to JobRegistry, but I don't know how to solve it.
Note: my whole batch configuration is set up programmatically; I don't use any XML file to configure my batch and my job. That's a big part of my problem, since most examples I can find don't cover this setup...
Here is my code to run the Job:
public String runBatch() {
    try {
        JobLauncher launcher = new SimpleJobLauncher();
        JobLocator locator = new MapJobRegistry();
        Job job = locator.getJob("importCityFileJob");
        JobParameters jobParameters = new JobParameters(); // ... ?
        launcher.run(job, jobParameters);
    } catch (Exception e) {
        e.printStackTrace();
        System.out.println("Something went wrong");
    }
    return "Job is running";
}
My Job declaration:
@Bean
public Job importCityFileJob(JobBuilderFactory jobs, Step step) {
    return jobs.get("importFileJob").incrementer(new RunIdIncrementer()).flow(step).end().build();
}
(I tried to replace importCityFileJob with importFileJob in my runBatch method, but it didn't work.)
My BatchConfiguration file contains the job declaration above, a step declaration, the itemReader/itemWriter/itemProcessor, and that's all.
I use the @EnableBatchProcessing annotation.
I'm new to Spring Batch & I'm stuck on this problem. Any help would be welcome.
Thanks
Edit: I've solved my problem. I wrote my solution below.
Here is what I had to do to fix my problem:
Add the following bean to the BatchConfiguration:
@Bean
public JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor(JobRegistry jobRegistry) {
    JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor = new JobRegistryBeanPostProcessor();
    jobRegistryBeanPostProcessor.setJobRegistry(jobRegistry);
    return jobRegistryBeanPostProcessor;
}
Replace the JobLocator with an @Autowired JobRegistry, and use the @Autowired JobLauncher instead of creating one. My run method now has the following code:
@Autowired
private JobRegistry jobRegistry;

@Autowired
private JobLauncher launcher;

public String runBatch() {
    try {
        Job job = jobRegistry.getJob("importCityFileJob");
        JobParameters jobParameters = new JobParameters();
        launcher.run(job, jobParameters);
    } catch (Exception e) {
        e.printStackTrace();
        System.out.println("Something went wrong");
    }
    return "OK";
}
I hope it will help someone.
A JobRegistry won't populate itself. In your example, you're creating a new instance, then trying to get the job from it without having registered the job in the first place. Typically, the JobRegistry is configured as a bean along with an AutomaticJobRegistrar that will load all jobs into the registry on startup. That doesn't mean they will be executed, just registered so they can be located later.
If you're using Java configuration, this should happen automatically using the #EnableBatchProcessing annotation. With that annotation, you'd just inject the provided JobRegistry and the jobs should already be there.
You can read more about the #EnableBatchProcessing in the documentation here: http://docs.spring.io/spring-batch/apidocs/org/springframework/batch/core/configuration/annotation/EnableBatchProcessing.html
You can also read about the AutomaticJobRegistrar in the documentation here: http://docs.spring.io/spring-batch/apidocs/org/springframework/batch/core/configuration/support/AutomaticJobRegistrar.html
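To illustrate, a minimal sketch of that injection (names are taken from the question above; this assumes the registry has actually been populated at startup as described):

@Autowired
private JobRegistry jobRegistry;

@Autowired
private JobLauncher jobLauncher;

public void launchImportCityFileJob() throws Exception {
    // Look the job up in the framework-provided registry instead of
    // instantiating a fresh, empty MapJobRegistry
    Job job = jobRegistry.getJob("importCityFileJob");
    jobLauncher.run(job, new JobParameters());
}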
I could not find the correct answer on this page. In my case the Spring Batch jobs were configured in a different configuration class, not annotated with @EnableBatchProcessing. In that case you need to add the Job to the JobRegistry yourself:
import org.springframework.batch.core.Job;
import org.springframework.batch.core.configuration.DuplicateJobException;
import org.springframework.batch.core.configuration.JobRegistry;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.support.ReferenceJobFactory;
import org.springframework.batch.core.job.flow.Flow;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MyBatchJobConfigurations {

    @Bean
    public Job myCountBatchJob(final JobBuilderFactory jobFactory, final JobRegistry jobRegistry, final Flow myJobFlow)
            throws DuplicateJobException {
        final Job countJob = jobFactory.get("myCountBatchJob")
                .start(myJobFlow)
                .end().build();
        ReferenceJobFactory referenceJobFactory = new ReferenceJobFactory(countJob);
        jobRegistry.register(referenceJobFactory);
        return countJob;
    }
}
Adding the following bean in the applicationContext.xml resolved the problem for me
<bean class="org.springframework.batch.core.configuration.support.JobRegistryBeanPostProcessor">
    <property name="jobRegistry" ref="jobRegistry" />
</bean>
I also have this entry in the applicationContext.xml
<bean id="jobRegistry"
class="org.springframework.batch.core.configuration.support.MapJobRegistry" />
Another solution:
rename the method "importCityFileJob" to "job":
@Bean
public Job job(JobBuilderFactory jobs, Step step) {
    return jobs.get("importFileJob").incrementer(new RunIdIncrementer()).flow(step).end().build();
}
@EnableBatchProcessing
@Configuration
public class SpringBatchCommon {

    @Bean
    public JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor(JobRegistry jobRegistry) {
        JobRegistryBeanPostProcessor postProcessor = new JobRegistryBeanPostProcessor();
        postProcessor.setJobRegistry(jobRegistry);
        return postProcessor;
    }
}
Set the JobRegistry in the JobRegistryBeanPostProcessor; after that, you can autowire the JobLauncher and the JobLocator:
Job job = jobLocator.getJob("importFileJob");
JobParametersBuilder jobBuilder = new JobParametersBuilder();
// set any parameters if required
jobLauncher.run(job, jobBuilder.toJobParameters());
Related
I need your help please:
I have a Spring Batch app that is running perfectly, with the main job and step as shown below:
@Bean
public Job JobFinal(Step step1) {
    return jobBuilderFactory
            .get("JobFinal")
            .incrementer(new RunIdIncrementer())
            .start(step1)
            .build();
}

@Bean
public Step step1() {
    return stepBuilderFactory.get("step1").<A, B>chunk(2)
            .reader(readerDB())
            .processor(process())
            .writer(writerCS())
            .build();
}
This job is configured in a class "BatchConfig".
And here is my main class:
public class DemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
I want to add a Quartz configuration to run the job every day at midnight.
I couldn't find a helpful tutorial explaining how to configure Quartz in my case, or in which class exactly!
Thank you for your help :)
You would need to use a SchedulerFactory to automatically trigger the jobs using Quartz, like below:
// static imports assumed: org.quartz.JobBuilder.newJob,
// org.quartz.TriggerBuilder.newTrigger, org.quartz.CronScheduleBuilder.cronSchedule
SchedulerFactory sf = new StdSchedulerFactory();
Scheduler sche = sf.getScheduler();
// myclass must implement org.quartz.Job
JobDetail job = newJob(myclass.class).withIdentity("myid", "myname").build();
// Quartz cron expressions take six fields; "0 0 0 * * ?" fires every day at midnight
CronTrigger trigger = newTrigger().withIdentity("mytriggerid", "myname").withSchedule(cronSchedule("0 0 0 * * ?"))
        .build();
sche.scheduleJob(job, trigger);
sche.start();
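Since myclass has to implement Quartz's org.quartz.Job, here is a rough sketch of what it might look like when it delegates to the Spring Batch launcher. This class is an assumption, not part of the original answer, and the autowiring only works if the scheduler uses a Spring-aware job factory (for example SpringBeanJobFactory); otherwise the beans have to be fetched from the scheduler context instead.

import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;

public class BatchLaunchingQuartzJob implements org.quartz.Job {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private org.springframework.batch.core.Job jobFinal; // the Spring Batch job from the question

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        try {
            // A unique parameter so each midnight run creates a new JobInstance
            jobLauncher.run(jobFinal, new JobParametersBuilder()
                    .addLong("runTime", System.currentTimeMillis())
                    .toJobParameters());
        } catch (Exception e) {
            throw new JobExecutionException(e);
        }
    }
}

This would then be the class passed to newJob(...) in the scheduling code above.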
Official documentation:
http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html
I have two jobs. I am trying to test a single job.
This is what I am trying:
@Autowired
private JobLauncherTestUtils jobLauncherTestUtils;

@Autowired
@Qualifier("jobNumber1")
private Job job;

@Test
public void test() {
    try {
        jobLauncherTestUtils
                .getJobLauncher()
                .run(job, new JobParametersBuilder()
                        .addString("--spring.batch.job.names", "jobNumber1")
                        .toJobParameters());
    } catch (Exception e) {
        e.printStackTrace();
    }
}
But when I look at the logs, it is running both jobs. How do I make it run only one job? Thanks
I have also tried to set the Job in the JobLauncherTestUtils:
@Bean
public JobLauncherTestUtils jobLauncherTestUtils() throws Exception {
    return new JobLauncherTestUtils() {
        @Override
        @Autowired
        public void setJob(@Qualifier("jobNumber1") Job job) {
            super.setJob(job);
        }
    };
}
and call jobLauncherTestUtils.launchJob(). Still, both jobs are running.
You are passing a Spring Boot parameter (--spring.batch.job.names) as a Spring Batch parameter. So Spring Boot is not aware of it and will still run both jobs. You need to either:
pass the --spring.batch.job.names=jobNumber1 to the command line you are using to test your job
or add the spring.batch.job.names=jobNumber1 in the application.properties file of your test resources (see the sketch below)
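As an illustration of the second option, a hedged sketch using a test-level property override instead of a separate properties file (the test class name is made up; the job name comes from the question):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.batch.test.JobLauncherTestUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

// Setting the Boot property at test level keeps the runner from
// launching every job on context startup, so only jobNumber1 runs.
@RunWith(SpringRunner.class)
@SpringBootTest(properties = "spring.batch.job.names=jobNumber1")
public class JobNumber1Test {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void runsOnlyJobNumber1() throws Exception {
        jobLauncherTestUtils.launchJob();
    }
}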
Hope this helps.
I originally asked the question "Getting "Scope 'step' is not active for the current thread" while creating spring batch beans", and based on the suggestion provided in "Spring batch scope issue while using spring boot", I tried to replace the @StepScope annotation and instead defined a StepScope bean in the configuration, as below:
@Bean
@Qualifier("stepScope")
public org.springframework.batch.core.scope.StepScope stepScope() {
    final org.springframework.batch.core.scope.StepScope stepScope = new org.springframework.batch.core.scope.StepScope();
    stepScope.setAutoProxy(true);
    return stepScope;
}
With this change, I'm not able to pass job parameters while creating beans, as it throws:
'jobParameters' cannot be found on object of type 'org.springframework.beans.factory.config.BeanExpressionContext'
My configuration looks like:
@Configuration
@EnableBatchProcessing
public class MyConfig {

    @Bean
    @Qualifier("partitionJob")
    public Job partitionJob() throws Exception {
        return jobBuilderFactory
                .get("partitionJob")
                .incrementer(new RunIdIncrementer())
                .start(partitionStep(null))
                .build();
    }

    @Bean
    @StepScope
    public Step partitionStep(
            @Value("#{jobParameters[gridSize]}") String gridSize)
            throws Exception {
        return stepBuilderFactory
                .get("partitionStep")
                .partitioner("slaveStep", partitioner())
                // gridSize arrives as a String job parameter
                .gridSize(Integer.parseInt(gridSize))
                .step(slaveStep(chunkSize))
                .taskExecutor(threadPoolTaskExecutor()).build();
    }

    @Bean
    @StepScope
    public Step slaveStep(int chunkSize) throws Exception {
        return stepBuilderFactory
                .get("slaveStep")
                ......
                ........
    }
I read that the bean should be annotated with @StepScope if job parameters need to be accessed, as in my example. But I'm getting the exception explained above.
This is how you can pass job parameters:
GenericApplicationContext context = new AnnotationConfigApplicationContext(MyConfiguration.class);
JobLauncher jobLauncher = (JobLauncher) context.getBean("jobLauncher");
JobParameters jobParameters = new JobParametersBuilder()
        .addString("parameter1", "Test")
        .toJobParameters();
Job job = (Job) context.getBean("myJob");
JobExecution execution = jobLauncher.run(job, jobParameters);
Accessing the job parameters:

@Bean
@StepScope
public Step partitionStep(
        @Value("#{jobParameters['parameter1']}") String gridSize)
        throws Exception {
    return stepBuilderFactory
            .get("partitionStep")
            .partitioner("slaveStep", partitioner())
            .gridSize(Integer.parseInt(gridSize))
            .step(slaveStep(chunkSize))
            .taskExecutor(threadPoolTaskExecutor()).build();
}
I am trying to run my batch job from a controller. It will be either fired up by a cron job or by accessing a specific link.
I am using Spring Boot, no XML just annotations.
In my current setting I have a service that contains the following beans:
@EnableBatchProcessing
@PersistenceContext
public class batchService {

    @Bean
    public ItemReader<Somemodel> reader() {
        ...
    }

    @Bean
    public ItemProcessor<Somemodel, Somemodel> processor() {
        return new SomemodelProcessor();
    }

    @Bean
    public ItemWriter writer() {
        return new CustomItemWriter();
    }

    @Bean
    public Job importUserJob(JobBuilderFactory jobs, Step step1) {
        return jobs.get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .flow(step1)
                .end()
                .build();
    }

    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory,
                      ItemReader<Somemodel> reader,
                      ItemWriter<Somemodel> writer,
                      ItemProcessor<Somemodel, Somemodel> processor) {
        return stepBuilderFactory.get("step1")
                .<Somemodel, Somemodel>chunk(100)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }
}
As soon as I put the @Configuration annotation on top of my batchService class, the job starts as soon as I run the application. It finishes successfully; everything is fine. Now I am trying to remove the @Configuration annotation and run the job whenever I want. Is there a way to fire it from the controller?
Thanks!
You need to create an application.yml file in src/main/resources and add the following configuration:
spring.batch.job.enabled: false
With this change, the batch job will not automatically execute on Spring Boot startup. The batch job will instead be triggered when the specific link is accessed.
Check out my sample code here:
https://github.com/pauldeng/aws-elastic-beanstalk-worker-spring-boot-spring-batch-template
You can launch a batch job programmatically using JobLauncher which can be injected into your controller. See the Spring Batch documentation for more details, including this example controller:
@Controller
public class JobLauncherController {

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    Job job;

    @RequestMapping("/jobLauncher.html")
    public void handle() throws Exception {
        jobLauncher.run(job, new JobParameters());
    }
}
Since you're using Spring Boot, you should leave the #Configuration annotation in there and instead configure your application.properties to not launch the jobs on startup. You can read more about the autoconfiguration options for running jobs at startup (or not) in the Spring Boot documentation here: http://docs.spring.io/spring-boot/docs/current-SNAPSHOT/reference/htmlsingle/#howto-execute-spring-batch-jobs-on-startup
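For reference, the one-line setting this refers to (the standard Spring Boot property, not something specific to this answer) would look like this in application.properties:

# keep Spring Boot from running all jobs at startup
spring.batch.job.enabled=false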
I have two independent Spring Batch jobs in the same project because I want to use the same infrastructure-related beans. Everything is configured in Java. I would like to know if there's a proper way to start the jobs independently, based for example on the first Java application argument in the main method. If I run SpringApplication.run, only the second job gets executed, as if by magic.
The main method looks like:
@ComponentScan
@EnableAutoConfiguration
public class Application {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(Application.class);
        app.setWebEnvironment(false);
        ApplicationContext ctx = app.run(args);
    }
}
and the two jobs are configured as presented in the Spring Batch Getting Started tutorial on Spring.io. Here is the configuration file of the first job, the second being configured in the same way.
@Configuration
@EnableBatchProcessing
@Import({StandaloneInfrastructureConfiguration.class, ServicesConfiguration.class})
public class AddPodcastJobConfiguration {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    //reader, writer, processor...
}
To enable modularization I created an AppConfig class, where I define factories for the two jobs:
@Configuration
@EnableBatchProcessing(modular=true)
public class AppConfig {

    @Bean
    public ApplicationContextFactory addNewPodcastJobs() {
        return new GenericApplicationContextFactory(AddPodcastJobConfiguration.class);
    }

    @Bean
    public ApplicationContextFactory newEpisodesNotificationJobs() {
        return new GenericApplicationContextFactory(NotifySubscribersJobConfiguration.class);
    }
}
P.S. I am new to Java-based Spring configuration, Spring Boot, and Spring Batch...
Just set the "spring.batch.job.names=myJob" property. You could set it as a system property when you launch your application (-Dspring.batch.job.names=myJob). If you have defined this property, the spring-batch starter will only launch the jobs defined by it.
To run the jobs you like from the main method, you can load the required job configuration bean and the JobLauncher from the application context and then run it:
@ComponentScan
@EnableAutoConfiguration
public class ApplicationWithJobLauncher {

    public static void main(String[] args) throws BeansException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException, JobParametersInvalidException, InterruptedException {
        Log log = LogFactory.getLog(ApplicationWithJobLauncher.class);
        SpringApplication app = new SpringApplication(ApplicationWithJobLauncher.class);
        app.setWebEnvironment(false);
        ConfigurableApplicationContext ctx = app.run(args);
        JobLauncher jobLauncher = ctx.getBean(JobLauncher.class);
        JobParameters jobParameters = new JobParametersBuilder()
                .addDate("date", new Date())
                .toJobParameters();
        if ("1".equals(args[0])) {
            // addNewPodcastJob
            Job addNewPodcastJob = ctx.getBean("addNewPodcastJob", Job.class);
            JobExecution jobExecution = jobLauncher.run(addNewPodcastJob, jobParameters);
        } else {
            jobLauncher.run(ctx.getBean("newEpisodesNotificationJob", Job.class), jobParameters);
        }
        System.exit(0);
    }
}
What was causing a lot of my confusion was that the second job was executed, even though the first job seemed to be "picked up" by the runner... The problem was that in both jobs' configuration files I used the standard method names writer(), reader(), processor() and step(), so the beans from the second job silently "overwrote" the ones from the first job without any warnings...
I did use an application config class with @EnableBatchProcessing(modular=true), which I thought Spring Boot would pick up magically:
@Configuration
@EnableBatchProcessing(modular=true)
public class AppConfig {

    @Bean
    public ApplicationContextFactory addNewPodcastJobs() {
        return new GenericApplicationContextFactory(AddPodcastJobConfiguration.class);
    }

    @Bean
    public ApplicationContextFactory newEpisodesNotificationJobs() {
        return new GenericApplicationContextFactory(NotifySubscribersJobConfiguration.class);
    }
}
I will write a blog post about it when it is ready, but until then the code is available at https://github.com/podcastpedia/podcastpedia-batch (work/learning in progress).
There is also the CommandLineJobRunner, which may be helpful.
From its javadoc:
Basic launcher for starting jobs from the command line
Spring Batch auto configuration is enabled by adding #EnableBatchProcessing (from Spring Batch) somewhere in your context. By default it executes all Jobs in the application context on startup (see JobLauncherCommandLineRunner for details). You can narrow down to a specific job or jobs by specifying spring.batch.job.names (comma separated job name patterns).
-- Spring Boot Doc
Or disable the auto execution and run the jobs programmatically from the context using a JobLauncher, based on the args passed to the main method.