Spring @Scheduled in multiple classes, but the scheduler only calls a single class - spring-boot

I have multiple classes (1, 2, 3, 4) annotated with @Scheduled, each with a different time specified for its task/process to run. I have created a config class with all the scheduler beans, but every time the call goes only to class 1.
Is there anything I need to add to the config class beyond what is shown below?
@Configuration
@EnableBatchProcessing
@ComponentScan(basePackages = "com.something.too.foo")
public class MyConfig {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Bean
    public MyTask1 myTask1() {
        return new MyTask1();
    }

    @Bean
    public MyTask2 myTask2() {
        return new MyTask2();
    }

    @Bean
    public MyTask3 myTask3() {
        return new MyTask3();
    }

    @Bean
    public MyTask4 myTask4() {
        return new MyTask4();
    }
}
I'm not sure what needs to be added to run all the tasks in parallel. Help is very much appreciated.
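A minimal sketch of one common cause and fix, assuming the task classes themselves are correctly annotated (the class and bean names below are illustrative): out of the box, Spring runs every @Scheduled method on a single-threaded scheduler, so tasks can block one another. Enabling scheduling and registering a ThreadPoolTaskScheduler with a larger pool lets the tasks run in parallel.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.TaskScheduler;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler;

@Configuration
@EnableScheduling // without this, @Scheduled methods are never triggered
public class SchedulingConfig {

    // Default behaviour is a single-threaded scheduler shared by all
    // @Scheduled methods; a pool allows the tasks to overlap.
    @Bean
    public TaskScheduler taskScheduler() {
        ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
        scheduler.setPoolSize(4); // roughly one thread per task class
        scheduler.setThreadNamePrefix("my-task-");
        return scheduler;
    }
}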

Related

How to load Spring Batch Job using JobLauncherTestUtils for tests

I want to test a job that I used to load with SpringBootTest and SpringJUnit4Runner. Since upgrading to JUnit 5, the jobLauncherTestUtils class no longer loads. The project is a Spring Batch application using Spring Boot 2.2.0.RELEASE. My main configuration is called AppConfig, and I have configured the step and job as beans that I can autowire in the test class. However, the application context, which used to load, no longer does. The error tells me the job is not added to the jobLauncherTestUtils. I do not understand why the job can no longer be loaded when it could before. I'd appreciate some help in fixing this issue.
src/main/com/indigo/search/config/AppConfig
@Bean
public Step orderIntakeStep() {
    return stepBuilderFactory.get("orderIntakeStep")
            .<Order, Order>chunk(30)
            .reader(orderReader())
            .processor(orderProcessor())
            .writer(orderWriter())
            .build();
}

@Bean(name = "orderIntakeJob")
public Job orderIntakeJob() {
    return jobBuilderFactory.get("orderIntakeJob")
            .incrementer(new RunIdIncrementer())
            .flow(orderIntakeStep())
            .end()
            .build();
}
...
}
@ExtendWith(SpringExtension.class)
@SpringBatchTest
@Transactional(propagation = Propagation.NOT_SUPPORTED)
class OrderIntakeJobTest {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Autowired
    private JobRepositoryTestUtils jobRepositoryTestUtils;

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    Job orderIntakeJob;

    ...

    @Before
    public void initJob() {
        jobLauncherTestUtils.setJobLauncher(jobLauncher);
        jobLauncherTestUtils.setJobRepository(jobRepository);
        jobLauncherTestUtils.setJob(orderIntakeJob);
    }
}
From what you shared, nothing in OrderIntakeJobTest imports a configuration class that contains the job under test. You should have something like:
@ExtendWith(SpringExtension.class)
@SpringBatchTest
@Transactional(propagation = Propagation.NOT_SUPPORTED)
@ContextConfiguration(classes = MyJobConfiguration.class) // this is where the job under test is defined
class OrderIntakeJobTest {
    // ...
}
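Once the context loads, a typical test method can then launch the job through the injected JobLauncherTestUtils and assert on the exit status. A small sketch, assuming the wiring above:

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.JobExecution;

@Test
void orderIntakeJobCompletes() throws Exception {
    // launchJob() runs the job set on jobLauncherTestUtils with unique parameters
    JobExecution jobExecution = jobLauncherTestUtils.launchJob();

    assertEquals(ExitStatus.COMPLETED, jobExecution.getExitStatus());
}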

When I add Spring Batch configuration I get an error

In my project I use multiple schemas (multiple data sources).
When I add the Spring Batch configuration I get this error: No qualifying bean of type 'org.springframework.transaction.PlatformTransactionManager' available: expected single matching bean but found 5
When I remove the Spring Batch configuration, the error goes away.
@Configuration
@EnableBatchProcessing
@Import(MyDataSourceClassConfig.class)
public class TestBatchJobConfiguration extends DefaultBatchConfigurer {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    ....
}
If you are facing the same problem, you need to check two points.
First, do not declare a transaction manager bean named transactionManager (that is the default one used by Spring Batch).
Second, override getTransactionManager to specify which transaction manager you want to use, and override setDataSource to specify which DataSource you want to use:
@Autowired
@Qualifier("myPersonalTransactionManager")
private PlatformTransactionManager transactionManager;

@Override
public PlatformTransactionManager getTransactionManager() {
    return transactionManager;
}

@Override
@Autowired
public void setDataSource(@Qualifier("thirdDataSource") DataSource batchDataSource) {
    super.setDataSource(batchDataSource);
}
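Put together, a sketch of the whole configurer could look like this (the bean names myPersonalTransactionManager and thirdDataSource are placeholders carried over from the snippets above):

@Configuration
@EnableBatchProcessing
@Import(MyDataSourceClassConfig.class)
public class TestBatchJobConfiguration extends DefaultBatchConfigurer {

    @Autowired
    @Qualifier("myPersonalTransactionManager")
    private PlatformTransactionManager transactionManager;

    // Spring Batch asks the BatchConfigurer for its transaction manager,
    // so returning the qualified bean avoids the "found 5" ambiguity.
    @Override
    public PlatformTransactionManager getTransactionManager() {
        return transactionManager;
    }

    // Point the batch metadata (JobRepository) at one specific DataSource.
    @Override
    @Autowired
    public void setDataSource(@Qualifier("thirdDataSource") DataSource batchDataSource) {
        super.setDataSource(batchDataSource);
    }
}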

Spring JavaConfig + WebMvcConfigurerAdapter + @Autowired => NPE

I have an application with two contexts: a parent for web-agnostic business logic and a child context (implicitly created by the dispatcher servlet) for web logic.
My setup looks like:
@Configuration
public class BusinessConfig {
    @Bean
    public ObjectMapper jacksonMapper() { return new ObjectMapper(); }
}
and
@Configuration
public class WebConfig extends WebMvcConfigurerAdapter {

    @Autowired
    private ObjectMapper objectMapper; // <- is null for some reason

    @Override
    public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
        MappingJackson2HttpMessageConverter converter = new MappingJackson2HttpMessageConverter();
        converter.setObjectMapper(objectMapper); // <- bang!
        converters.add(converter);
    }
}
I need the object mapper in the parent context, as I also use it in the security configuration. But can someone explain to me why the @Autowired objectMapper is null? It is created in the parent context (the fact that the parent exists is even logged by Spring at startup). Also, @Autowired has required=true by default, so it should not blow up in the configure method (it should have blown up during construction of the context if the bean wasn't there for some reason).
It seems to me that there might be a lifecycle problem in Spring, in the sense that it calls the overridden methods first and then @Autowires the dependencies. I have also tried to @Autowire the BusinessConfig (which should be perfectly legal according to the documentation); the result was the same (null).
What should I do to make this work?
Thanks in advance!
EDIT - ISSUE FOUND
I found the issue. Unfortunately it had nothing to do with WebMvcConfigurerAdapter or @Configuration. It was caused by premature initialization of the context, triggered by a missing static modifier on the propertyPlaceholderConfigurer bean method... I have created an issue in the Spring core JIRA (https://jira.spring.io/browse/SPR-14382).
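For reference, the static modifier the edit refers to would look roughly like this; a sketch only, since the original placeholder bean is not shown in the question:

@Configuration
public class BusinessConfig {

    // BeanFactoryPostProcessors such as the placeholder configurer should be
    // declared static, so that registering them does not force the enclosing
    // @Configuration class (and its dependencies) to be initialized too early.
    @Bean
    public static PropertySourcesPlaceholderConfigurer propertyPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}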
What about simply renaming the bean declaration method to match the name of the autowired field?
@Configuration
public class BusinessConfig {
    @Bean
    public ObjectMapper objectMapper() { return new ObjectMapper(); }
}

@Configuration
public class WebConfig extends WebMvcConfigurerAdapter {

    @Autowired
    private ObjectMapper objectMapper;

    [...]
}

Spring java config: bean config after component scan

I have the following configuration:
@Configuration
@ComponentScan("com.xyz.svc")
public class SvcConfig {

    @Autowired private Filter filter1;
    @Autowired private Filter filter2;
    @Autowired private Filter filter3;

    @Bean
    public List<Filter> filters() {
        // Filters are added in the desired order of execution
        return ImmutableList.of(
                filter1,
                filter2,
                filter3);
    }
}
When the filters() method is run, all the components it depends on (i.e. filter1, filter2, filter3) are null. These components are registered through @ComponentScan. The problem is that the filters() method is getting executed before @ComponentScan.
How do I make this work?
Basically, you can't, reliably. A @Configuration class is a @Component that is meant to register bean definitions through @Bean annotated methods. If a request for a bean (handled through a @Bean method) comes in before the BeanPostProcessor that handles @Autowired, then you will see the behavior you are describing.
Note that the following will cause you problems, as Spring won't know which to inject.
@Autowired
private Filter filter1;
@Autowired
private Filter filter2;
@Autowired
private Filter filter3;
Assuming this was just an example, you could refactor so that instead of having @Component classes for these filters, you declare @Bean methods for them.
@Bean
public Filter filter1() {
    return new FilterImpl1();
}

@Bean
public Filter filter2() {
    return new FilterImpl2();
}

@Bean
public Filter filter3() {
    return new FilterImpl3();
}
You can then use these beans in your other @Bean method:
@Bean
public List<Filter> filters() {
    // Filters are added in the desired order of execution
    return ImmutableList.of(
            filter1(),
            filter2(),
            filter3());
}
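If the filters really must stay as component-scanned @Component classes, an alternative sketch (an assumption, since the filter classes themselves are not shown) is to let Spring hand them to the @Bean method as parameters instead of autowiring them into fields; method parameters are resolved at the moment the @Bean method is invoked, so they are not affected by the timing problem described above. The qualifier names are hypothetical.

@Bean
public List<Filter> filters(@Qualifier("filter1") Filter filter1,
                            @Qualifier("filter2") Filter filter2,
                            @Qualifier("filter3") Filter filter3) {
    // Filters are added in the desired order of execution
    return ImmutableList.of(filter1, filter2, filter3);
}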

How to configure Services with Spring Data JPA repositories with spring Java Configuration

I am trying to figure out how to get hold of the OrderRepository so that I can pass it into the constructor of OrderServiceImpl using Spring's Java configuration (I already know how to do it with XML configuration).
@Configuration
@ComponentScan(basePackages = "com.sample.app")
@EnableJpaRepositories("com.sample.app")
@EnableTransactionManagement
public class AppConfig
{
    @Bean
    public OrderService orderService()
    {
        return new OrderServiceImpl(orderRepository());
    }

    @Bean
    public OrderRepository orderRepository()
    {
        return ??? What goes here ???
    }
    ...
}
@Configuration
@ComponentScan(basePackages = "com.sample.app")
@EnableJpaRepositories("com.sample.app")
@EnableTransactionManagement
public class AppConfig {

    @Autowired
    private OrderRepository orderRepository;

    @Bean
    public OrderService orderService() {
        return new OrderServiceImpl(orderRepository);
    }
}
Something like that should work. Or simply put a field annotated with @Autowired inside your OrderServiceImpl and remove the constructor that takes an OrderRepository. Or rely on component scanning and remove the @Bean methods altogether.
Because you have both component scanning and a @Bean method, you might end up with duplicate instances of your service (if it is annotated with @Service).
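A sketch of the component-scanning-only variant mentioned above, assuming OrderServiceImpl sits under the scanned com.sample.app package: drop the @Bean methods and let the repository be injected through the constructor.

@Service
public class OrderServiceImpl implements OrderService {

    private final OrderRepository orderRepository;

    // Spring Data JPA provides the OrderRepository proxy via
    // @EnableJpaRepositories, and component scanning wires it in here.
    @Autowired
    public OrderServiceImpl(OrderRepository orderRepository) {
        this.orderRepository = orderRepository;
    }
}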
