I have created a batch application which does chunk processing. I am creating chunks using a completion policy.
Following is my batch configuration (keeping the code minimal, please let me know if you need other information):
@Bean
public Job myJob() {
    ItemReader itemReader = itemReader();
    return jobBuilder.get("job")
            .start(myStep(itemReader, completionPolicyReader(itemReader), writer(), processor()))
            .build();
}
@Bean
public Step myStep(ItemReader itemReader, MyCompletionPolicy completionPolicyReader, ItemWriter writer, ItemProcessor processor) {
    return stepBuilder.get("step")
            .chunk(completionPolicyReader)
            .reader(completionPolicyReader)
            .processor(processor)
            .writer(writer)
            .listener(itemReader) // registered delegated itemReader as a listener
            .build();
}
@Bean
public MyCompletionPolicy completionPolicyReader(ItemReader itemReader) {
    MyCompletionPolicy obj = new MyCompletionPolicy();
    obj.setDelegate(itemReader);
    return obj;
}
@Bean
public ItemReader itemReader() {
    return abc == xyz ? new AReader() : new BReader();
}
// other config
Following is my MyCompletionPolicy, which delegates to the actual ItemReader, i.e. either AReader or BReader depending on some condition.
class MyCompletionPolicy extends CompletionPolicySupport implements ItemReader<MyModel>, StepExecutionListener {

    private ItemReader<MyModel> itemReader;
    private SingleItemPeekableItemReader<MyModel> delegate;
    private MyModel currentReadItem;

    public void setDelegate(ItemReader<MyModel> itemReader) {
        this.itemReader = itemReader;
        this.delegate = new SingleItemPeekableItemReader<MyModel>();
        this.delegate.setDelegate(itemReader);
    }

    @Override
    public MyModel read() throws Exception {
        currentReadItem = delegate.read(); // Here I am delegating to the actual reader (e.g. AReader) where I cannot get the StepExecution
        return currentReadItem;
    }

    // ... other overridden methods
}
Following is my AReader, where I am not able to get the StepExecution:
class AReader implements ItemReader<MyModel>, StepExecutionListener {

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // stepExecution is NULL
    }

    // ... other overridden methods
}
How can I get the StepExecution in my delegated ItemReader, i.e. in AReader?
======EDIT=====
Sub-question regarding best practices: if I want to increment a counter between chunks, i.e. between multiple calls of the ItemReader, and use the current value of the counter in the ItemReader, is it good practice to create a class field in the ItemReader class, or should I store it in the ExecutionContext?
Considering SingleThread App
Considering MultiThread App
By default, Spring Batch will automatically register your reader/processor/writer as listeners if they implement StepExecutionListener. In your case, the reader is MyCompletionPolicy which implements StepExecutionListener and will be registered as a listener automatically.
However, Spring Batch is not aware that your MyCompletionPolicy delegates to another reader, so you need to explicitly register your delegate as a listener in the step.
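For reference, a minimal sketch of that explicit registration, reusing the bean names from the question (the cast is only needed because the reader parameter is declared with the raw ItemReader type):
@Bean
public Step myStep(ItemReader itemReader, MyCompletionPolicy completionPolicyReader,
                   ItemWriter writer, ItemProcessor processor) {
    return stepBuilder.get("step")
            .chunk(completionPolicyReader)
            .reader(completionPolicyReader)
            .processor(processor)
            .writer(writer)
            // register the delegate itself so Spring Batch calls beforeStep/afterStep on AReader/BReader
            .listener((StepExecutionListener) itemReader)
            .build();
}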
[New to Spring Batch] I have different csv files with different formats, and more csv types can be added in the future, so instead of defining a @Bean for each csv format I thought of having a common FlatFileItemReader<T>. I created a base configuration class and then a concrete class for each csv type.
Since I have defined the reader bean as @StepScope, during batch job runtime it auto-initializes the bean with the first concrete class in the package; the same kind of problem is discussed here but the answer is not relevant to my case.
How do I pass the particular concrete ItemReader type to my step during the job run?
Here is my base configuration class:
public abstract class AbstractBatchItemReader<T> {

    private CsvInformation csvInformation;

    protected AbstractBatchItemReader(CsvInformation csvInformation) {
        this.csvInformation = csvInformation;
    }

    @Bean
    @StepScope
    // fileName is retrieved from jobParameters during runtime
    public FlatFileItemReader<T> getItemReader(@Value("#{jobParameters['input.file.name']}") String fileName) {
        return new FlatFileItemReaderBuilder<T>()
                .name("invoiceHeaderItemReader")
                .resource(new FileSystemResource(fileName))
                .linesToSkip(1)
                .delimited()
                .names(csvInformation.getHeaders().split(","))
                .fieldSetMapper(new BeanWrapperFieldSetMapper<T>() {{
                    setConversionService(new StringToLocalDateConversion().convert());
                    setTargetType(csvInformation.getERPClass());
                }})
                .build();
    }
}
Here is the concrete class that extends the base config:
@Configuration
public class InvoiceHeaderReader extends AbstractBatchItemReader<ERPInvoiceHeader> {

    protected InvoiceHeaderReader(InvoiceHeaderCsvInformation csvInformation) {
        super(csvInformation);
    }
}
Here is my base step config:
public abstract class AbstractBatchStep<T> {

    private final AbstractBatchItemReader<T> reader;
    private final AbstractBatchItemWriter<T> writer;
    private final StepBuilderFactory stepBuilderFactory;

    protected AbstractBatchStep(AbstractBatchItemReader<T> reader,
                                AbstractBatchItemWriter<T> writer,
                                StepBuilderFactory stepBuilderFactory) {
        this.reader = reader;
        this.writer = writer;
        this.stepBuilderFactory = stepBuilderFactory;
    }

    public Step getStep() {
        afterPropertiesSet();
        return stepBuilderFactory.get("Batch Step")
                .<T, T>chunk(BatchConfiguration.READER_CHUNK_SIZE)
                // fileName is passed during runtime
                .reader(reader.getItemReader(null))
                .writer(writer.getItemWriter())
                .build();
    }
}
Here is the concrete class that extends step config:
#Configuration("invoice_header")
public class InvoiceHeaderStep extends AbstractBatchStep<ERPInvoiceHeader> {
protected InvoiceHeaderStep(InvoiceHeaderReader reader, InvoiceHeaderWriter writer, StepBuilderFactory stepBuilderFactory) {
super(reader, writer, stepBuilderFactory);
}
}
The whole job cycle runs only for the first concrete class in the package. If I try to run another type of csv it fails with an "Unexpected token required n found n" exception, which is obviously because the reader was auto-initialized by the first class in the package, not the one that I pass to the step.
Please also suggest whether this design pattern is correct or whether there could be an easier way to achieve this.
I would like to post an answer as a reference for others.
1. I created an AbstractBatchItemReader<T> class which holds the base configuration.
2. Concrete classes extend the base config class, e.g. TypeOneCsvReader extends AbstractBatchItemReader<TypeOneEntity>.
3. An interface with csv information methods, and a class implementing the interface for each csv type.
Here is the code sample:
AbstractBatchItemReader:
public abstract class AbstractBatchItemReader<T> {

    private CsvInformation csvInformation;

    protected AbstractBatchItemReader(CsvInformation csvInformation) {
        this.csvInformation = csvInformation;
    }

    FlatFileItemReader<T> getItemReader() {
        return new FlatFileItemReaderBuilder<T>()
                .name("Batch Reader")
                .resource(resource(null))
                .linesToSkip(1)
                .delimited()
                .quoteCharacter(BatchConfiguration.READER_QUOTE_CHARACTER)
                .names(csvInformation.getHeaders().split(","))
                .fieldSetMapper(new BeanWrapperFieldSetMapper<T>() {{
                    setConversionService(StringToLocalDateConversion.convert());
                    setTargetType(csvInformation.getEntityClass());
                }})
                .build();
    }

    @Bean
    @StepScope
    public Resource resource(@Value("#{jobParameters['input.file.name']}") String fileName) {
        return new FileSystemResource(fileName);
    }
}
Concrete Class:
@Configuration
public class TypeOneCsvReader extends AbstractBatchItemReader<TypeOneEntity> {

    protected TypeOneCsvReader(TypeOneCsv csvInformation) {
        super(csvInformation);
    }
}
CsvInformation Interface:
public interface CsvInformation {
    String getHeaders();
    Class getEntityClass();
}
Each implementation of the interface has to be annotated with @Component so that the concrete reader class picks it up via DI.
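For example, a hypothetical implementation for one csv type could look like this (the header string is just a placeholder):
@Component
public class TypeOneCsv implements CsvInformation {

    @Override
    public String getHeaders() {
        // column names of this csv type, in file order
        return "invoiceNumber,invoiceDate,amount";
    }

    @Override
    public Class getEntityClass() {
        return TypeOneEntity.class;
    }
}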
The benefit of such an approach is that it can be scaled to as many csv types as required, and the reader logic stays in one place.
Thanks
I want to measure the execution time of SQL statements run by MyBatis (Spring Boot project) and bind that with other request parameters, so I can get full information about performance issues regarding specific requests. For that I have used a MyBatis Interceptor in the following way:
@Intercepts({
        @Signature(
                type = Executor.class,
                method = "query",
                args = {MappedStatement.class, Object.class, RowBounds.class, ResultHandler.class, CacheKey.class, BoundSql.class}),
        @Signature(
                type = Executor.class,
                method = "query",
                args = {MappedStatement.class, Object.class, RowBounds.class, ResultHandler.class})
})
public class QueryMetricsMybatisPlugin implements Interceptor {

    @Override
    public Object intercept(Invocation invocation) throws Throwable {
        Stopwatch stopwatch = Stopwatch.createStarted();
        Object result = invocation.proceed();
        stopwatch.stop();
        logExecutionTime(stopwatch, (MappedStatement) invocation.getArgs()[0]);
        return result;
    }
}
Now when it comes to binding with the request, I want to store those metrics in the request as an attribute. I have tried this simple solution to get the request, but it was not working since the request was always null (I have read that this solution won't work in async methods, but with the MyBatis Interceptor and its methods I think that's not the case):
@Autowired
private HttpServletRequest request;
So, the question is: how do I properly get the request within a MyBatis interceptor?
One important note before I answer your question: it is a bad practice to access the UI layer in the DAO layer. This creates a dependency in the wrong direction. Outer layers of your application can access inner layers, but in this case it is the other way round. Instead of this you need to create a class that does not belong to any layer and will (or at least may) be used by all layers of the application. It can be named something like MetricsHolder. The interceptor can store values in it, and in some other place where you planned to get metrics you can read from it (and use them directly, or store them into the request if that place is in the UI layer and the request is available there).
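As an illustration, such a MetricsHolder could be as simple as a thread-local map (just a sketch; the method and key names are made up):
import java.util.HashMap;
import java.util.Map;

public class MetricsHolder {

    private static final ThreadLocal<Map<String, Long>> METRICS =
            ThreadLocal.withInitial(HashMap::new);

    // called from the MyBatis interceptor
    public static void record(String statementId, long elapsedMillis) {
        METRICS.get().put(statementId, elapsedMillis);
    }

    // called from the web layer, e.g. to copy the values into a request attribute
    public static Map<String, Long> snapshot() {
        return new HashMap<>(METRICS.get());
    }

    // call at the end of request processing to avoid leaking values between requests
    public static void clear() {
        METRICS.remove();
    }
}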
But now back to your question. Even if you create something like MetricsHolder, you will still face the problem that you can't inject it into the MyBatis interceptor.
You can't just add a field with the @Autowired annotation to the interceptor and expect it to be set. The reason for this is that the interceptor is instantiated by MyBatis and not by Spring, so Spring does not have a chance to inject dependencies into the interceptor.
One way to handle this is to delegate handling of the interception to a Spring bean that is part of the Spring context and may access other beans there. The problem then is how to make that bean available in the interceptor.
This can be done by storing a reference to such a bean in a thread-local variable. Here's an example of how to do that. First create a registry that will store the Spring bean:
public class QueryInterceptorRegistry {

    private static ThreadLocal<QueryInterceptor> queryInterceptor = new ThreadLocal<>();

    public static QueryInterceptor getQueryInterceptor() {
        return queryInterceptor.get();
    }

    public static void setQueryInterceptor(QueryInterceptor queryInterceptor) {
        QueryInterceptorRegistry.queryInterceptor.set(queryInterceptor);
    }

    public static void clear() {
        queryInterceptor.remove();
    }
}
Query interceptor here is something like:
public interface QueryInterceptor {
    Object interceptQuery(Invocation invocation) throws InvocationTargetException, IllegalAccessException;
}
Then you can create an interceptor that will delegate processing to the Spring bean:
@Intercepts({
        @Signature(type = Executor.class, method = "query", args = { MappedStatement.class, Object.class,
                RowBounds.class, ResultHandler.class }),
        @Signature(type = Executor.class, method = "query", args = { MappedStatement.class, Object.class,
                RowBounds.class, ResultHandler.class, CacheKey.class, BoundSql.class }) })
public class QueryInterceptorPlugin implements Interceptor {

    @Override
    public Object intercept(Invocation invocation) throws Throwable {
        QueryInterceptor interceptor = QueryInterceptorRegistry.getQueryInterceptor();
        if (interceptor == null) {
            return invocation.proceed();
        } else {
            return interceptor.interceptQuery(invocation);
        }
    }

    @Override
    public Object plugin(Object target) {
        return Plugin.wrap(target, this);
    }

    @Override
    public void setProperties(Properties properties) {
    }
}
You need to create an implementation of QueryInterceptor that does what you need and make it a Spring bean (that's where you can access other Spring beans, including the request, which is a no-no as I wrote above):
@Component
public class MyInterceptorDelegate implements QueryInterceptor {

    @Autowired
    private SomeSpringManagedBean someBean;

    @Override
    public Object interceptQuery(Invocation invocation) throws InvocationTargetException, IllegalAccessException {
        // do whatever you did in the mybatis interceptor here
        // but with access to spring beans
        return invocation.proceed();
    }
}
Now the only problem is to set and clean up the delegate in the registry.
I did this via an aspect applied to my service layer methods (but you can do it manually or in a Spring MVC interceptor). My aspect looks like this:
@Aspect
public class SqlSessionCacheCleanerAspect {

    @Autowired
    MyInterceptorDelegate myInterceptorDelegate;

    @Around("some pointcut that describes service methods")
    public Object applyInterceptorDelegate(ProceedingJoinPoint proceedingJoinPoint) throws Throwable {
        QueryInterceptorRegistry.setQueryInterceptor(myInterceptorDelegate);
        try {
            return proceedingJoinPoint.proceed();
        } finally {
            QueryInterceptorRegistry.clear();
        }
    }
}
I'm connecting to PubNub in a Spring Boot application. From the documentation, it's ok to re-use PubNub objects but it's better to have one per thread. What's the appropriate method to store and retrieve one object per thread in Spring Boot?
This is how you'd store and retrieve an object per thread in Spring using ThreadLocal. This example is based on Spring's own ThreadLocalSecurityContextHolderStrategy, which is used to store the SecurityContext per thread.
Also, take a look at InheritableThreadLocal, especially if your code spins up new threads, e.g. via Spring's @Async annotation; it has mechanisms to propagate existing or create new thread-local values when creating child threads.
import org.springframework.util.Assert;

final class ThreadLocalPubNubHolder {

    private static final ThreadLocal<PubNub> contextHolder = new ThreadLocal<PubNub>();

    public void clearContext() {
        contextHolder.remove();
    }

    public PubNub getContext() {
        PubNub ctx = contextHolder.get();
        if (ctx == null) {
            ctx = createEmptyContext();
            contextHolder.set(ctx);
        }
        return ctx;
    }

    public void setContext(PubNub context) {
        Assert.notNull(context, "Only non-null PubNub instances are permitted");
        contextHolder.set(context);
    }

    public PubNub createEmptyContext() {
        // TODO - insert code for creating a new PubNub object here
        return new PubNubImpl();
    }
}
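Usage could then look roughly like this (a sketch; the service class, the way the holder instance is obtained, and the publish logic are all placeholders):
@Service
public class PublishingService {

    // one holder shared by all threads; each thread lazily gets its own PubNub instance from it
    private final ThreadLocalPubNubHolder pubNubHolder = new ThreadLocalPubNubHolder();

    public void publish(String channel, Object message) {
        PubNub pubNub = pubNubHolder.getContext(); // created once per thread on first access
        // ... use pubNub to publish the message ...
    }
}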
You can use Java's ThreadLocal support as mentioned above by @SergeyB. Another way to do it is to use a thread scope for your beans:
@Configuration
public class AppConfig {

    // Register thread scope for your application
    @Bean
    public BeanFactoryPostProcessor beanFactoryPostProcessor() {
        return beanFactory -> beanFactory.registerScope("thread", new SimpleThreadScope());
    }
}
Then you can create a bean with a thread scope (proxy mode will be explained below):
@Scope(value = "thread", proxyMode = ScopedProxyMode.TARGET_CLASS)
@Component
public class PubSubContext {

    private PubSub pubSub;

    public PubSub getPubSub() {
        return pubSub;
    }

    public void setPubSub(PubSub pubSub) {
        this.pubSub = pubSub;
    }

    @PostConstruct
    private void init() {
        // TODO: your code for initializing PubSub object
        log.info("RequiredMessageHeaders started in thread " + Thread.currentThread().getId());
    }

    @PreDestroy
    private void destroy() {
        // TODO: your code for cleaning resources if needed
        log.info("RequiredMessageHeaders destroyed in thread " + Thread.currentThread().getId());
    }
}
The last step is to inject PubSubContext where you need it:
@Controller
public class YourController {

    // Spring will inject here different objects specific for each thread.
    // Note that because we marked PubSubContext with proxyMode = ScopedProxyMode.TARGET_CLASS
    // we do not need to use applicationContext.getBean(PubSubContext.class) to obtain a new bean
    // for each thread - it will be handled by Spring automatically.
    @Autowired
    private PubSubContext pubSubContext;

    @GetMapping
    public String yourMethod() {
        ...
        PubSub pubSub = pubSubContext.getPubSub();
        ...
    }
}
With this approach you could go even further and mark your PubSubContext as @Lazy, so it won't be created until it's requested inside yourMethod:
@Controller
public class YourController {

    @Lazy
    @Autowired
    private PubSubContext pubSubContext;

    ...
}
As you can see, PubSubContext does basically what ThreadLocal does, but leverages Spring's capabilities.
Hope it helps!
First of all:
It is safe to use a single PubNub object from multiple threads.
You need multiple PubNub objects ONLY if you need a performance increase.
If that is your case, my suggestion would be to organize a pool of PubNub objects (the use case is quite close to the DB connection pool use case).
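If you do go down the pooling route, a minimal hand-rolled pool might look like the sketch below (for anything serious a pooling library such as Apache Commons Pool is a better fit; the factory Supplier stands in for your real PubNub construction code):
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Supplier;

public class PubNubPool {

    private final BlockingQueue<PubNub> pool;

    public PubNubPool(int size, Supplier<PubNub> factory) {
        this.pool = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            pool.add(factory.get());
        }
    }

    public PubNub borrow() throws InterruptedException {
        return pool.take(); // blocks until an instance is free
    }

    public void release(PubNub pubNub) {
        pool.offer(pubNub);
    }
}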
I'm currently using Spring Batch to run a job that processes a file, does some stuff on each line and writes the output to another file.
This was developed in a 'core' product but now (as always) we have some client-specific requirements that mandate the inclusion of some extra steps in the job.
I've been able to do a proof of concept where I use common Spring features to 'replace' the job with another one containing the extra steps, either by using distinct names for the jobs (if we define them in the same Configuration class) or by creating a completely distinct Configuration class and loading that as the Spring context.
What I'm asking, and I'm 'almost' there, is whether it's possible to easily define a base job (maybe with an initial step or not) and then only add the steps that make sense for that specific 'client'.
I'm using standard class inheritance to do this, but it doesn't work properly with standard Spring facilities since Spring won't know which implementation of the getSteps method to use (code below).
abstract class JobConfig {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    protected StepBuilderFactory stepBuilderFactory;

    @Bean
    Job job() {
        List<Step> steps = getSteps();
        final JobBuilder jobBuilder = jobBuilderFactory.get("job")
                .incrementer(new RunIdIncrementer());
        SimpleJobBuilder builder = jobBuilder.start(steps.remove(0));
        for (Step s : steps) {
            builder = builder.next(s);
        }
        return builder.build();
    }

    protected abstract List<Step> getSteps();
}
@Configuration
@Import(BaseConfig.class)
public class Client1JobConfig extends JobConfig {

    @Override
    protected List<Step> getSteps() {
        List<Step> steps = new ArrayList<>();
        steps.add(step1());
        return steps;
    }

    Step step1() {
        return stepBuilderFactory.get("step1")
                .<Integer, Integer>chunk(1)
                .reader(dummyReader())
                .processor(processor1())
                .writer(dummyWriter())
                .build();
    }
}
@Configuration
@Import(BaseConfig.class)
public class Client2JobConfig extends JobConfig {

    @Override
    protected List<Step> getSteps() {
        List<Step> steps = new ArrayList<>();
        steps.add(step1());
        steps.add(step2());
        return steps;
    }

    Step step1() {
        return stepBuilderFactory.get("step1")
                .<Integer, Integer>chunk(1)
                .reader(dummyReader())
                .processor(processor1())
                .writer(dummyWriter())
                .build();
    }

    Step step2() {
        return stepBuilderFactory.get("step2")
                .<Integer, Integer>chunk(1)
                .reader(dummyReader())
                .processor(processor2())
                .writer(dummyWriter())
                .build();
    }
}
I can make it work if I load just one Configuration class into the Spring context, but if I have all the Configuration classes loaded (either by component scanning, or by manually adding them to the context) of course it doesn't work, because there's no way to select one implementation over the other.
I can also make it work by having differently-named jobs like "client1" and "client2", but let's say I can't change the calling code and the job is autowired. How can I have the 'same' job but with different steps?
Is there a better way to accomplish this?
Is it possible to access beans defined outside of the step scope? For example, if I define a strategy "strategyA" and pass it in the job parameters, I would like the @Value to resolve to the strategyA bean. Is this possible? I am currently working around the problem by getting the bean manually from the applicationContext.
@Bean
@StepScope
public Tasklet myTasklet(
        @Value("#{jobParameters['strategy']}") MyCustomClass myCustomStrategy) {
    MyTasklet myTasklet = new MyTasklet();
    myTasklet.setStrategy(myCustomStrategy);
    return myTasklet;
}
I would like to have the ability to add more strategies without having to modify the code.
The short answer is yes. This is a more general Spring/design pattern issue rather than a Spring Batch one.
The tricky Spring Batch parts are the configuration and understanding the scope of bean creation.
Let's assume all your strategies implement a Strategy interface that looks like:
interface Strategy {
    int execute(int a, int b);
}
Every strategy should implement Strategy and use the @Component annotation to allow automatic discovery of new strategies. Make sure all new strategies are placed under the correct package so component scanning will find them.
For example:
@Component
public class StrategyA implements Strategy {

    @Override
    public int execute(int a, int b) {
        return a + b;
    }
}
The above are singletons and will be created on application context initialization.
This stage is too early to use @Value("#{jobParameters['strategy']}"), as the JobParameters haven't been created yet.
So I suggest a locator bean that will be used later when myTasklet is created (step scope).
StrategyLocator class:
public class StrategyLocator {

    private Map<String, ? extends Strategy> strategyMap;

    public Strategy lookup(String strategy) {
        return strategyMap.get(strategy);
    }

    public void setStrategyMap(Map<String, ? extends Strategy> strategyMap) {
        this.strategyMap = strategyMap;
    }
}
Configuration will look like:
@Bean
@StepScope
public MyTaskelt myTasklet() {
    MyTaskelt myTasklet = new MyTaskelt();
    // set the strategyLocator
    myTasklet.setStrategyLocator(strategyLocator());
    return myTasklet;
}

@Bean
protected StrategyLocator strategyLocator() {
    return new StrategyLocator();
}
To initialize the StrategyLocator we need to make sure all strategies have already been created. So the best approach is to use an ApplicationListener on the ContextRefreshedEvent (warning: in this example strategy names start with a lower-case letter; changing this is easy...).
@Component
public class PlugableStrategyMapper implements ApplicationListener<ContextRefreshedEvent> {

    @Autowired
    private StrategyLocator strategyLocator;

    @Override
    public void onApplicationEvent(ContextRefreshedEvent contextRefreshedEvent) {
        ApplicationContext applicationContext = contextRefreshedEvent.getApplicationContext();
        Map<String, Strategy> beansOfTypeStrategy = applicationContext.getBeansOfType(Strategy.class);
        strategyLocator.setStrategyMap(beansOfTypeStrategy);
    }
}
The tasklet holds a String field that is injected with the strategy name using @Value, and the actual Strategy is resolved via the locator in a "before step" listener.
public class MyTaskelt implements Tasklet, StepExecutionListener {

    @Value("#{jobParameters['strategy']}")
    private String strategyName;

    private Strategy strategy;
    private StrategyLocator strategyLocator;

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        strategy = strategyLocator.lookup(strategyName);
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        int executeStrategyResult = strategy.execute(1, 2);
        return RepeatStatus.FINISHED;
    }

    public void setStrategyLocator(StrategyLocator strategyLocator) {
        this.strategyLocator = strategyLocator;
    }
}
To attach the listener to the tasklet you need to set it in your step configuration:
@Bean
protected Step myTaskletstep() throws MalformedURLException {
    return steps.get("myTaskletstep")
            .transactionManager(transactionManager())
            .tasklet(myTasklet())
            .listener(myTasklet())
            .build();
}
jobParameters holds just a String object and not the real object (and I think it is not good practice to store a bean definition in the parameters).
I'd go this way:
@Component
@StepScope
class MyStategyHolder {

    private MyCustomClass myStrategy;

    // Add get/set

    @BeforeJob
    void beforeJob(JobExecution jobExecution) {
        myStrategy = ...; // bind the right strategy using the job parameter value
    }
}
and register MyStategyHolder as a listener.
In your tasklet use @Value("#{MyStategyHolder.myStrategy}"), or access the MyStategyHolder instance and call getMyStrategy().
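For the second option, a hedged sketch of what the tasklet side might look like (assuming MyStategyHolder exposes a getMyStrategy() getter and is injectable where the tasklet is built):
public class MyTasklet implements Tasklet {

    // the holder that was populated in beforeJob from the job parameter
    private MyStategyHolder myStategyHolder;

    public void setMyStategyHolder(MyStategyHolder myStategyHolder) {
        this.myStategyHolder = myStategyHolder;
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        MyCustomClass strategy = myStategyHolder.getMyStrategy();
        // ... use the strategy ...
        return RepeatStatus.FINISHED;
    }
}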