Spring Boot - Any shortcuts for setting TaskExecutor?

Just wanted to check in to see if anyone has a faster way to set the TaskExecutor for Spring MVC within Spring Boot (using auto-configuration). This is what I have so far:
@Bean
protected ThreadPoolTaskExecutor mvcTaskExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setThreadNamePrefix("my-mvc-task-executor-");
    executor.setCorePoolSize(5);
    executor.setMaxPoolSize(200);
    return executor;
}

@Bean
protected WebMvcConfigurer webMvcConfigurer() {
    return new WebMvcConfigurerAdapter() {
        @Override
        public void configureAsyncSupport(AsyncSupportConfigurer configurer) {
            configurer.setTaskExecutor(mvcTaskExecutor());
        }
    };
}
Does anyone have a better/faster way to do this?
-Joshua

One way to achieve this is to use Spring's ConcurrentTaskExecutor class. This class acts as an adapter between Spring's TaskExecutor and the JDK's Executor.
@Bean
protected WebMvcConfigurer webMvcConfigurer() {
    return new WebMvcConfigurerAdapter() {
        @Override
        public void configureAsyncSupport(AsyncSupportConfigurer configurer) {
            configurer.setTaskExecutor(new ConcurrentTaskExecutor(Executors.newFixedThreadPool(5)));
        }
    };
}
One problem with the above is that you can't specify a maximum pool size. But you can always create a new factory method, createThreadPool(int core, int max), to get configurable thread pools.
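A minimal sketch of such a factory method, built on a plain JDK ThreadPoolExecutor (the name createThreadPool is just the suggestion above, not an existing API):
import java.util.concurrent.*;

// Hypothetical helper: a JDK pool with configurable core and max sizes.
// Note that ThreadPoolExecutor only grows beyond the core size once its
// queue is full, so a SynchronousQueue (or a bounded queue) is needed
// for the max to actually take effect.
private static ExecutorService createThreadPool(int core, int max) {
    return new ThreadPoolExecutor(core, max,
            60L, TimeUnit.SECONDS,
            new SynchronousQueue<>());
}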

Related

Spring Boot 3 context propagation in micrometer tracing

Spring Boot 3 has changed context propagation in tracing.
https://github.com/micrometer-metrics/tracing/wiki/Spring-Cloud-Sleuth-3.1-Migration-Guide#async-instrumentation
They now provide a library for this issue, but I guess I don't quite understand how it works.
I have created a taskExecutor as described in the guide.
@Bean(name = "taskExecutor")
ThreadPoolTaskExecutor threadPoolTaskScheduler() {
    ThreadPoolTaskExecutor threadPoolTaskExecutor = new ThreadPoolTaskExecutor() {
        @Override
        protected ExecutorService initializeExecutor(ThreadFactory threadFactory, RejectedExecutionHandler rejectedExecutionHandler) {
            ExecutorService executorService = super.initializeExecutor(threadFactory, rejectedExecutionHandler);
            return ContextExecutorService.wrap(executorService, ContextSnapshot::captureAll);
        }
    };
    threadPoolTaskExecutor.initialize();
    return threadPoolTaskExecutor;
}
And I have marked the method @Async like this:
@Async("taskExecutor")
public void run() {
    // invoke some service
}
But the context is not propagated to the child thread in the taskExecutor.
I was facing the same problem. Please add this code to the configuration and everything works as expected.
@Configuration(proxyBeanMethods = false)
static class AsyncConfig implements AsyncConfigurer, WebMvcConfigurer {

    @Override
    public Executor getAsyncExecutor() {
        return ContextExecutorService.wrap(Executors.newCachedThreadPool(), ContextSnapshot::captureAll);
    }

    @Override
    public void configureAsyncSupport(AsyncSupportConfigurer configurer) {
        configurer.setTaskExecutor(new SimpleAsyncTaskExecutor(r -> new Thread(ContextSnapshot.captureAll().wrap(r))));
    }
}
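For what it's worth, the wrapping only carries ThreadLocal values that the context-propagation library knows about. A minimal sketch (assuming the io.micrometer:context-propagation library; the "tenant" ThreadLocal is hypothetical) of registering your own value so it survives the hop onto the executor thread:
import io.micrometer.context.ContextRegistry;

// Hypothetical ThreadLocal registered with the global ContextRegistry;
// once registered, ContextSnapshot::captureAll will capture and restore
// it around each task submitted to the wrapped executor.
static final ThreadLocal<String> TENANT = new ThreadLocal<>();

static {
    ContextRegistry.getInstance().registerThreadLocalAccessor(
            "tenant", TENANT::get, TENANT::set, TENANT::remove);
}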

Best approach to allocate dedicated background thread in Spring Boot

I need to create a dedicated thread listening on a DatagramSocket.
The old-school approach would be to create one during context creation:
@Bean
Thread beanDef() {
    var thread = new Thread(myRunnable);
    thread.setDaemon(true);
    thread.start();
    return thread;
}
A more modern approach would be to create an executor:
@Bean
ExecutorService executor() {
    var executor = Executors.newSingleThreadExecutor();
    executor.submit(myRunnable);
    return executor;
}
Which one should I prefer?
Something like this would be a modern way to launch a background thread in Spring:
@Component
public class MySocketListenerLauncher {

    private ExecutorService executorService;

    @PostConstruct
    public void init() {
        BasicThreadFactory factory = new BasicThreadFactory.Builder()
                .namingPattern("socket-listener-%d").build();
        executorService = Executors.newSingleThreadExecutor(factory);
        executorService.execute(new Runnable() {
            @Override
            public void run() {
                // ... listen to socket ...
            }
        });
        executorService.shutdown();
    }

    @PreDestroy
    public void shutdown() {
        if (executorService != null) {
            executorService.shutdownNow();
        }
    }
}
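Note that BasicThreadFactory comes from Apache Commons Lang. If you'd rather avoid the extra dependency, a plain JDK ThreadFactory gives an equivalent named executor (a sketch, not from the original answer; it replaces the factory lines in init() above):
// Same idea without Commons Lang: name the thread and mark it as a
// daemon so it won't keep the JVM alive on shutdown.
executorService = Executors.newSingleThreadExecutor(r -> {
    Thread t = new Thread(r, "socket-listener-0");
    t.setDaemon(true);
    return t;
});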
Better to use the @Async annotation to create a thread in the background.
The @EnableAsync annotation switches on Spring's ability to run @Async methods in a background thread pool. The class below also customizes the Executor used.
@Configuration
@EnableAsync
public class ThreadConfig {

    @Bean(name = "specificTaskExecutor")
    public TaskExecutor specificTaskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.initialize();
        return executor;
    }
}
and then use it as in the snippet below:
@Async("specificTaskExecutor")
public void runFromAnotherThreadPool() {
    System.out.println("Your function code here");
}
I hope this will help you.
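One caveat worth adding: the @Async method has to live on a Spring-managed bean and be called from another bean, because the async behavior is applied through a proxy and self-invocation bypasses it. A minimal sketch (the class name is hypothetical):
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

@Service
public class BackgroundTask {

    // Runs on the "specificTaskExecutor" pool when invoked from another
    // bean; calling it from within this class would bypass the proxy
    // and run it synchronously.
    @Async("specificTaskExecutor")
    public void runFromAnotherThreadPool() {
        System.out.println("Your function code here");
    }
}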

Spring batch JdbcPagingItemReader not able to read all events

I have a Spring Batch application like the one below (the table name and query are edited to generic names).
When I execute this program, it is able to read 7500 events, i.e., 3 times the chunk size, and is not able to read the remaining records from the Oracle database. The table contains 50 million records, which I am copying to another NoSQL database.
@EnableBatchProcessing
@SpringBootApplication
@EnableAutoConfiguration
public class MultiThreadPagingApp extends DefaultBatchConfigurer {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    public DataSource dataSource;

    @Bean
    public DataSource dataSource() {
        final DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("oracle.jdbc.OracleDriver");
        dataSource.setUrl("jdbc:oracle:thin:@***********");
        dataSource.setUsername("user");
        dataSource.setPassword("password");
        return dataSource;
    }

    @Override
    public void setDataSource(DataSource dataSource) {}

    @Bean
    @StepScope
    ItemReader<UserModel> dbReader() throws Exception {
        JdbcPagingItemReader<UserModel> reader = new JdbcPagingItemReader<UserModel>();
        final SqlPagingQueryProviderFactoryBean sqlPagingQueryProviderFactoryBean = new SqlPagingQueryProviderFactoryBean();
        sqlPagingQueryProviderFactoryBean.setDataSource(dataSource);
        sqlPagingQueryProviderFactoryBean.setSelectClause("select * ");
        sqlPagingQueryProviderFactoryBean.setFromClause("from user");
        sqlPagingQueryProviderFactoryBean.setWhereClause("where id>0");
        sqlPagingQueryProviderFactoryBean.setSortKey("name");
        reader.setQueryProvider(sqlPagingQueryProviderFactoryBean.getObject());
        reader.setDataSource(dataSource);
        reader.setPageSize(2500);
        reader.setRowMapper(new BeanPropertyRowMapper<>(UserModel.class));
        reader.afterPropertiesSet();
        reader.setSaveState(true);
        System.out.println("Reading users anonymized in chunks of " + 2500);
        return reader;
    }

    @Bean
    public Dbwriter writer() {
        return new Dbwriter(); // I have another class for this
    }

    @Bean
    public Step step1() throws Exception {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setCorePoolSize(4);
        taskExecutor.setMaxPoolSize(10);
        taskExecutor.afterPropertiesSet();
        return this.stepBuilderFactory.get("step1")
                .<UserModel, UserModel>chunk(2500)
                .reader(dbReader())
                .writer(writer())
                .taskExecutor(taskExecutor)
                .build();
    }

    @Bean
    public Job multithreadedJob() throws Exception {
        return this.jobBuilderFactory.get("multithreadedJob")
                .start(step1())
                .build();
    }

    @Bean
    public PlatformTransactionManager getTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public JobRepository getJobRepo() throws Exception {
        return new MapJobRepositoryFactoryBean(getTransactionManager()).getObject();
    }

    public static void main(String[] args) {
        SpringApplication.run(MultiThreadPagingApp.class, args);
    }
}
Can you help me understand how I can efficiently read all the records using Spring Batch, or suggest any other approach to handle this? I have tried the approach mentioned here: http://techdive.in/java/jdbc-handling-huge-resultset
It took 120 minutes to read and save all records with a single-threaded application. Since Spring Batch is a good fit for this, I assume we can handle this scenario quickly.
You are setting the saveState flag to true (by the way, it should be set before calling afterPropertiesSet) on a JdbcPagingItemReader and using this reader in a multi-threaded step. However, this flag is documented to be set to false in a multi-threaded context.
Multi-threading with database readers is usually not the best option; I would recommend using partitioning in your case.
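A minimal sketch of that partitioning approach, assuming a numeric id column (the class and key names here are hypothetical): each worker step gets an id range in its ExecutionContext, from which a @StepScope reader can build its where clause.
import java.util.HashMap;
import java.util.Map;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

// Splits [minId, maxId] into gridSize ranges; each range becomes one
// partition, executed by its own worker step instance.
public class IdRangePartitioner implements Partitioner {

    private final long minId;
    private final long maxId;

    public IdRangePartitioner(long minId, long maxId) {
        this.minId = minId;
        this.maxId = maxId;
    }

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        long range = (maxId - minId) / gridSize + 1;
        Map<String, ExecutionContext> partitions = new HashMap<>();
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            context.putLong("minId", minId + i * range);
            context.putLong("maxId", Math.min(maxId, minId + (i + 1) * range - 1));
            partitions.put("partition" + i, context);
        }
        return partitions;
    }
}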
I had the same problem; I fixed it by changing my sortKey. I realized that the previous one wasn't unique across records, so I replaced it with the ID, which is different for every record in the database. (The paging reader uses the sort key to fetch the next page, so duplicate key values can cause rows to be skipped.)

Spring Boot: how to use FilteringMessageListenerAdapter

I have a Spring Boot application which listens to messages on a Kafka queue. To filter those messages, I have the following two classes:
@Component
public class Listener implements MessageListener {

    private final CountDownLatch latch1 = new CountDownLatch(1);

    @Override
    @KafkaListener(topics = "${spring.kafka.topic.boot}")
    public void onMessage(Object o) {
        System.out.println("LISTENER received payload *****");
        this.latch1.countDown();
    }
}

@Configuration
@EnableKafka
public class KafkaConfig {

    @Autowired
    private Listener listener;

    @Bean
    public FilteringMessageListenerAdapter filteringReceiver() {
        return new FilteringMessageListenerAdapter(listener, recordFilterStrategy());
    }

    public RecordFilterStrategy recordFilterStrategy() {
        return new RecordFilterStrategy() {
            @Override
            public boolean filter(ConsumerRecord consumerRecord) {
                System.out.println("IN FILTER");
                return false;
            }
        };
    }
}
While messages are being processed by the Listener class, the RecordFilterStrategy implementation is not being invoked. What is the correct way to use FilteringMessageListenerAdapter?
Thanks
The solution was as follows:
There is no need for the FilteringMessageListenerAdapter class.
Instead, create a ConcurrentKafkaListenerContainerFactory rather than relying on the one Spring Boot provides out of the box, and set the RecordFilterStrategy implementation on it:
@Bean
ConcurrentKafkaListenerContainerFactory<Integer, String> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<Integer, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setRecordFilterStrategy(recordFilterStrategy());
    return factory;
}
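For reference, RecordFilterStrategy.filter returns true for records that should be discarded, which is why the original example (returning false) kept everything. A minimal sketch of a strategy bean (the key/value types and the "keep" rule are just illustrative):
// Discards every record whose value does not contain "keep";
// returning true means "filter this record out".
@Bean
public RecordFilterStrategy<Integer, String> recordFilterStrategy() {
    return consumerRecord -> !consumerRecord.value().contains("keep");
}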

Spring batch with Spring Boot terminates before children process with AsyncItemProcessor

I'm using Spring Batch with an AsyncItemProcessor and things are behaving unexpectedly. Let me show the code first.
I followed a simple example as shown in the Spring Batch project:
@EnableBatchProcessing
@SpringBootApplication
@Import({HttpClientConfigurer.class, BatchJobConfigurer.class})
public class PerfilEletricoApp {

    public static void main(String[] args) throws Exception { // NOSONAR
        System.exit(SpringApplication.exit(SpringApplication.run(PerfilEletricoApp.class, args)));
        //SpringApplication.run(PerfilEletricoApp.class, args);
    }
}
-- EDIT
If I just sleep the main process to give slf4j a few seconds to flush the logs, everything works as expected.
@EnableBatchProcessing
@SpringBootApplication
@Import({HttpClientConfigurer.class, BatchJobConfigurer.class})
public class PerfilEletricoApp {

    public static void main(String[] args) throws Exception { // NOSONAR
        //System.exit(SpringApplication.exit(SpringApplication.run(PerfilEletricoApp.class, args)));
        ConfigurableApplicationContext context = SpringApplication.run(PerfilEletricoApp.class, args);
        Thread.sleep(1000 * 5);
        System.exit(SpringApplication.exit(context));
    }
}
-- END OF EDIT
I'm reading a text file with a single field and then using an AsyncItemProcessor to get multithreaded processing, which consists of an HTTP GET on a URL to fetch some data. I'm also using a NoOpItemWriter to do nothing in the write part. I'm saving the results of the GET in the processor part of the job (using log.trace / log.warn).
@Configuration
public class HttpClientConfigurer {

    // [... property and configs omitted]

    @Bean
    public CloseableHttpClient createHttpClient() {
        // ... creates and returns a poolable http client etc.
    }
}
As for the Job:
@Configuration
public class BatchJobConfigurer {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Value("${async.tps:10}")
    private Integer tps;

    @Value("${com.bemobi.perfilelerico.sourcedir:/AppServer/perfil-eletrico/source-dir/}")
    private String sourceDir;

    @Bean
    public ItemReader<String> reader() {
        MultiResourceItemReader<String> reader = new MultiResourceItemReader<>();
        reader.setResources(new Resource[] { new FileSystemResource(sourceDir) });
        reader.setDelegate((ResourceAwareItemReaderItemStream<? extends String>) flatItemReader());
        return reader;
    }

    @Bean
    public ItemReader<String> flatItemReader() {
        FlatFileItemReader<String> itemReader = new FlatFileItemReader<>();
        itemReader.setLineMapper(new DefaultLineMapper<String>() {{
            setLineTokenizer(new DelimitedLineTokenizer() {{
                setNames(new String[] { "sample-field-001" });
            }});
            setFieldSetMapper(new SimpleStringFieldSetMapper<>());
        }});
        return itemReader;
    }

    @Bean
    public ItemProcessor asyncItemProcessor() {
        AsyncItemProcessor<String, OiPaggoResponse> asyncItemProcessor = new AsyncItemProcessor<>();
        asyncItemProcessor.setDelegate(processor());
        asyncItemProcessor.setTaskExecutor(getAsyncExecutor());
        return asyncItemProcessor;
    }

    @Bean
    public ItemProcessor<String, OiPaggoResponse> processor() {
        return new PerfilEletricoItemProcessor();
    }

    /**
     * Using a NoOpItemWriter<T> so we satisfy the Spring Batch flow but don't use the writer for anything else.
     * @return a NoOpItemWriter<OiPaggoResponse>
     */
    @Bean
    public ItemWriter<OiPaggoResponse> writer() {
        return new NoOpItemWriter<>();
    }

    @Bean
    protected Step step1() throws Exception {
        /*
        Problem starts here: if I use processor(), everything ends nicely, but if I insist on asyncItemProcessor(), the job ends and the logs from the processor are not written to disk.
        */
        return this.steps.get("step1").<String, OiPaggoResponse>chunk(10)
                .reader(reader())
                .processor(asyncItemProcessor())
                .build();
    }

    @Bean
    public Job job() throws Exception {
        return this.jobs.get("consulta-perfil-eletrico").start(step1()).build();
    }

    @Bean(name = "asyncExecutor")
    public TaskExecutor getAsyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(tps);
        executor.setMaxPoolSize(tps);
        executor.setQueueCapacity(tps * 1000);
        executor.setRejectedExecutionHandler(new ThreadPoolExecutor.CallerRunsPolicy());
        executor.setThreadNamePrefix("AsyncExecutor-");
        return executor;
    }
}
-- UPDATED WITH AsyncItemWriter (Working version)
/* Wrapped writer */
@Bean
public ItemWriter asyncItemWriter() {
    AsyncItemWriter<OiPaggoResponse> asyncItemWriter = new AsyncItemWriter<>();
    asyncItemWriter.setDelegate(writer());
    return asyncItemWriter;
}

/* AsyncItemWriter wired into the step */
@Bean
protected Step step1() throws Exception {
    return this.steps.get("step1").<String, OiPaggoResponse>chunk(10)
            .reader(reader())
            .processor(asyncItemProcessor())
            .writer(asyncItemWriter())
            .build();
}
--
Any thoughts on why the AsyncItemProcessor doesn't wait for all the children to complete before sending an OK/Completed signal to the context?
The issue is that the AsyncItemProcessor is creating Futures that no one is waiting for. Wrap your NoOpItemWriter in the AsyncItemWriter so that someone is waiting for the Futures. That will cause the job to complete as expected.
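To make the mechanism concrete, here is a sketch of what the AsyncItemWriter effectively does (not the actual Spring Batch source; the class name is hypothetical): it blocks on each Future's get(), which is what forces the step to wait for the AsyncItemProcessor's worker threads.
import java.util.List;
import java.util.concurrent.Future;
import org.springframework.batch.item.ItemWriter;

// Sketch of the unwrap-and-delegate idea behind AsyncItemWriter
// (Spring Batch 4 style write signature).
public class FutureUnwrappingWriter<T> implements ItemWriter<Future<T>> {

    private final ItemWriter<T> delegate;

    public FutureUnwrappingWriter(ItemWriter<T> delegate) {
        this.delegate = delegate;
    }

    @Override
    public void write(List<? extends Future<T>> futures) throws Exception {
        for (Future<T> future : futures) {
            // get() blocks until the async processor finishes this item
            delegate.write(List.of(future.get()));
        }
    }
}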
