Spring Batch RabbitMQ issue

I am looking to integrate Spring Batch with RabbitMQ. I've written the code below, but no data is passing over the channel. What's wrong with the code?
CustomerMessageChannel.java
public interface CustomerMessageChannel {

    @Output("customerMessageChannel")
    MessageChannel customerMessageChannel();
}
Customer.java
@AllArgsConstructor
@NoArgsConstructor
@Builder
@Data
@JsonIgnoreProperties(ignoreUnknown = true)
public class Customer implements Serializable {

    private static final long serialVersionUID = 1L;

    private Long id;
    private String firstName;
    private String lastName;
    private String birthdate;
}
CustomerFieldSetMapper.java
public class CustomerFieldSetMapper implements FieldSetMapper<Customer> {

    @Override
    public Customer mapFieldSet(FieldSet fieldSet) throws BindException {
        return Customer.builder()
                .id(fieldSet.readLong("id"))
                .firstName(fieldSet.readRawString("firstName"))
                .lastName(fieldSet.readRawString("lastName"))
                .birthdate(fieldSet.readRawString("birthdate"))
                .build();
    }
}
JobConfig.java
@Configuration
public class JobConfig {

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private AmqpTemplate amqpTemplate;

    @Bean
    public FlatFileItemReader<Customer> customerItemReader() {
        FlatFileItemReader<Customer> reader = new FlatFileItemReader<>();
        reader.setLinesToSkip(1);
        reader.setResource(new ClassPathResource("/data/customer.csv"));

        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        tokenizer.setNames(new String[] { "id", "firstName", "lastName", "birthdate" });

        DefaultLineMapper<Customer> customerLineMapper = new DefaultLineMapper<>();
        customerLineMapper.setLineTokenizer(tokenizer);
        customerLineMapper.setFieldSetMapper(new CustomerFieldSetMapper());
        customerLineMapper.afterPropertiesSet();

        reader.setLineMapper(customerLineMapper);
        return reader;
    }

    @Bean
    public AmqpItemWriter<Customer> amqpItemWriter() throws Exception {
        return new AmqpItemWriter<>(amqpTemplate);
    }

    @Bean
    public Step step1() throws Exception {
        return stepBuilderFactory.get("step1")
                .<Customer, Customer>chunk(10)
                .reader(customerItemReader())
                .writer(amqpItemWriter())
                .build();
    }

    @Bean
    public Job job() throws Exception {
        return jobBuilderFactory.get("job")
                .incrementer(new RunIdIncrementer())
                .start(step1())
                .build();
    }
}
application.properties
spring.rabbitmq.host=localhost
spring.rabbitmq.port=5672
spring.rabbitmq.username=guest
spring.rabbitmq.password=guest
spring.cloud.stream.bindings.customerMessageChannel.destination=customerMessageChannel
spring.cloud.stream.default.contentType=application/json
customer.csv
id,firstName,lastName,birthdate
1, John, Doe,10-10-1952 10:10:10
2, Amy, Eugene,05-07-1985 17:10:00
3, Laverne, Mann,11-12-1988 10:10:10
4, Janice, Preston,19-02-1960 10:10:10
5, Pauline, Rios,29-08-1977 10:10:10
6, Perry, Burnside,10-03-1981 10:10:10
7, Todd, Kinsey,14-12-1998 10:10:10
8, Jacqueline, Hyde,20-03-1983 10:10:10
9, Rico, Hale,10-10-2000 10:10:10
10, Samuel, Lamm,11-11-1999 10:10:10
11, Robert, Coster,10-10-1972 10:10:10
12, Tamara, Soler,02-01-1978 10:10:10
13, Justin, Kramer,19-11-1951 10:10:10
14, Andrea, Law,14-10-1959 10:10:10
15, Laura, Porter,12-12-2010 10:10:10
16, Michael, Cantu,11-04-1999 10:10:10
17, Andrew, Thomas,04-05-1967 10:10:10
18, Jose, Hannah,16-09-1950 10:10:10
19, Valerie, Hilbert,13-06-1966 10:10:10
20, Patrick, Durham,12-10-1978 10:10:10
SpringBatchAmqpApplication.java
@EnableBatchProcessing
@SpringBootApplication
public class SpringBatchAmqpApplication {
    public static void main(String[] args) {
        SpringApplication.run(SpringBatchAmqpApplication.class, args);
    }
}

You need to configure an exchange and routing key for the template; by default, RabbitMQ will discard unroutable messages.
https://docs.spring.io/spring-boot/docs/current/reference/html/appendix-application-properties.html#spring.rabbitmq.template.exchange
https://docs.spring.io/spring-boot/docs/current/reference/html/appendix-application-properties.html#spring.rabbitmq.template.routing-key
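For example, in application.properties (a sketch; the exchange and routing-key names are placeholders, and the exchange and its queue binding must already exist on the broker):
spring.rabbitmq.template.exchange=customer-exchange
spring.rabbitmq.template.routing-key=customer-routing-key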
You also don't seem to be referencing customerMessageChannel anywhere, so it's not clear what you are expecting.
To consume from the spring-cloud-stream destination, you need @EnableBinding(CustomerMessageChannel.class) and a @StreamListener method. However, that annotation model is now deprecated in favor of the functional programming model.
https://docs.spring.io/spring-cloud-stream/docs/3.1.0/reference/html/spring-cloud-stream.html#spring-cloud-stream-overview-producing-consuming-messages
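A minimal consumer sketch using that (now deprecated) annotation model, for illustration; it uses the built-in Sink input binding rather than your output-only CustomerMessageChannel interface:

@EnableBinding(Sink.class)
public class CustomerListener {

    // Listens on the "input" binding; point it at the same destination with
    // spring.cloud.stream.bindings.input.destination=customerMessageChannel
    @StreamListener(Sink.INPUT)
    public void handle(Customer customer) {
        System.out.println("Received: " + customer);
    }
}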

Related

How to run a job multiple times in parallel with different Excel files as input in Spring Batch

I have a use case where users upload different Excel files. Each file is processed in parallel, and every row of the Excel needs to be saved to an H2 database along with the job execution id.
The issue I am facing: when the user uploads the first file, the back end processes it and saves every row of that Excel with job id 1. But if the user uploads another Excel file with different data before the first one completes, the data from the first Excel also starts getting saved with the latest job execution id, which is 2. How can I resolve this so that each job's data is saved with that particular job's id while the jobs still run in parallel? This is the data for the first Excel sheet, this is the data for the second Excel sheet, and this is the output saved in the H2 database.
This is the service class
@Service
public class BatchTestService {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JobLauncher batchJobLauncher;

    @Autowired
    private WriterImpl writer;

    public Job job(byte data[]) {
        return jobBuilderFactory.get("job")
                .incrementer(new RunIdIncrementer())
                .flow(step(data))
                .end()
                .build();
    }

    @SneakyThrows
    public PoiItemReader<TestEntity> reader(byte[] data) {
        ReaderImpl reader = new ReaderImpl();
        reader.setLinesToSkip(1);
        reader.setResource(toResource(data, "TEST"));
        reader.setRowMapper(new MapperClass());
        return reader;
    }

    public Step step(byte data[]) {
        return stepBuilderFactory.get("step").<TestEntity, TestEntity>chunk(2)
                .reader(reader(data))
                .writer(writer)
                .build();
    }

    public ThreadPoolTaskExecutor getThreadPoolTaskExecutor() {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setCorePoolSize(2);
        taskExecutor.setMaxPoolSize(4);
        taskExecutor.setThreadNamePrefix("test");
        taskExecutor.afterPropertiesSet();
        return taskExecutor;
    }

    public void uploadExcel(MultipartFile file) throws Exception {
        String jobId = String.valueOf(System.currentTimeMillis());
        JobParameters parameters = new JobParametersBuilder().addString("jobId", jobId)
                .toJobParameters();
        ((SimpleJobLauncher) batchJobLauncher).setTaskExecutor(getThreadPoolTaskExecutor());
        batchJobLauncher.run(job(file.getBytes()), parameters);
    }

    public static Resource toResource(byte bytesFile[], String sheetName) throws IOException {
        ByteArrayInputStream bin = new ByteArrayInputStream(bytesFile);
        XSSFWorkbook workbook = new XSSFWorkbook(bin);
        var outputStream = new ByteArrayOutputStream();
        workbook.write(outputStream);
        return new ByteArrayResource(outputStream.toByteArray());
    }
}
This is the config class.
@Configuration
public class BatchDataSourceConfig {

    @Value("${spring.datasource.driver-class-name}")
    private String driverName;

    @Value("${spring.datasource.url}")
    private String url;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driverName);
        dataSource.setUrl(url);
        dataSource.setUsername("sa");
        dataSource.setPassword("");
        return dataSource;
    }

    @Bean
    public JobLauncher batchJobLauncher(JobRepository jobRepository) {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        return jobLauncher;
    }
}
This is the reader class
public class ReaderImpl extends PoiItemReader<TestEntity> {}
This is the writer class
@Component
public class WriterImpl implements ItemWriter<TestEntity> {

    private static Logger logger = LoggerFactory.getLogger(WriterImpl.class);

    @Autowired
    private TestEntityRepository testEntityRepository;

    private StepExecution stepExecution;

    @BeforeStep
    public void beforeStep(final StepExecution stepExecution) {
        this.stepExecution = stepExecution;
    }

    @Override
    @SneakyThrows
    public void write(List<? extends TestEntity> modelObjectList) {
        logger.info("Writer is reached...");
        Thread.sleep(3000);
        for (TestEntity testEntity : modelObjectList) {
            testEntity.setJobExecutionId(stepExecution.getJobExecutionId());
            testEntityRepository.save(testEntity);
        }
    }
}
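A note on the code above: WriterImpl is a singleton @Component, so the stepExecution field captured in @BeforeStep is overwritten as soon as a second job starts while the first is still writing, which matches the symptom described. A sketch of a step-scoped writer (an assumption, not from the original post; the class name is illustrative), so that each job execution gets its own writer instance:

@Component
@StepScope
public class StepScopedWriter implements ItemWriter<TestEntity> {

    @Autowired
    private TestEntityRepository testEntityRepository;

    // With @StepScope, the current execution's id can be injected directly,
    // so no mutable state is shared between concurrently running jobs.
    @Value("#{stepExecution.jobExecutionId}")
    private Long jobExecutionId;

    @Override
    public void write(List<? extends TestEntity> items) {
        for (TestEntity testEntity : items) {
            testEntity.setJobExecutionId(jobExecutionId);
            testEntityRepository.save(testEntity);
        }
    }
}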
The respective RowMapper class is also defined.
public class MapperClass implements RowMapper<TestEntity> {

    @Override
    public TestEntity mapRow(RowSet rowSet) {
        TestEntity testEntity = new TestEntity();
        testEntity.setStudentName(rowSet.getColumnValue(0));
        testEntity.setRollNo(rowSet.getColumnValue(1));
        testEntity.setSection(rowSet.getColumnValue(2));
        return testEntity;
    }
}
This is the model class
@AllArgsConstructor
@Data
@Entity
@NoArgsConstructor
@Table(name = "TEST_ENTITY")
public class TestEntity {

    @GeneratedValue(strategy = GenerationType.AUTO)
    @Id
    private Integer id;

    private String studentName;
    private String rollNo;
    private String section;
    private Long jobExecutionId;
}

How can we take the result of `MethodInvokingTaskletAdapter` as a reader in the Spring Batch Step?

How can we take the result of MethodInvokingTaskletAdapter as a reader in the Spring Batch Step? Reference - https://docs.spring.io/spring-batch/docs/current/reference/html/index-single.html#taskletStep and https://github.com/spring-projects/spring-batch/pull/567
Here is the code that I developed
JobConfiguration.java
@Configuration
public class JobConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public CustomService service() {
        return new CustomService();
    }

    @StepScope
    @Bean
    public MethodInvokingTaskletAdapter methodInvokingTasklet() {
        MethodInvokingTaskletAdapter methodInvokingTaskletAdapter = new MethodInvokingTaskletAdapter();
        methodInvokingTaskletAdapter.setTargetObject(service());
        methodInvokingTaskletAdapter.setTargetMethod("getEmployees");
        return methodInvokingTaskletAdapter;
    }

    @Bean
    public Job methodInvokingJob() {
        return this.jobBuilderFactory.get("methodInvokingJob")
                .start(methodInvokingStep())
                .build();
    }

    @Bean
    public Step methodInvokingStep() {
        // Looking to configure a chunk-based step here; don't know how to do it using MethodInvokingTaskletAdapter
        return this.stepBuilderFactory.get("methodInvokingStep")
                .tasklet(methodInvokingTasklet())
                .build();
    }
}
CustomService.java
public class CustomService {

    public void serviceMethod(String message) {
        System.out.println(message);
    }

    public void invokeMethod() {
        System.out.println("=============== Your method has executed !");
    }

    public List<Employee> getEmployees() {
        // In the real world, this will be a GET API call to the XYZ system
        List<Employee> employees = new ArrayList<>();
        employees.add(Employee.builder().firstName("Ravi").lastName("Shankar").email("ravi.shankar@gmail.com").age(30).build());
        employees.add(Employee.builder().firstName("Parag").lastName("Rane").email("parag.rane@gmail.com").age(11).build());
        employees.add(Employee.builder().firstName("Priya").lastName("Pande").email("priya.pande@gmail.com").age(40).build());
        employees.add(Employee.builder().firstName("Kiran").lastName("khot").email("kiran.khot@gmail.com").age(50).build());
        return employees;
    }
}
Employee.java
@Data
@AllArgsConstructor
@NoArgsConstructor
@Builder
public class Employee {
    private String firstName;
    private String lastName;
    private String email;
    private int age;
}
MethodInvokingTaskletApplication.java
@EnableBatchProcessing
@SpringBootApplication
public class MethodInvokingTaskletApplication {
    public static void main(String[] args) {
        SpringApplication.run(MethodInvokingTaskletApplication.class, args);
    }
}
To answer your question, you can't. The MethodInvokingTaskletAdapter is meant to adapt a POJO to a Tasklet. We have an ItemReaderAdapter that you can use to adapt a POJO to an ItemReader. You can read about it in the documentation here: https://docs.spring.io/spring-batch/docs/current/api/org/springframework/batch/item/adapter/ItemReaderAdapter.html
Now you'll have an issue with your service as configured in that each call to the delegating POJO is considered an item. That means that your item as configured will be a List<Employee> instead of just an Employee. Given your configuration states it's not the real service, I'll assume that your real service should return an Employee per call and null once the results are exhausted.
To update your configuration (with your service as it is configured in your question) in your sample:
...
@StepScope
@Bean
public ItemReaderAdapter itemReader() {
    ItemReaderAdapter reader = new ItemReaderAdapter();
    reader.setTargetObject(service());
    reader.setTargetMethod("getEmployees");
    return reader;
}

@Bean
public Job methodInvokingJob() {
    return this.jobBuilderFactory.get("methodInvokingJob")
            .start(methodInvokingStep())
            .build();
}

@Bean
public Step methodInvokingStep() {
    return this.stepBuilderFactory.get("methodInvokingStep")
            .<List<Employee>, List<Employee>>chunk(1) // chunk size 1: each call to getEmployees() returns one item (a List<Employee>)
            .reader(itemReader())
            // You'll need to define a writer...
            .writer(itemWriter())
            .build();
}
...
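For contrast, a sketch of a service shaped for ItemReaderAdapter (an assumption based on the note above; getEmployee and the backing iterator are illustrative, not from the original post):

public class CustomService {

    private final Iterator<Employee> employees = List.of(
            Employee.builder().firstName("Ravi").lastName("Shankar").age(30).build(),
            Employee.builder().firstName("Parag").lastName("Rane").age(11).build()).iterator();

    // ItemReaderAdapter treats each return value as one item; returning null
    // signals that the input is exhausted and ends the step normally.
    public Employee getEmployee() {
        return employees.hasNext() ? employees.next() : null;
    }
}

With that shape, the step would be declared as <Employee, Employee>chunk(...) and the adapter's target method set to "getEmployee".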

Simple embedded Kafka test example with Spring Boot

Edit FYI: working GitHub example
I was searching the internet and couldn't find a working, simple example of an embedded Kafka test.
My setup is:
Spring Boot
Multiple @KafkaListener methods with different topics in one class
Embedded Kafka for the test, which starts fine
A test with KafkaTemplate which sends to the topic, but the
@KafkaListener methods receive nothing, even after a huge sleep time
No warnings or errors are shown, only info spam from Kafka in the logs
Please help me. The examples out there are mostly overconfigured or overengineered. I am sure it can be done simply.
Thanks, guys!
@Controller
public class KafkaController {

    private static final Logger LOG = getLogger(KafkaController.class);

    @KafkaListener(topics = "test.kafka.topic")
    public void receiveDunningHead(final String payload) {
        LOG.debug("Receiving event with payload [{}]", payload);
        // I will do database stuff here which I could check in the db for testing
    }
}
private static String SENDER_TOPIC = "test.kafka.topic";

@ClassRule
public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, SENDER_TOPIC);

@Test
public void testSend() throws InterruptedException, ExecutionException {
    Map<String, Object> senderProps = KafkaTestUtils.producerProps(embeddedKafka);
    KafkaProducer<Integer, String> producer = new KafkaProducer<>(senderProps);
    producer.send(new ProducerRecord<>(SENDER_TOPIC, 0, 0, "message00")).get();
    producer.send(new ProducerRecord<>(SENDER_TOPIC, 0, 1, "message01")).get();
    producer.send(new ProducerRecord<>(SENDER_TOPIC, 1, 0, "message10")).get();
    Thread.sleep(10000);
}
Embedded Kafka tests work for me with the below configs.
Annotations on the test class:
@EnableKafka
@SpringBootTest(classes = {KafkaController.class}) // Specify the @KafkaListener class if it's not the same class, or not loaded with the test config
@EmbeddedKafka(
        partitions = 1,
        controlledShutdown = false,
        brokerProperties = {
                "listeners=PLAINTEXT://localhost:3333",
                "port=3333"
        })
public class KafkaConsumerTest {

    @Autowired
    KafkaEmbedded kafkaEmbedded;

    @Autowired
    KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;
The @Before annotation on the setup method:
    @Before
    public void setUp() throws Exception {
        for (MessageListenerContainer messageListenerContainer : kafkaListenerEndpointRegistry.getListenerContainers()) {
            ContainerTestUtils.waitForAssignment(messageListenerContainer,
                    kafkaEmbedded.getPartitionsPerTopic());
        }
    }
Note: I am not using @ClassRule to create the embedded Kafka; rather, I am auto-wiring it with @Autowired.
@Test
public void testReceive() throws Exception {
    kafkaTemplate.send(topic, data);
}
Hope this helps!
Edit: Test configuration class marked with @TestConfiguration:
@TestConfiguration
public class TestConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(KafkaTestUtils.producerProps(kafkaEmbedded));
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        KafkaTemplate<String, String> kafkaTemplate = new KafkaTemplate<>(producerFactory());
        kafkaTemplate.setDefaultTopic(topic);
        return kafkaTemplate;
    }
}
Now the @Test method will autowire the KafkaTemplate and use it to send a message:
kafkaTemplate.send(topic, data);
Updated the answer code block with the above line.
Since the accepted answer doesn't compile or work for me, I found another solution based on https://blog.mimacom.com/testing-apache-kafka-with-spring-boot/ that I would like to share with you.
The dependency is 'spring-kafka-test' version '2.2.7.RELEASE'.
@RunWith(SpringRunner.class)
@EmbeddedKafka(partitions = 1, topics = { "testTopic" })
@SpringBootTest
public class SimpleKafkaTest {

    private static final String TEST_TOPIC = "testTopic";

    @Autowired
    EmbeddedKafkaBroker embeddedKafkaBroker;

    @Test
    public void testReceivingKafkaEvents() {
        Consumer<Integer, String> consumer = configureConsumer();
        Producer<Integer, String> producer = configureProducer();

        producer.send(new ProducerRecord<>(TEST_TOPIC, 123, "my-test-value"));

        ConsumerRecord<Integer, String> singleRecord = KafkaTestUtils.getSingleRecord(consumer, TEST_TOPIC);
        assertThat(singleRecord).isNotNull();
        assertThat(singleRecord.key()).isEqualTo(123);
        assertThat(singleRecord.value()).isEqualTo("my-test-value");

        consumer.close();
        producer.close();
    }

    private Consumer<Integer, String> configureConsumer() {
        Map<String, Object> consumerProps = KafkaTestUtils.consumerProps("testGroup", "true", embeddedKafkaBroker);
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        Consumer<Integer, String> consumer = new DefaultKafkaConsumerFactory<Integer, String>(consumerProps)
                .createConsumer();
        consumer.subscribe(Collections.singleton(TEST_TOPIC));
        return consumer;
    }

    private Producer<Integer, String> configureProducer() {
        Map<String, Object> producerProps = new HashMap<>(KafkaTestUtils.producerProps(embeddedKafkaBroker));
        return new DefaultKafkaProducerFactory<Integer, String>(producerProps).createProducer();
    }
}
I solved the issue now:
@BeforeClass
public static void setUpBeforeClass() {
    System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
    System.setProperty("spring.cloud.stream.kafka.binder.zkNodes", embeddedKafka.getZookeeperConnectionString());
}
While I was debugging, I saw that the embedded Kafka server takes a random port.
I couldn't find the configuration for it, so I am setting the Kafka config to match the server. It still looks a bit ugly to me.
I would love to have just the line @Mayur mentioned:
@EmbeddedKafka(partitions = 1, controlledShutdown = false, brokerProperties = {"listeners=PLAINTEXT://localhost:9092", "port=9092"})
but I can't find the right dependency on the internet.
In integration testing, having fixed ports like 9092 is not recommended, because multiple tests should have the flexibility to open their own ports from embedded instances. So the following implementation does exactly that.
NB: this implementation is based on JUnit 5 (Jupiter 5.7.0) and Spring Boot 2.3.4.RELEASE.
TestClass:
@EnableKafka
@SpringBootTest(classes = {ConsumerTest.Config.class, Consumer.class})
@EmbeddedKafka(
        partitions = 1,
        controlledShutdown = false)
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public class ConsumerTest {

    @Autowired
    private EmbeddedKafkaBroker kafkaEmbedded;

    @Autowired
    private KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;

    @BeforeAll
    public void setUp() throws Exception {
        for (final MessageListenerContainer messageListenerContainer : kafkaListenerEndpointRegistry.getListenerContainers()) {
            ContainerTestUtils.waitForAssignment(messageListenerContainer,
                    kafkaEmbedded.getPartitionsPerTopic());
        }
    }

    @Value("${topic.name}")
    private String topicName;

    @Autowired
    private KafkaTemplate<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> requestKafkaTemplate;

    @Test
    public void consume_success() {
        requestKafkaTemplate.send(topicName, load);
    }

    @Configuration
    @Import({
            KafkaListenerConfig.class,
            TopicConfig.class
    })
    public static class Config {

        @Value(value = "${spring.kafka.bootstrap-servers}")
        private String bootstrapAddress;

        @Bean
        public ProducerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> requestProducerFactory() {
            final Map<String, Object> configProps = new HashMap<>();
            configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
            configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
            return new DefaultKafkaProducerFactory<>(configProps);
        }

        @Bean
        public KafkaTemplate<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> requestKafkaTemplate() {
            return new KafkaTemplate<>(requestProducerFactory());
        }
    }
}
Listener Class:
@Component
public class Consumer {

    @KafkaListener(
            topics = "${topic.name}",
            containerFactory = "listenerContainerFactory"
    )
    @Override
    public void listener(
            final ConsumerRecord<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> consumerRecord,
            final @Payload Optional<Map<String, List<ImmutablePair<String, String>>>> payload
    ) {
    }
}
Listener Config:
@Configuration
public class KafkaListenerConfig {

    @Value(value = "${spring.kafka.bootstrap-servers}")
    private String bootstrapAddress;

    @Value(value = "${topic.name}")
    private String resolvedTreeQueueName;

    @Bean
    public ConsumerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> resolvedTreeConsumerFactory() {
        final Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, resolvedTreeQueueName);
        return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new CustomDeserializer());
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> resolvedTreeListenerContainerFactory() {
        final ConcurrentKafkaListenerContainerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(resolvedTreeConsumerFactory());
        return factory;
    }
}
TopicConfig:
@Configuration
public class TopicConfig {

    @Value(value = "${spring.kafka.bootstrap-servers}")
    private String bootstrapAddress;

    @Value(value = "${topic.name}")
    private String requestQueue;

    @Bean
    public KafkaAdmin kafkaAdmin() {
        Map<String, Object> configs = new HashMap<>();
        configs.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        return new KafkaAdmin(configs);
    }

    @Bean
    public NewTopic requestTopic() {
        return new NewTopic(requestQueue, 1, (short) 1);
    }
}
application.properties:
spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}
This property is the most important piece: it binds the embedded instance's port to the KafkaTemplate and the KafkaListeners.
Following the above implementation, you can open dynamic ports per test class, which is more convenient.

JPA @EntityListener does not work as expected

I am integrating Spring 4 and Hibernate 5, but there is a problem that I can't resolve.
I use the @EntityListeners annotation on the BaseEntity class, which is a super class for the other business models.
I also use @MappedSuperclass on the BaseEntity.
But it doesn't work!
I use Spring annotation-based configuration and the application runs successfully.
I also inserted a record into the db.
So I think my project configuration is correct.
Can anybody let me know why?
Thanks very much.
This is the BaseEntity class.
@MappedSuperclass
@EntityListeners(EntityListener.class)
public class BaseEntity implements Serializable {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    @Column(nullable = false, updatable = false)
    private Date createDate;

    @Column(nullable = false)
    private Date modifyDate;

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public Date getCreateDate() {
        return createDate;
    }

    public void setCreateDate(Date createDate) {
        this.createDate = createDate;
    }

    public Date getModifyDate() {
        return modifyDate;
    }

    public void setModifyDate(Date modifyDate) {
        this.modifyDate = modifyDate;
    }
}
This is the EntityListener class.
public class EntityListener {

    @PrePersist
    public void prePersist(BaseEntity entity) {
        entity.setCreateDate(new Date());
        entity.setModifyDate(new Date());
    }

    @PreUpdate
    public void preUpdate(BaseEntity entity) {
        entity.setModifyDate(new Date());
    }
}
The following is my project configuration, based on Spring annotations.
@Configuration
@EnableWebMvc
//@ImportResource({ "classpath:xxxxx.xml" })
@PropertySources({
        @PropertySource("classpath:application.properties")
})
@ComponentScan({"com.yeager.admin.persistence", "com.yeager.admin.web", "com.yeager.admin.service", "com.yeager.admin.common"})
@EnableAspectJAutoProxy
//@EnableRetry
public class AppConfig {

    @Bean(name = "multipartResolver")
    public CommonsMultipartResolver getResolver() throws IOException {
        CommonsMultipartResolver resolver = new CommonsMultipartResolver();
        return resolver;
    }

    @Bean
    public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }

    @Bean
    public static SpringContext springContext() {
        return new SpringContext();
    }
}
The main configuration of the DAL looks like this:
@Configuration
@EnableTransactionManagement
@PropertySource({"classpath:persistence-mysql.properties"})
public class PersistenceConfig {

    @Autowired
    private Environment env;

    public PersistenceConfig() {
        super();
    }

    @Bean
    public LocalSessionFactoryBean sessionFactory() {
        final LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
        sessionFactory.setDataSource(dataSource());
        sessionFactory.setPackagesToScan("com.yeager.admin.persistence.entity");
        sessionFactory.setHibernateProperties(hibernateProperties());
        return sessionFactory;
    }

    @Bean
    public DataSource dataSource() {
        ComboPooledDataSource comboPooledDataSource = new ComboPooledDataSource();
        try {
            comboPooledDataSource.setDriverClass(env.getProperty("jdbc.driver"));
        } catch (PropertyVetoException e) {
            e.printStackTrace();
        }
        comboPooledDataSource.setJdbcUrl(env.getProperty("jdbc.url"));
        comboPooledDataSource.setUser(env.getProperty("jdbc.username"));
        comboPooledDataSource.setPassword(env.getProperty("jdbc.password"));
        comboPooledDataSource.setInitialPoolSize(Integer.valueOf(env.getProperty("datasource.pool.initialPoolSize")));
        return comboPooledDataSource;
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        final HibernateTransactionManager transactionManager = new HibernateTransactionManager();
        transactionManager.setSessionFactory(sessionFactory().getObject());
        return transactionManager;
    }

    @Bean
    public PersistenceExceptionTranslationPostProcessor exceptionTranslation() {
        return new PersistenceExceptionTranslationPostProcessor();
    }

    private final Properties hibernateProperties() {
        final Properties hibernateProperties = new Properties();
        hibernateProperties.setProperty("hibernate.dialect", env.getProperty("hibernate.dialect"));
        hibernateProperties.setProperty("hibernate.show_sql", env.getProperty("hibernate.show_sql"));
        hibernateProperties.setProperty("hibernate.generate_statistics", env.getProperty("hibernate.generate_statistics"));
        hibernateProperties.setProperty("hibernate.jdbc.fetch_size", env.getProperty("hibernate.jdbc.fetch_size"));
        hibernateProperties.setProperty("hibernate.jdbc.batch_size", env.getProperty("hibernate.jdbc.batch_size"));
        hibernateProperties.setProperty("hibernate.max_fetch_depth", env.getProperty("hibernate.max_fetch_depth"));
        hibernateProperties.setProperty("hibernate.cache.use_second_level_cache", env.getProperty("hibernate.cache.use_second_level_cache"));
        hibernateProperties.setProperty("hibernate.cache.use_query_cache", env.getProperty("hibernate.cache.use_query_cache"));
        // hibernateProperties.setProperty("hibernate.cache.provider_class", env.getProperty("hibernate.cache.provider_class"));
        hibernateProperties.setProperty("hibernate.hbm2ddl.auto", "update");
        return hibernateProperties;
    }
}
I use Hibernate's LocalSessionFactoryBean rather than JPA's EntityManager. I wonder if this is the cause?
--------------- 6.19 --------------
I was wrong. I shouldn't use the @EntityListeners annotation with Spring's LocalSessionFactoryBean.
For Hibernate 5, there is a special way to configure this.
http://docs.jboss.org/hibernate/orm/5.2/userguide/html_single/Hibernate_User_Guide.html#annotations-jpa-entitylisteners
Now I have modified my code as follows.
@Component
public class EntityEventListener {

    @Autowired
    private SessionFactory sessionFactory;

    @PostConstruct
    public void registerListeners() {
        EventListenerRegistry eventListenerRegistry = ((SessionFactoryImplementor) sessionFactory).getServiceRegistry().getService(EventListenerRegistry.class);
        eventListenerRegistry.prependListeners(EventType.PRE_INSERT, PreInsertEntityListener.class);
    }
}
PreInsertEntityListener
public class PreInsertEntityListener implements PreInsertEventListener {

    @Override
    public boolean onPreInsert(PreInsertEvent event) {
        // if (event.getEntity() instanceof AdminUser) {
        //     ((AdminUser) event.getEntity()).setCreateDate(new Date());
        //     ((AdminUser) event.getEntity()).setModifyDate(new Date());
        // }
        BaseEntity baseEntity = (BaseEntity) event.getEntity();
        baseEntity.setCreateDate(new Date());
        baseEntity.setModifyDate(new Date());
        return false;
    }
}
But I have another problem.
I have read the Hibernate doc and searched for a lot of information about this, yet my code still doesn't work when I insert entity data.
Please help me, thanks!
Although you posted neither the concrete/derived entity nor the business code that persists it, the code you posted seems correct.
To give it a small test, I added a generated serialVersionUID to the super class and created a concrete entity:
import javax.persistence.Entity;

@Entity
public class DerivedEntity extends BaseEntity {
    private static final long serialVersionUID = -6441043639437893962L;
}
And since you mentioned Spring, here is a Spring Data JPA repository to save it:
import org.springframework.data.repository.CrudRepository;
import org.springframework.stereotype.Repository;

@Repository
public interface DerivedEntityRepository extends CrudRepository<DerivedEntity, Long> {
}
This small test should show that the (@PrePersist) listener works:
import static org.assertj.core.api.Assertions.assertThat;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringRunner.class)
@Transactional
@SpringBootTest
public class DerivedEntityRepositoryTests {

    @Autowired
    private DerivedEntityRepository derivedEntityRepository;

    @Test
    public void insertDerivedEntity() {
        DerivedEntity entity = new DerivedEntity();
        entity = derivedEntityRepository.save(entity);
        assertThat(entity.getCreateDate()).isNotNull();
    }
}
And just to mention it, if you don't want to enhance your custom listener in the future, the existing Spring Data JPA AuditingEntityListener does exactly what you are doing at the moment (and even more). In this case you could just enhance a @Configuration class with @EnableJpaAuditing and modify your BaseEntity as follows:
@MappedSuperclass
@EntityListeners(AuditingEntityListener.class)
public class BaseEntity implements Serializable {

    // ...

    @CreatedDate
    @Column(nullable = false, updatable = false)
    private Date createDate;

    @LastModifiedDate
    @Column(nullable = false)
    private Date modifyDate;

    // ...
}
That would make your custom EntityListener dispensable.
Just take a look at Spring JPA Auditing for more information. If you want to enhance auditing with Hibernate, try Hibernate Envers.
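For completeness, a minimal sketch of enabling it (assuming Spring Data JPA is on the classpath; the class name is arbitrary):

@Configuration
@EnableJpaAuditing
public class AuditConfig {
    // Nothing else is needed here; @EnableJpaAuditing registers the
    // auditing infrastructure that backs AuditingEntityListener.
}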
I ran into this same issue, and in my case the listener defined with @EntityListeners was referring to a class (not in the same classloader) in another package, and it wasn't being scanned. After adding the class to my persistence context it began working as expected.
So always make sure that any classes related to persistence are added to the persistence context.
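In the setup from this question, that would mean making sure the scanned packages also cover the package holding the listener. A sketch based on the sessionFactory bean above (the second package name is hypothetical):

@Bean
public LocalSessionFactoryBean sessionFactory() {
    final LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
    sessionFactory.setDataSource(dataSource());
    // Scan the entity package and the (hypothetical) package holding the listener class.
    sessionFactory.setPackagesToScan(
            "com.yeager.admin.persistence.entity",
            "com.yeager.admin.persistence.listener");
    sessionFactory.setHibernateProperties(hibernateProperties());
    return sessionFactory;
}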
Thanks very much, everyone. I have resolved this problem.
I will share my solution; I hope it's helpful if you are doing the same thing.
First, my starting point was wrong. Because I had used JPA before, I instinctively reached for the @EntityListeners annotation when integrating Spring 4 and Hibernate 5.
Then I read the Hibernate doc and many relevant articles and found there is a different way to implement an entity listener. See the Hibernate doc.
Finally, my solution is as follows.
This is my BaseEntity class.
@MappedSuperclass
public class BaseEntity implements Serializable {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    @Column(nullable = false, updatable = false)
    private Date createDate;

    @Column(nullable = false)
    private Date modifyDate;

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public Date getCreateDate() {
        return createDate;
    }

    public void setCreateDate(Date createDate) {
        this.createDate = createDate;
    }

    public Date getModifyDate() {
        return modifyDate;
    }

    public void setModifyDate(Date modifyDate) {
        this.modifyDate = modifyDate;
    }
}
First of all, you need to define the EntityListener class.
public class EntityListener implements PreInsertEventListener, PreUpdateEventListener {

    private static final String CREATE_DATE_PROPERTY = "createDate";
    private static final String MODIFY_DATE_PROPERTY = "modifyDate";

    @Override
    public boolean onPreInsert(PreInsertEvent event) {
        if (event.getEntity() instanceof BaseEntity) {
            // property names of the entity
            String[] propertyNames = event.getPersister().getEntityMetamodel().getPropertyNames();
            // property values of the entity
            Object[] state = event.getState();
            for (int i = 0; i < propertyNames.length; i++) {
                if (CREATE_DATE_PROPERTY.equals(propertyNames[i]) || MODIFY_DATE_PROPERTY.equals(propertyNames[i])) {
                    state[i] = new Date();
                }
            }
        }
        return false;
    }

    @Override
    public boolean onPreUpdate(PreUpdateEvent event) {
        if (event.getEntity() instanceof BaseEntity) {
            // property names of the entity
            String[] propertyNames = event.getPersister().getEntityMetamodel().getPropertyNames();
            // property values of the entity
            Object[] state = event.getState();
            for (int i = 0; i < propertyNames.length; i++) {
                if (MODIFY_DATE_PROPERTY.equals(propertyNames[i])) {
                    state[i] = new Date();
                }
            }
        }
        return false;
    }
}
Last, you should register the entity event listener.
@SuppressWarnings("unchecked")
@Component
public class EntityEventListenerRegistry {

    @Autowired
    private SessionFactory sessionFactory;

    /**
     * EventListenerRegistry: http://docs.jboss.org/hibernate/orm/5.2/userguide/html_single/Hibernate_User_Guide.html#annotations-jpa-entitylisteners
     */
    @PostConstruct
    public void registerListeners() {
        EventListenerRegistry eventListenerRegistry = ((SessionFactoryImplementor) sessionFactory).getServiceRegistry().getService(EventListenerRegistry.class);
        eventListenerRegistry.prependListeners(EventType.PRE_INSERT, EntityListener.class);
        eventListenerRegistry.prependListeners(EventType.PRE_UPDATE, EntityListener.class);
    }
}

JobLauncherTestUtils throws NoUniqueBeanDefinitionException while trying to test Spring Batch steps

I am using Spring Boot and Spring Batch. I have defined more than one job.
I am trying to build a JUnit test for a specific step within a job.
Therefore I am using the JobLauncherTestUtils library.
When I run my test case I always get a NoUniqueBeanDefinitionException.
This is my test class:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = {BatchConfiguration.class})
public class ProcessFileJobTest {

    @Configuration
    @EnableBatchProcessing
    static class TestConfig {

        @Autowired
        private JobBuilderFactory jobBuilder;

        @Autowired
        private StepBuilderFactory stepBuilder;

        @Bean
        public JobLauncherTestUtils jobLauncherTestUtils() {
            JobLauncherTestUtils jobLauncherTestUtils = new JobLauncherTestUtils();
            jobLauncherTestUtils.setJob(jobUnderTest());
            return jobLauncherTestUtils;
        }

        @Bean
        public Job jobUnderTest() {
            return jobBuilder.get("job-under-test")
                    .start(processIdFileStep())
                    .build();
        }

        @Bean
        public Step processIdFileStep() {
            return stepBuilder.get("processIdFileStep")
                    .<PushItemDTO, PushItemDTO>chunk(1) // important to be one in this case, to commit after every line read
                    .reader(reader(null))
                    .processor(processor(null, null, null, null))
                    .writer(writer())
                    // .faultTolerant()
                    // .skipLimit(10) // default is set to 0
                    // .skip(MySQLIntegrityConstraintViolationException.class)
                    .build();
        }

        @Bean
        @Scope(value = "step", proxyMode = ScopedProxyMode.INTERFACES)
        public ItemStreamReader<PushItemDTO> reader(@Value("#{jobExecutionContext[filePath]}") String filePath) {
            ...
            return itemReader;
        }

        @Bean
        @Scope(value = "step", proxyMode = ScopedProxyMode.INTERFACES)
        public ItemProcessor<PushItemDTO, PushItemDTO> processor(@Value("#{jobParameters[pushMessage]}") String pushMessage,
                                                                 @Value("#{jobParameters[jobId]}") String jobId,
                                                                 @Value("#{jobParameters[taskId]}") String taskId,
                                                                 @Value("#{jobParameters[refId]}") String refId) {
            return new PushItemProcessor(pushMessage, jobId, taskId, refId);
        }

        @Bean
        public LineMapper<PushItemDTO> lineMapper() {
            DefaultLineMapper<PushItemDTO> lineMapper = new DefaultLineMapper<PushItemDTO>();
            ...
            return lineMapper;
        }

        @Bean
        public ItemWriter writer() {
            return new someWriter();
        }
    }

    @Autowired
    protected JobLauncher jobLauncher;

    @Autowired
    JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void processIdFileStepTest1() throws Exception {
        JobParameters jobParameters = new JobParametersBuilder().addString("filePath", "C:\\etc\\files\\2015_02_02").toJobParameters();
        JobExecution jobExecution = jobLauncherTestUtils.launchStep("processIdFileStep", jobParameters);
    }
And that's the exception:
Caused by: org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [org.springframework.batch.core.Job] is defined: expected single matching bean but found 3: jobUnderTest,executeToolJob,processFileJob
Any idea?
Thanks.
Added the BatchConfiguration class:
package com.mycompany.notification_processor_service.batch.config;

import com.mycompany.notification_processor_service.common.config.CommonConfiguration;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.*;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

import javax.sql.DataSource;

@ComponentScan("com.mycompany.notification_processor_service.batch")
@PropertySource("classpath:application.properties")
@Configuration
@Import({CommonConfiguration.class})
@ImportResource({"classpath:applicationContext-pushExecuterService.xml"/*,"classpath:si/integration-context.xml"*/})
public class BatchConfiguration {

    @Value("${database.driver}")
    private String databaseDriver;

    @Value("${database.url}")
    private String databaseUrl;

    @Value("${database.username}")
    private String databaseUsername;

    @Value("${database.password}")
    private String databasePassword;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(databaseDriver);
        dataSource.setUrl(databaseUrl);
        dataSource.setUsername(databaseUsername);
        dataSource.setPassword(databasePassword);
        return dataSource;
    }

    @Bean
    public JdbcTemplate jdbcTemplate(DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
And this is CommonConfiguration:
@ComponentScan("com.mycompany.notification_processor_service")
@Configuration
@EnableJpaRepositories(basePackages = {"com.mycompany.notification_processor_service.common.repository.jpa"})
@EnableCouchbaseRepositories(basePackages = {"com.mycompany.notification_processor_service.common.repository.couchbase"})
@EntityScan({"com.mycompany.notification_processor_service"})
@EnableAutoConfiguration
@EnableTransactionManagement
@EnableAsync
public class CommonConfiguration {
}
I had the same issue, and the easiest way is injecting via the setter of JobLauncherTestUtils, like Mariusz explained in the Spring JIRA:
@Bean
public JobLauncherTestUtils getJobLauncherTestUtils() {
    return new JobLauncherTestUtils() {
        @Override
        @Autowired
        public void setJob(@Qualifier("ncsvImportJob") Job job) {
            super.setJob(job);
        }
    };
}
So I see the jobUnderTest bean. Somewhere in all those imports, you're importing the two other jobs as well. I see your BatchConfiguration class imports other stuff as well as you having component scanning turned on. Carefully trace through all your configurations. Something is picking up the definitions for those beans.
I also ran into this issue and couldn't get JobLauncherTestUtils to work properly. It might be caused by this issue.
I ended up autowiring the SimpleJobLauncher and my Job into the unit test, and simply calling
launcher.run(importAccountingDetailJob, params);
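A sketch of that approach (the job and file-path names are placeholders taken from this question; qualifying the Job by bean name is what avoids the NoUniqueBeanDefinitionException):

@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = {BatchConfiguration.class})
public class ImportAccountingDetailJobTest {

    @Autowired
    private SimpleJobLauncher launcher;

    // Qualify by bean name so the multiple Job beans don't clash.
    @Autowired
    @Qualifier("importAccountingDetailJob")
    private Job importAccountingDetailJob;

    @Test
    public void launchesJob() throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addString("filePath", "C:\\etc\\files\\2015_02_02")
                .toJobParameters();
        JobExecution execution = launcher.run(importAccountingDetailJob, params);
        assertEquals(BatchStatus.COMPLETED, execution.getStatus());
    }
}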
An old post, but I thought I'd provide my solution as well.
In this case I am automatically registering a JobLauncherTestUtils per job:
@Configuration
public class TestConfig {

    private static final Logger logger = LoggerFactory.getLogger(TestConfig.class);

    @Autowired
    private AbstractAutowireCapableBeanFactory beanFactory;

    @Autowired
    private List<Job> jobs;

    @PostConstruct
    public void registerServices() {
        jobs.forEach(j -> {
            JobLauncherTestUtils u = create(j);
            final String name = j.getName() + "TestUtils";
            beanFactory.registerSingleton(name, u);
            beanFactory.autowireBean(u);
            logger.info("Registered JobLauncherTestUtils {}", name);
        });
    }

    private JobLauncherTestUtils create(final Job j) {
        return new MyJobLauncherTestUtils(j);
    }

    private static class MyJobLauncherTestUtils extends JobLauncherTestUtils {

        MyJobLauncherTestUtils(Job j) {
            this.setJob(j);
        }

        @Override // to remove @Autowired from the base class
        public void setJob(Job job) {
            super.setJob(job);
        }
    }
}
