How can we take the result of `MethodInvokingTaskletAdapter` as a reader in the Spring Batch Step?

How can we take the result of MethodInvokingTaskletAdapter as a reader in the Spring Batch Step? Reference - https://docs.spring.io/spring-batch/docs/current/reference/html/index-single.html#taskletStep and https://github.com/spring-projects/spring-batch/pull/567
Here is the code that I developed
JobConfiguration.java
@Configuration
public class JobConfiguration {
@Autowired
private JobBuilderFactory jobBuilderFactory;
@Autowired
private StepBuilderFactory stepBuilderFactory;
@Bean
public CustomService service() {
return new CustomService();
}
@StepScope
@Bean
public MethodInvokingTaskletAdapter methodInvokingTasklet() {
MethodInvokingTaskletAdapter methodInvokingTaskletAdapter = new MethodInvokingTaskletAdapter();
methodInvokingTaskletAdapter.setTargetObject(service());
methodInvokingTaskletAdapter.setTargetMethod("getEmployees");
return methodInvokingTaskletAdapter;
}
@Bean
public Job methodInvokingJob() {
return this.jobBuilderFactory.get("methodInvokingJob")
.start(methodInvokingStep())
.build();
}
@Bean
public Step methodInvokingStep() {
// Looking to configure a chunk-based step here; don't know how to do it using MethodInvokingTaskletAdapter
return this.stepBuilderFactory.get("methodInvokingStep")
.tasklet(methodInvokingTasklet())
.build();
}
}
CustomService.java
public class CustomService {
public void serviceMethod(String message) {
System.out.println(message);
}
public void invokeMethod() {
System.out.println("=============== Your method has executed !");
}
public List<Employee> getEmployees(){
// In the real world, it will be a GET API call to XYZ system
List<Employee> employees = new ArrayList<>();
employees.add(Employee.builder().firstName("Ravi").lastName("Shankar").email("ravi.shankar@gmail.com").age(30).build());
employees.add(Employee.builder().firstName("Parag").lastName("Rane").email("parag.rane@gmail.com").age(11).build());
employees.add(Employee.builder().firstName("Priya").lastName("Pande").email("priya.pande@gmail.com").age(40).build());
employees.add(Employee.builder().firstName("Kiran").lastName("khot").email("kiran.khot@gmail.com").age(50).build());
return employees;
}
}
Employee.java
@Data
@AllArgsConstructor
@NoArgsConstructor
@Builder
public class Employee {
private String firstName;
private String lastName;
private String email;
private int age;
}
MethodInvokingTaskletApplication.java
@EnableBatchProcessing
@SpringBootApplication
public class MethodInvokingTaskletApplication {
public static void main(String[] args) {
SpringApplication.run(MethodInvokingTaskletApplication.class, args);
}
}

To answer your question, you can't. The MethodInvokingTaskletAdapter is meant to adapt a POJO to a Tasklet. We have an ItemReaderAdapter that you can use to adapt a POJO to an ItemReader. You can read about it in the documentation here: https://docs.spring.io/spring-batch/docs/current/api/org/springframework/batch/item/adapter/ItemReaderAdapter.html
Now you'll have an issue with your service as configured in that each call to the delegating POJO is considered an item. That means that your item as configured will be a List<Employee> instead of just an Employee. Given your configuration states it's not the real service, I'll assume that your real service should return an Employee per call and null once the results are exhausted.
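For illustration, here is a minimal sketch (the per-call behaviour is my assumption, not your real service) of what such a per-item service could look like. It returns one Employee per invocation and null once the results are exhausted, which is what ItemReaderAdapter expects from each read():

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class CustomService {

    // Hypothetical per-item variant: the adapter calls this method once per read()
    private Iterator<Employee> iterator;

    public Employee getEmployees() {
        if (iterator == null) {
            iterator = fetchEmployees().iterator(); // lazily fetch once, e.g. the GET API call
        }
        // Returning null tells the ItemReaderAdapter that the input is exhausted
        return iterator.hasNext() ? iterator.next() : null;
    }

    private List<Employee> fetchEmployees() {
        List<Employee> employees = new ArrayList<>();
        employees.add(Employee.builder().firstName("Ravi").lastName("Shankar").email("ravi.shankar@gmail.com").age(30).build());
        return employees;
    }
}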
To update your configuration (with your service as it is configured in your question) in your sample:
...
@StepScope
@Bean
public ItemReaderAdapter itemReader() {
ItemReaderAdapter reader = new ItemReaderAdapter();
reader.setTargetObject(service());
reader.setTargetMethod("getEmployees");
return reader;
}
@Bean
public Job methodInvokingJob() {
return this.jobBuilderFactory.get("methodInvokingJob")
.start(methodInvokingStep())
.build();
}
@Bean
public Step methodInvokingStep() {
return this.stepBuilderFactory.get("methodInvokingStep")
.<List<Employee>, List<Employee>>chunk(10)
.reader(itemReader())
// You'll need to define a writer...
.writer(itemWriter())
.build();
}
...
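As a minimal sketch of the missing writer (purely illustrative; it just prints each item, where an item here is the whole List<Employee> returned per call, matching the chunk generics above):

@Bean
public ItemWriter<List<Employee>> itemWriter() {
    // Each "item" is the full List<Employee> returned by one call to getEmployees()
    return items -> items.forEach(System.out::println);
}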

Related

Spring Boot Jackson Date Serialize

I am facing a little problem with Jackson on Spring Boot web; I'm expecting this kind of response:
{
'date': 'string of date'
}
but I get back this:
[
0: java.sql.date,
1: long number of timestamp
]
How can I configure Spring Boot to avoid this?
@Configuration
public class BootConfiguration {
@Bean
@Primary
public PasswordEncoder passwordEncoder() {
return PasswordEncoderFactories.createDelegatingPasswordEncoder();
}
@Bean
public JavaTimeModule javaTimeModule() {
return new JavaTimeModule();
}
@Bean
public Jdk8Module jdk8TimeModule() {
return new Jdk8Module();
}
@Bean
public CoreJackson2Module coreJackson2Module() {
return new CoreJackson2Module();
}
@Bean
public OAuth2AuthorizationServerJackson2Module authorizationServerJackson2Module(){
return new OAuth2AuthorizationServerJackson2Module();
}
public List<Module> securityModules(){
ClassLoader classLoader = JdbcOAuth2AuthorizationService.class.getClassLoader();
return SecurityJackson2Modules.getModules(classLoader);
}
@Bean
@Primary
@Order(Ordered.HIGHEST_PRECEDENCE)
public ObjectMapper objectMapper() {
ObjectMapper mapper = new ObjectMapper().findAndRegisterModules();
mapper.registerModule(coreJackson2Module());
mapper.registerModule(javaTimeModule());
mapper.registerModule(jdk8TimeModule());
mapper.registerModules(securityModules());
mapper.registerModule(authorizationServerJackson2Module());
mapper.addMixIn(UserPrincipal.class, Object.class);
return mapper;
}
}
This is the controller method:
@GetMapping(value = "/authentications", params = {"pge","lmt", "slug"})
public ResponseEntity<?> getLastLogins(
Authentication authentication,
@RequestParam("lmt")Integer limit,
@RequestParam("pge")Integer page,
@RequestParam("slug")String slug){
return new ResponseEntity<>(loginManager.findAllAuthentications(authentication.getName(),slug,limit,page), HttpStatus.OK);
}
}
And this is the model containing the properties I am trying to get back:
@Data
public class User implements Serializable {
private String firstName;
private String lastName;
private Date date;
private Role role;
}

How to use error-channel for catching exception in Spring Integration?

What am I trying to do: I am new to Spring Integration and have already read many similar questions about error handling, but I don't understand how to catch exceptions using error-channel.
What I have done so far:
@EnableIntegration
@IntegrationComponentScan
@Configuration
public class TcpClientConfig implements ApplicationEventPublisherAware {
private ApplicationEventPublisher applicationEventPublisher;
private final ConnectionProperty connectionProperty;
@Override
public void setApplicationEventPublisher(ApplicationEventPublisher applicationEventPublisher) {
this.applicationEventPublisher = applicationEventPublisher;
}
TcpClientConfig(ConnectionProperty connectionProperty) {
this.connectionProperty = connectionProperty;
}
@Bean
public AbstractClientConnectionFactory clientConnectionFactory() {
TcpNioClientConnectionFactory tcpNioClientConnectionFactory =
getTcpNioClientConnectionFactoryOf(
connectionProperty.getPrimaryHSMServerIpAddress(),
connectionProperty.getPrimaryHSMServerPort());
final List<AbstractClientConnectionFactory> fallBackConnections = getFallBackConnections();
fallBackConnections.add(tcpNioClientConnectionFactory);
final FailoverClientConnectionFactory failoverClientConnectionFactory =
new FailoverClientConnectionFactory(fallBackConnections);
return new CachingClientConnectionFactory(
failoverClientConnectionFactory, connectionProperty.getConnectionPoolSize());
}
@Bean
DefaultTcpNioSSLConnectionSupport connectionSupport() {
final DefaultTcpSSLContextSupport defaultTcpSSLContextSupport =
new DefaultTcpSSLContextSupport(
connectionProperty.getKeystorePath(),
connectionProperty.getTrustStorePath(),
connectionProperty.getKeystorePassword(),
connectionProperty.getTruststorePassword());
final String protocol = "TLSv1.2";
defaultTcpSSLContextSupport.setProtocol(protocol);
return new DefaultTcpNioSSLConnectionSupport(defaultTcpSSLContextSupport, false);
}
@Bean
public MessageChannel outboundChannel() {
return new DirectChannel();
}
@Bean
@ServiceActivator(inputChannel = "outboundChannel")
public MessageHandler outboundGateway(AbstractClientConnectionFactory clientConnectionFactory) {
TcpOutboundGateway tcpOutboundGateway = new TcpOutboundGateway();
tcpOutboundGateway.setConnectionFactory(clientConnectionFactory);
return tcpOutboundGateway;
}
@Bean
@ServiceActivator(inputChannel = "error-channel")
public void handleError(ErrorMessage em) {
throw new RuntimeException(String.valueOf(em));
}
private List<AbstractClientConnectionFactory> getFallBackConnections() {
final int size = connectionProperty.getAdditionalHSMServersConfig().size();
List<AbstractClientConnectionFactory> collector = new ArrayList<>(size);
for (final Map.Entry<String, Integer> server :
connectionProperty.getAdditionalHSMServersConfig().entrySet()) {
collector.add(getTcpNioClientConnectionFactoryOf(server.getKey(), server.getValue()));
}
return collector;
}
private TcpNioClientConnectionFactory getTcpNioClientConnectionFactoryOf(
final String ipAddress, final int port) {
TcpNioClientConnectionFactory tcpNioClientConnectionFactory =
new TcpNioClientConnectionFactory(ipAddress, port);
tcpNioClientConnectionFactory.setUsingDirectBuffers(true);
tcpNioClientConnectionFactory.setDeserializer(new CustomDeserializer());
tcpNioClientConnectionFactory.setApplicationEventPublisher(applicationEventPublisher);
tcpNioClientConnectionFactory.setSoKeepAlive(true);
tcpNioClientConnectionFactory.setConnectTimeout(connectionProperty.getConnectionTimeout());
tcpNioClientConnectionFactory.setSoTcpNoDelay(true);
tcpNioClientConnectionFactory.setTcpNioConnectionSupport(connectionSupport());
return tcpNioClientConnectionFactory;
}
}
Gateway
@Component
@MessagingGateway(defaultRequestChannel = "outboundChannel", errorChannel = "error-channel")
public interface TcpClientGateway {
String send(String message);
}
Also, currently I am facing:
required a bean of type org.springframework.messaging.support.ErrorMessage that could not be found
I need some assistance!
Thanking you in advance,
EDIT
@AllArgsConstructor
@Service
public class AsyncNonBlockingClient implements Connector {
TcpClientGateway tcpClientGateway;
@Override
public String send(final String payload) {
return tcpClientGateway.send(payload);
}
}
Your problem is here; see the documentation about messaging annotations on @Bean methods: https://docs.spring.io/spring-integration/docs/current/reference/html/configuration.html#annotations_on_beans
@Bean
@ServiceActivator(inputChannel = "error-channel")
public void handleError(ErrorMessage em) {
This is a plain POJO method, therefore it cannot be marked with @Bean. You really use @Bean for beans you want to expose. Then you decide whether that has to be a @ServiceActivator or not. So, just remove @Bean from this method and your error-channel consumer should be OK.
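Applied to the configuration above, the corrected consumer is a short sketch of exactly that: the same method, with @Bean dropped and @ServiceActivator kept.

@ServiceActivator(inputChannel = "error-channel")
public void handleError(ErrorMessage em) {
    throw new RuntimeException(String.valueOf(em));
}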

How to run a job multiple times in parallel with different Excel files as input in Spring Batch

I have a use case where users upload different Excel files; each file is processed in parallel, and every row of the Excel needs to be saved in an H2 database together with the job execution id.
But here is the issue I am facing: when a user uploads the first file, the back-end processing saves every row of that Excel with job id 1, but if they upload another Excel file with different data before the first one completes, the data belonging to the first Excel also gets saved with the latest job execution id, which is 2. How can I resolve this so that each job's data is saved with that particular job's id while different jobs run in parallel? This is the data for the first Excel sheet, this is the data for the second Excel sheet, and this is the output saved in the H2 database.
This is the service class
@Service
public class BatchTestService {
@Autowired
public JobBuilderFactory jobBuilderFactory;
@Autowired
public StepBuilderFactory stepBuilderFactory;
@Autowired
private JobLauncher batchJobLauncher;
@Autowired
private WriterImpl writer;
public Job job(byte data[]) {
return jobBuilderFactory.get("job")
.incrementer(new RunIdIncrementer())
.flow(step(data))
.end()
.build();
}
@SneakyThrows
public PoiItemReader<TestEntity> reader(byte[] data) {
ReaderImpl reader = new ReaderImpl();
reader.setLinesToSkip(1);
reader.setResource(toResource(data, "TEST"));
reader.setRowMapper(new MapperClass());
return reader;
}
public Step step(byte data[]) {
return stepBuilderFactory.get("step").<TestEntity, TestEntity>chunk(2).reader(reader(data))
.writer(writer)
.build();
}
public ThreadPoolTaskExecutor getThreadPoolTaskExecutor(){
ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
taskExecutor.setCorePoolSize(2);
taskExecutor.setMaxPoolSize(4);
taskExecutor.setThreadNamePrefix("test");
taskExecutor.afterPropertiesSet();
return taskExecutor;
}
public void uploadExcel(MultipartFile file)
throws Exception {
String jobId = String.valueOf(System.currentTimeMillis());
JobParameters parameters = new JobParametersBuilder().addString("jobId", jobId)
.toJobParameters();
((SimpleJobLauncher) batchJobLauncher).setTaskExecutor(getThreadPoolTaskExecutor());
batchJobLauncher.run(job(file.getBytes()), parameters);
}
public static Resource toResource(byte bytesFile[], String sheetName) throws IOException {
ByteArrayInputStream bin = new ByteArrayInputStream(bytesFile);
XSSFWorkbook workbook = new XSSFWorkbook(bin);
var outputStream = new ByteArrayOutputStream();
workbook.write(outputStream);
return new ByteArrayResource(outputStream.toByteArray());
}
}
This is the config class.
@Configuration
public class BatchDataSourceConfig {
@Value("${spring.datasource.driver-class-name}")
private String driverName;
@Value("${spring.datasource.url}")
private String url;
@Bean
public DataSource dataSource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setDriverClassName(driverName);
dataSource.setUrl(url);
dataSource.setUsername("sa");
dataSource.setPassword("");
return dataSource;
}
@Bean
public JobLauncher batchJobLauncher(JobRepository jobRepository) {
SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
jobLauncher.setJobRepository(jobRepository);
return jobLauncher;
}
}
This is reader class
public class ReaderImpl extends PoiItemReader<TestEntity> {}
This is writer class
@Component
public class WriterImpl implements ItemWriter<TestEntity> {
private static Logger logger = LoggerFactory.getLogger(WriterImpl.class);
@Autowired
private TestEntityRepository testEntityRepository;
private StepExecution stepExecution;
@BeforeStep
public void beforeStep(final StepExecution stepExecution) {
this.stepExecution = stepExecution;
}
@Override
@SneakyThrows
public void write(List<? extends TestEntity> modelObjectList) {
logger.info("Writer is reached...");
Thread.sleep(3000);
for (TestEntity testEntity : modelObjectList) {
testEntity.setJobExecutionId(stepExecution.getJobExecutionId());
testEntityRepository.save(testEntity);
}
}
}
And the respective RowMapper class is also defined.
public class MapperClass implements RowMapper<TestEntity> {
@Override
public TestEntity mapRow(RowSet rowSet) {
TestEntity testEntity = new TestEntity();
testEntity.setStudentName(rowSet.getColumnValue(0));
testEntity.setRollNo(rowSet.getColumnValue(1));
testEntity.setSection(rowSet.getColumnValue(2));
return testEntity;
}
}
This is the model class
@AllArgsConstructor
@Data
@Entity
@NoArgsConstructor
@Table(name = "TEST_ENTITY")
public class TestEntity {
@GeneratedValue(strategy = GenerationType.AUTO)
@Id
private Integer id;
private String studentName;
private String rollNo;
private String section;
private Long jobExecutionId;
}
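One likely cause (my assumption, not stated in the post) is that WriterImpl is a singleton @Component shared by both concurrently running jobs, so the stepExecution captured in @BeforeStep is overwritten by whichever job started last. A minimal sketch of a step-scoped writer (hypothetical class name), where each running step gets its own instance and therefore its own job execution id:

@Component
@StepScope
public class StepScopedWriter implements ItemWriter<TestEntity> {

    @Autowired
    private TestEntityRepository testEntityRepository;

    // Resolved from the step context, so each parallel job execution sees its own id
    @Value("#{stepExecution.jobExecutionId}")
    private Long jobExecutionId;

    @Override
    public void write(List<? extends TestEntity> items) {
        for (TestEntity testEntity : items) {
            testEntity.setJobExecutionId(jobExecutionId);
            testEntityRepository.save(testEntity);
        }
    }
}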

How to use ClassifierCompositeItemProcessor in Spring Batch and write data into same table for Insert and Upsert?

I went through the link - https://github.com/spring-projects/spring-batch/blob/master/spring-batch-infrastructure/src/test/java/org/springframework/batch/item/support/ClassifierCompositeItemProcessorTests.java, but could not get much out of it.
I am trying to replace ETL Informatica mapping logic with Spring Batch. I am looking to separate records with Status=I and Status=U into separate (individual) processors and then perform lookups and massage the data. Records with Status=I are written directly into the table; for Status=U, another complex piece of logic (lookups, massaging, and match-and-merge) is performed, and those records are then upserted into the same table.
I've tried to do a POC where I segregate the records in the processor.
CustomerClassifier.java
public class CustomerClassifier implements Classifier<Customer, ItemProcessor<Customer, Customer>> {
private ItemProcessor<Customer, Customer> insertCustomerProcessor;
private ItemProcessor<Customer, Customer> updateCustomerProcessor;
public CustomerClassifier(ItemProcessor<Customer, Customer> insertCustomerProcessor, ItemProcessor<Customer, Customer> updateCustomerProcessor) {
this.insertCustomerProcessor = insertCustomerProcessor;
this.updateCustomerProcessor = updateCustomerProcessor;
}
@Override
public ItemProcessor<Customer, Customer> classify(Customer customer) {
return customer.getStatus().equals("I") ? insertCustomerProcessor : updateCustomerProcessor;
}
}
OddCustomerProcessor.java
public class OddCustomerProcessor implements ItemProcessor<Customer, Customer> {
@Override
public Customer process(Customer item) throws Exception {
Customer customer = new Customer();
// Perform some massaging and lookups here
customer.setId(item.getId());
customer.setFirstName(item.getFirstName());
customer.setLastName(item.getLastName());
customer.setBirthdate(item.getBirthdate());
customer.setStatus(item.getStatus());
return customer;
}
}
EvenCustomerProcessor.java
public class EvenCustomerProcessor implements ItemProcessor<Customer, Customer> {
@Override
public Customer process(Customer item) throws Exception {
Customer customer = new Customer();
// Perform some massaging and lookups here
customer.setId(item.getId());
customer.setFirstName(item.getFirstName());
customer.setLastName(item.getLastName());
customer.setBirthdate(item.getBirthdate());
customer.setStatus(item.getStatus());
return customer;
}
}
CustomLineAggregator.java
public class CustomLineAggregator implements LineAggregator<Customer> {
private ObjectMapper objectMapper = new ObjectMapper();
@Override
public String aggregate(Customer item) {
try {
return objectMapper.writeValueAsString(item);
} catch (Exception e) {
throw new RuntimeException("Unable to serialize Customer", e);
}
}
}
Customer.java
@Data
@AllArgsConstructor
@Builder
@NoArgsConstructor
public class Customer {
private Long id;
private String firstName;
private String lastName;
private String birthdate;
private String status;
}
Error-
The method setClassifier(Classifier<? super Customer,ItemProcessor<?,? extends Customer>>) in the type ClassifierCompositeItemProcessor<Customer,Customer> is not applicable for the
arguments (CustomerClassifier)
Configuration
@Configuration
public class JobConfiguration {
@Autowired
private JobBuilderFactory jobBuilderFactory;
@Autowired
private StepBuilderFactory stepBuilderFactory;
@Autowired
private DataSource dataSource;
@Bean
public JdbcPagingItemReader<Customer> customerPagingItemReader(){
// reading database records using JDBC in a paging fashion
JdbcPagingItemReader<Customer> reader = new JdbcPagingItemReader<>();
reader.setDataSource(this.dataSource);
reader.setFetchSize(1000);
reader.setRowMapper(new CustomerRowMapper());
// Sort Keys
Map<String, Order> sortKeys = new HashMap<>();
sortKeys.put("id", Order.ASCENDING);
// MySQL implementation of a PagingQueryProvider using database specific features.
MySqlPagingQueryProvider queryProvider = new MySqlPagingQueryProvider();
queryProvider.setSelectClause("id, firstName, lastName, birthdate");
queryProvider.setFromClause("from customer");
queryProvider.setSortKeys(sortKeys);
reader.setQueryProvider(queryProvider);
return reader;
}
@Bean
public EvenCustomerProcessor evenCustomerProcessor() {
return new EvenCustomerProcessor();
}
@Bean
public OddCustomerProcessor oddCustomerProcessor() {
return new OddCustomerProcessor();
}
@Bean
public JdbcBatchItemWriter<Customer> customerItemWriter(){
JdbcBatchItemWriter<Customer> batchItemWriter = new JdbcBatchItemWriter<>();
batchItemWriter.setDataSource(dataSource);
batchItemWriter.setSql(""); // Query Goes here
return batchItemWriter;
}
@Bean
public ClassifierCompositeItemProcessor<Customer, Customer> classifierCustomerCompositeItemProcessor() throws Exception{
ClassifierCompositeItemProcessor<Customer, Customer> itemProcessor = new ClassifierCompositeItemProcessor<>();
itemProcessor.setClassifier(new CustomerClassifier(evenCustomerProcessor(), oddCustomerProcessor()));
return itemProcessor;
}
@Bean
public Step step1() throws Exception {
return stepBuilderFactory.get("step1")
.<Customer, Customer> chunk(10)
.reader(customerPagingItemReader())
.processor(classifierCustomerCompositeItemProcessor())
.writer(customerItemWriter())
.build();
}
@Bean
public Job job() throws Exception {
return jobBuilderFactory.get("job")
.start(step1())
.build();
}
}
You can remove the CustomerClassifier and define the composite item processor as follows:
@Bean
public ClassifierCompositeItemProcessor<Customer, Customer> classifierCustomerCompositeItemProcessor(
EvenCustomerProcessor evenCustomerProcessor,
OddCustomerProcessor oddCustomerProcessor
) {
ClassifierCompositeItemProcessor<Customer, Customer> itemProcessor = new ClassifierCompositeItemProcessor<>();
itemProcessor.setClassifier(new Classifier<Customer, ItemProcessor<?, ? extends Customer>>() {
@Override
public ItemProcessor<?, ? extends Customer> classify(Customer customer) {
return customer.getStatus().equals("I") ? evenCustomerProcessor : oddCustomerProcessor;
}
});
return itemProcessor;
}
Then update your step definition as follows:
@Bean
public Step step1() throws Exception {
return stepBuilderFactory.get("step1")
.<Customer, Customer> chunk(10)
.reader(customerPagingItemReader())
.processor(classifierCustomerCompositeItemProcessor(evenCustomerProcessor(), oddCustomerProcessor()))
.writer(customerItemWriter())
.build();
}
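The question also mentions writing insert and upsert records into the same table; the snippet above only addresses the processor side. As a rough, assumption-laden sketch (bean names hypothetical, SQL omitted), the writer side could use a ClassifierCompositeItemWriter in the same spirit, routing each item to an insert writer or an upsert writer:

@Bean
public ClassifierCompositeItemWriter<Customer> classifierCustomerCompositeItemWriter(
        JdbcBatchItemWriter<Customer> insertCustomerWriter,
        JdbcBatchItemWriter<Customer> upsertCustomerWriter) {
    ClassifierCompositeItemWriter<Customer> writer = new ClassifierCompositeItemWriter<>();
    // Route by status: "I" goes to the INSERT writer, everything else to the UPSERT writer
    writer.setClassifier(customer -> customer.getStatus().equals("I") ? insertCustomerWriter : upsertCustomerWriter);
    return writer;
}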

Testing Spring Boot Cache(Caffeine)

I have my cache config as below;
@Configuration
public class CacheConfiguration {
@Bean
public CacheManager cacheManager(Ticker ticker) {
CaffeineCache bookCache = buildCache("books", ticker, 30);
SimpleCacheManager cacheManager = new SimpleCacheManager();
cacheManager.setCaches(Collections.singletonList(bookCache));
return cacheManager;
}
private CaffeineCache buildCache(String name, Ticker ticker, int minutesToExpire) {
return new CaffeineCache(name, Caffeine.newBuilder()
.expireAfterWrite(minutesToExpire, TimeUnit.MINUTES)
.maximumSize(100)
.ticker(ticker)
.build());
}
@Bean
public Ticker ticker() {
return Ticker.systemTicker();
}
}
And the service I want to test:
@Service
public class TestServiceImpl implements TestService {
private final BookRepository bookRepository; // interface
@Autowired
public TestServiceImpl(final BookRepository bookRepository) {
this.bookRepository = bookRepository;
}
@Override
public Book getByIsbn(String isbn) {
return bookRepository.getByIsbn(isbn);
}
}
The required method in the repository is annotated with @Cacheable("books").
@Override
@Cacheable("books")
public Book getByIsbn(String isbn) {
LOGGER.info("Fetching Book...");
simulateSlowService(); // Wait for 5 secs
return new Book(isbn, "Some book");
}
I need to write a test showing the caching works, so I created another ticker bean in the test to override the one existing in CacheConfiguration. The code:
@RunWith(SpringRunner.class)
@SpringBootTest
public class TestServiceTests {
private static final String BOOK_ISBN = "isbn-8442";
@SpyBean
private BookRepository bookRepository;
@Autowired
private TestService testService;
@Configuration
@Import(SpringBootCacheApplication.class)
public static class TestConfiguration {
//testCompile('com.google.guava:guava-testlib:23.6-jre')
static FakeTicker fakeTicker = new FakeTicker();
@Bean
public Ticker ticker() {
return fakeTicker::read;
}
}
@Before
public void setUp() {
Book book = fakeBook();
doReturn(book)
.when(bookRepository)
.getByIsbn(BOOK_ISBN);
}
private Book fakeBook() {
return new Book(BOOK_ISBN, "Mock Book");
}
@Test
public void shouldUseCache() {
// Start At 0 Minutes
testService.getByIsbn(BOOK_ISBN);
verify(bookRepository, times(1)).getByIsbn(BOOK_ISBN);
// After 5 minutes from start, it should use cached object
TestConfiguration.fakeTicker.advance(5, TimeUnit.MINUTES);
testService.getByIsbn(BOOK_ISBN);
verify(bookRepository, times(1)).getByIsbn(BOOK_ISBN); // FAILS HERE
// After 35 Minutes from start, it should call the method again
TestConfiguration.fakeTicker.advance(30, TimeUnit.MINUTES);
testService.getByIsbn(BOOK_ISBN);
verify(bookRepository, times(2)).getByIsbn(BOOK_ISBN);
}
}
But it fails at the line marked with //FAILS HERE with message;
org.mockito.exceptions.verification.TooManyActualInvocations:
simpleBookRepository.getByIsbn("isbn-8442");
Wanted 1 time:
-> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
But was 2 times. Undesired invocation:
-> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Why does it fail? Shouldn't it use the cache? Or is my test wrong?
Any help or pointers are greatly appreciated! :)
verify(bookRepository, times(1)).getByIsbn(BOOK_ISBN); // FAILS HERE
Of course it fails here, because about 4 lines earlier you already called this method once. In this check you should put times(2), and in the next check the number of invocations should be times(3).
