The 2nd step cannot fetch data inserted by the 1st step - Spring

I configured 2 steps in my job.
The first step reads from a CSV file and writes it to the DB.
Below is the code of the writer:
public class TempTableWriter implements ItemWriter<Module> {

    @Autowired
    DataSource ds;

    @Autowired
    ModuleService moduleService;

    @Override
    public void write(List<? extends Module> items) throws Exception {
        moduleService.updateModuleList(items);
    }
}
@Service
@Transactional(propagation = Propagation.REQUIRED)
public class ModuleService {

    @Autowired
    ModuleDao moduleDao;

    @Transactional(isolation = Isolation.READ_UNCOMMITTED)
    public List<Module> getModuleList() {
        return moduleDao.getModuleList();
    }

    public void updateModuleList(List<? extends Module> items) {
        items.forEach(item -> {
            moduleDao.updateModuleList(p1, p2, p3, p4);
        });
    }
}
@Repository
public class ModuleDao {

    @Autowired
    @Qualifier("moduleMapper")
    private RowMapper moduleMapper;

    @Autowired
    private JdbcTemplate jdbctemplate;

    public List<Module> getModuleList() {
        return jdbctemplate.query("select * from [schema].[t1] ORDER BY p1",
                moduleMapper);
    }

    public void updateModuleList(String requestId, String fileName,
            String orcStatus, String context) {
        jdbctemplate.update("insert into [schema].[t1] values(?,?,?,?)",
                requestId, fileName, orcStatus, context);
    }
}
Then, in the 2nd step, when I try to fetch the data stored by step 1, I always get null; yet after the job completes I can see the rows really are stored in the DB.
Similarly, I tried using a JdbcBatchItemWriter to save the records, but I still cannot fetch them from the next step.
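For reference, the step-2 reader is not shown in the question; presumably it delegates to the READ_UNCOMMITTED service method, along the lines of this sketch (an assumption, not code from the question):

public class TempTableReader implements ItemReader<Module> {

    @Autowired
    ModuleService moduleService;

    private Iterator<Module> modules; // buffered once, then drained item by item

    @Override
    public Module read() throws Exception {
        if (modules == null) {
            // comes back empty while the job is running, per the symptom described above
            modules = moduleService.getModuleList().iterator();
        }
        return modules.hasNext() ? modules.next() : null;
    }
}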


Pointcut is not triggered when AspectJExpressionPointcutAdvisor is created programmatically

I am creating AspectJExpressionPointcutAdvisor beans based on the number of pointcut expressions present in the application properties file. The objects are created without error, but the pointcuts are never triggered.
Note: the beans need to be created dynamically, based on the number of pointcut expressions in the properties file (which varies).
Application properties file
pointcut.expression.projectUpdate[0]= execution(* com.abc.app.service.impl.TestServiceImpl.updateProjectDetails(..))
pointcut.expression.projectUpdate[1]= execution(* com.abc.app.service.impl.TestServiceImpl.cancelProject(..))
pointcut.expression.projectUpdate[2]= execution(* com.abc.app.service.impl.TestCSATRatingServiceImpl.saveRatingDetails(..))
TestConfig.class
@Configuration
public class TestConfig implements BeanFactoryAware {

    @Autowired
    private PointcutExprProperties pcExprProp;

    @Autowired(required = false)
    private ProjectUpdateAspect projectUpdateAdvice;

    private BeanFactory beanFactory;

    @Override
    public void setBeanFactory(BeanFactory beanFactory) {
        this.beanFactory = beanFactory;
    }

    @PostConstruct
    public void configure() {
        ConfigurableBeanFactory configurableBeanFactory = (ConfigurableBeanFactory) beanFactory;
        int i = 1;
        for (String pointCut : pcExprProp.getProjectUpdate()) {
            AspectJExpressionPointcutAdvisor projectUpdateAdvisor = new AspectJExpressionPointcutAdvisor();
            projectUpdateAdvisor.setExpression(pointCut);
            projectUpdateAdvisor.setAdvice(projectUpdateAdvice);
            configurableBeanFactory.registerSingleton("beanName_" + i, projectUpdateAdvisor);
            i++;
        }
    }
}
ProjectUpdateAspect.class
@Component
@Aspect
public class ProjectUpdateAspect implements AfterReturningAdvice {

    private static final Logger log = LoggerFactory.getLogger(ProjectUpdateAspect.class);

    @Override
    public void afterReturning(Object returnValue, Method method, Object[] args, Object target) throws Throwable {
        try {
            // some thing
        } catch (Exception exception) {
            log.error("Error while processing ProjectUpdateAspect", exception);
        }
    }
}
PointcutExprProperties
@Configuration
@ConfigurationProperties(prefix = "pointcut.expression")
@Validated
public class PointcutExprProperties {

    @NotNull
    private List<String> projectCreate;

    @NotNull
    private List<String> projectUpdate;

    public List<String> getProjectCreate() {
        return projectCreate;
    }

    public void setProjectCreate(List<String> projectCreate) {
        this.projectCreate = projectCreate;
    }

    public List<String> getProjectUpdate() {
        return projectUpdate;
    }

    public void setProjectUpdate(List<String> projectUpdate) {
        this.projectUpdate = projectUpdate;
    }
}
Please suggest how to get rid of this issue.
I suggest you do it like this:
1) Do not define your "aspect" as @Component @Aspect; instead, make it implement MethodInterceptor.
2) Create an AspectJExpressionPointcut with the value from your properties file.
3) Register a DefaultPointcutAdvisor (configured with your pointcut and interceptor) as a bean.
See also my answer here (update 3) and my GitHub sample repository, which I just updated for you in order to include reading the pointcut from application.properties.
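A minimal sketch of that approach, assuming the interceptor below replaces ProjectUpdateAspect (the class and bean names are illustrative):

// Plain AOP Alliance interceptor instead of an @Aspect component
public class ProjectUpdateInterceptor implements MethodInterceptor {

    @Override
    public Object invoke(MethodInvocation invocation) throws Throwable {
        Object returnValue = invocation.proceed();
        // "after returning" logic goes here
        return returnValue;
    }
}

// In the configure() loop, pair an AspectJExpressionPointcut with the
// interceptor via a DefaultPointcutAdvisor and register it as a singleton
@PostConstruct
public void configure() {
    ConfigurableBeanFactory cbf = (ConfigurableBeanFactory) beanFactory;
    int i = 1;
    for (String expression : pcExprProp.getProjectUpdate()) {
        AspectJExpressionPointcut pointcut = new AspectJExpressionPointcut();
        pointcut.setExpression(expression);
        cbf.registerSingleton("projectUpdateAdvisor_" + i++,
                new DefaultPointcutAdvisor(pointcut, new ProjectUpdateInterceptor()));
    }
}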

HazelcastRepository - how to save a new entity (with the id from sequence) and put it to the map

I would like to save a new entity using HazelcastRepository.
When the id is null, the KeyValueTemplate uses SecureRandom and generates an id like -123123123123123123.
I don't want to save an id like that; instead I would like to get it from a sequence in the DB and put it into the map.
I have found 2 solutions:
1) In AdminService, get the next value from the sequence in the database and set it.
2) Create an atomic counter id on the Hazelcast server and initialize it with the current value from the sequence. In AdminService, get the counter, increment the value, and set the id.
But they are not very pretty.
Do you have any other ideas?
The code:
@Configuration
@EnableHazelcastRepositories(basePackages = "com.test")
public class HazelcastConfig {

    @Bean
    public HazelcastInstance hazelcastInstance(ClientConfig clientConfig) {
        return HazelcastClient.newHazelcastClient(clientConfig);
    }

    @Bean
    @Qualifier("client")
    public ClientConfig clientConfig() {
        ClientConfig clientConfig = new ClientConfig();
        clientConfig.setClassLoader(HazelcastConfig.class.getClassLoader());
        ClientNetworkConfig networkConfig = clientConfig.getNetworkConfig();
        networkConfig.addAddress("127.0.0.1:5701");
        networkConfig.setConnectionAttemptLimit(20);
        return clientConfig;
    }

    @Bean
    public KeyValueTemplate keyValueTemplate(ClientConfig clientConfig) {
        return new KeyValueTemplate(new HazelcastKeyValueAdapter(hazelcastInstance(clientConfig)));
    }
}
@Service
@RequiredArgsConstructor
public class AdminService {

    private final UserRepository userRepository;
    ...

    @Transactional
    public User addOrUpdateUser(UserUpdateDto dto) {
        validate(dto);
        User user = dto.getId() != null ? userService.getUser(dto.getId()) : new User();
        mapUser(user, dto);
        return userRepository.save(user);
    }
    ...
}

@Repository
public interface UserRepository extends HazelcastRepository<User, Long> {
}
@KeySpace("users")
@Entity
@Table(name = "users")
@Data
@AllArgsConstructor
@NoArgsConstructor
public class User extends DateAudit implements Serializable {

    @javax.persistence.Id
    @org.springframework.data.annotation.Id
    // @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "user_generator")
    // @SequenceGenerator(name = "user_generator", sequenceName = "user_seq")
    private Long id;
    ...
}
Hazelcast server:
@Component
@Slf4j
public class UserLoader implements ApplicationContextAware, MapStore<Long, User> {

    private static UserJpaRepository userJpaRepository;

    @Override
    public User load(Long key) {
        log.info("load({})", key);
        return userJpaRepository.findById(key).orElse(null);
    }

    @Override
    public Map<Long, User> loadAll(Collection<Long> keys) {
        Map<Long, User> result = new HashMap<>();
        for (Long key : keys) {
            User user = this.load(key);
            if (user != null) {
                result.put(key, user);
            }
        }
        return result;
    }

    @Override
    public Iterable<Long> loadAllKeys() {
        return userJpaRepository.findAllId();
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        userJpaRepository = applicationContext.getBean(UserJpaRepository.class);
    }

    @Override
    public void store(Long aLong, User user) {
        userJpaRepository.save(user);
    }

    @Override
    public void storeAll(Map<Long, User> map) {
        for (Map.Entry<Long, User> mapEntry : map.entrySet()) {
            store(mapEntry.getKey(), mapEntry.getValue());
        }
    }

    @Override
    public void delete(Long aLong) {
        userJpaRepository.deleteById(aLong);
    }

    @Override
    public void deleteAll(Collection<Long> collection) {
        collection.forEach(this::delete);
    }
}

public interface UserJpaRepository extends CrudRepository<User, Long> {

    @Query("SELECT u.id FROM User u")
    Iterable<Long> findAllId();
}
I think there is no better way than what you described.
I'd go with the second solution, because then you're at least coupled only to the Hazelcast server.
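A minimal sketch of that second solution, assuming the Hazelcast 3.x API (hazelcastInstance.getAtomicLong; in 4.x+ the counter lives in the CP subsystem) and an illustrative counter name that has been seeded once from the DB sequence:

@Transactional
public User addOrUpdateUser(UserUpdateDto dto) {
    validate(dto);
    User user = dto.getId() != null ? userService.getUser(dto.getId()) : new User();
    mapUser(user, dto);
    if (user.getId() == null) {
        // "user_id_seq" is a hypothetical counter name, initialized from the DB sequence
        IAtomicLong idCounter = hazelcastInstance.getAtomicLong("user_id_seq");
        user.setId(idCounter.incrementAndGet());
    }
    return userRepository.save(user);
}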

MongoTemplate null pointer exception in class

I have been looking at other answers, but none of them seem to work for me. I have a Spring Boot application where I am using Mongo and Kafka. In the main class, where my run() method is, I am able to @Autowire the MongoTemplate and it works; but in another class I did the same and I get a NullPointerException on the mongoTemplate.
Here are both classes:
Working
@SpringBootApplication
public class ProducerConsumerApplication implements CommandLineRunner {

    public static Logger logger = LoggerFactory.getLogger(ProducerConsumerApplication.class);

    public static void main(String[] args) {
        SpringApplication.run(ProducerConsumerApplication.class, args).close();
    }

    @Autowired
    private Sender sender;

    @Autowired
    MongoTemplate mongoTemplate;

    @Override
    public void run(String... strings) throws Exception {
        Message msg = new Message();
        msg.setCurrentNode("my_node");
        msg.setStartTime(System.currentTimeMillis());
        String json = "{ \"color\" : \"Orange\", \"type\" : \"BMW\" }";
        ObjectMapper objectMapper = new ObjectMapper();
        msg.setTest(objectMapper.readValue(json, new TypeReference<Map<String, Object>>() {}));
        sender.send(msg);
        mongoTemplate.createCollection("test123");
        mongoTemplate.dropCollection("test123");
    }
}
Not working
@Component
public class ParentNode extends Node {

    @Autowired
    public MongoTemplate mongoTemplate;

    public void execute(Message message) {
        try {
            // GET WORKFLOWS COLLECTION
            MongoCollection<Document> collection = mongoTemplate.getCollection("workflows");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Thank you for the help. It is much appreciated.
Can you try to inject the dependency with a setter or constructor?
Method 1:
@Component
public class ParentNode extends Node {

    private final MongoTemplate mongoTemplate;

    @Autowired
    public ParentNode(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public void execute(Message message) {
        try {
            // GET WORKFLOWS COLLECTION
            MongoCollection<Document> collection = mongoTemplate.getCollection("workflows");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Method 2:
@Component
public class ParentNode extends Node {

    private static MongoTemplate mongoTemplate;

    @Autowired
    public void setMongoTemplate(MongoTemplate mongoTemplate) {
        ParentNode.mongoTemplate = mongoTemplate;
    }

    public void execute(Message message) {
        try {
            // GET WORKFLOWS COLLECTION
            MongoCollection<Document> collection = mongoTemplate.getCollection("workflows");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
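A further note, not from the original answer: field, setter, and constructor injection all happen only on instances that Spring itself creates. If ParentNode is instantiated manually somewhere (new ParentNode()), mongoTemplate stays null regardless of the injection style, so the instance has to come from the context. A hypothetical caller:

// Obtain the bean from the context instead of constructing it with `new`
@Autowired
private ParentNode parentNode;

@Override
public void run(String... strings) throws Exception {
    parentNode.execute(new Message()); // mongoTemplate inside ParentNode is now injected
}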

Testing Spring Boot Cache (Caffeine)

I have my cache config as below:
@Configuration
public class CacheConfiguration {

    @Bean
    public CacheManager cacheManager(Ticker ticker) {
        CaffeineCache bookCache = buildCache("books", ticker, 30);
        SimpleCacheManager cacheManager = new SimpleCacheManager();
        cacheManager.setCaches(Collections.singletonList(bookCache));
        return cacheManager;
    }

    private CaffeineCache buildCache(String name, Ticker ticker, int minutesToExpire) {
        return new CaffeineCache(name, Caffeine.newBuilder()
                .expireAfterWrite(minutesToExpire, TimeUnit.MINUTES)
                .maximumSize(100)
                .ticker(ticker)
                .build());
    }

    @Bean
    public Ticker ticker() {
        return Ticker.systemTicker();
    }
}
And the service I want to test:
@Service
public class TestServiceImpl implements TestService {

    private final BookRepository bookRepository; // interface

    @Autowired
    public TestServiceImpl(final BookRepository bookRepository) {
        this.bookRepository = bookRepository;
    }

    @Override
    public Book getByIsbn(String isbn) {
        return bookRepository.getByIsbn(isbn);
    }
}
The required method in the repository is annotated with @Cacheable("books"):

@Override
@Cacheable("books")
public Book getByIsbn(String isbn) {
    LOGGER.info("Fetching Book...");
    simulateSlowService(); // Wait for 5 secs
    return new Book(isbn, "Some book");
}
I need to write a test showing that the caching works, so I created another Ticker bean in the test to override the one in CacheConfiguration. The code:
@RunWith(SpringRunner.class)
@SpringBootTest
public class TestServiceTests {

    private static final String BOOK_ISBN = "isbn-8442";

    @SpyBean
    private BookRepository bookRepository;

    @Autowired
    private TestService testService;

    @Configuration
    @Import(SpringBootCacheApplication.class)
    public static class TestConfiguration {
        // testCompile('com.google.guava:guava-testlib:23.6-jre')
        static FakeTicker fakeTicker = new FakeTicker();

        @Bean
        public Ticker ticker() {
            return fakeTicker::read;
        }
    }

    @Before
    public void setUp() {
        Book book = fakeBook();
        doReturn(book)
                .when(bookRepository)
                .getByIsbn(BOOK_ISBN);
    }

    private Book fakeBook() {
        return new Book(BOOK_ISBN, "Mock Book");
    }

    @Test
    public void shouldUseCache() {
        // Start at 0 minutes
        testService.getByIsbn(BOOK_ISBN);
        verify(bookRepository, times(1)).getByIsbn(BOOK_ISBN);

        // After 5 minutes from start, it should use the cached object
        TestConfiguration.fakeTicker.advance(5, TimeUnit.MINUTES);
        testService.getByIsbn(BOOK_ISBN);
        verify(bookRepository, times(1)).getByIsbn(BOOK_ISBN); // FAILS HERE

        // After 35 minutes from start, it should call the method again
        TestConfiguration.fakeTicker.advance(30, TimeUnit.MINUTES);
        testService.getByIsbn(BOOK_ISBN);
        verify(bookRepository, times(2)).getByIsbn(BOOK_ISBN);
    }
}
But it fails at the line marked with // FAILS HERE with the message:

org.mockito.exceptions.verification.TooManyActualInvocations:
simpleBookRepository.getByIsbn("isbn-8442");
Wanted 1 time:
-> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
But was 2 times. Undesired invocation:
-> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Why does it fail? Shouldn't it use the cache? Or is my test wrong?
Any help or pointers are greatly appreciated! :)

verify(bookRepository, times(1)).getByIsbn(BOOK_ISBN); // FAILS HERE

Of course it fails there: ~4 lines earlier you had already called this method once. In this check you should put times(2), and in the next check the number of invocations should be times(3).
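Under that reading, the expectations would change like this (a sketch of the suggestion above, not a verified fix for the caching itself):

testService.getByIsbn(BOOK_ISBN);
verify(bookRepository, times(1)).getByIsbn(BOOK_ISBN);

TestConfiguration.fakeTicker.advance(5, TimeUnit.MINUTES);
testService.getByIsbn(BOOK_ISBN);
verify(bookRepository, times(2)).getByIsbn(BOOK_ISBN); // times(n) counts all invocations since the mock was created

TestConfiguration.fakeTicker.advance(30, TimeUnit.MINUTES);
testService.getByIsbn(BOOK_ISBN);
verify(bookRepository, times(3)).getByIsbn(BOOK_ISBN);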

Why is the itemReader always sending the exact same value to CustomItemProcessor?

Why does the itemReader method always send the exact same file name to be processed by CustomItemProcessor?
As far as I understand, since I set up the reader as @Scope and set more than 1 in the chunk, I was expecting "return s" to move forward to the next value of the String array.
Let me clarify my question with a debug walkthrough of the reader method:
1 - the variable stringArray is filled with 3 file names (f1.txt, f2.txt and f3.txt)
2 - "return s" is evoked with s = f1.txt
3 - "return s" is evoked again before the customItemProcessor method is evoked (perfect until here, since chunk = 2)
4 - looking at s, it contains f1.txt again (different from what I expected; I expected f2.txt)
5 and 6 - the processor runs twice with the same name f1.txt (it would work correctly if the second turn of "return s" contained f2.txt)
7 - the writer method works as expected (processedFiles contains the two names processed in customItemProcessor: f1.txt and f1.txt again, since the same name was processed twice)
CustomItemReader
public class CustomItemReader implements ItemReader<String> {

    @Override
    public String read() throws Exception, UnexpectedInputException,
            ParseException, NonTransientResourceException {
        String[] stringArray;
        try (Stream<Path> stream = Files.list(Paths.get(env
                .getProperty("my.path")))) {
            stringArray = stream.map(String::valueOf)
                    .filter(path -> path.endsWith("out"))
                    .toArray(size -> new String[size]);
        }
        // *** the problem is here:
        // every turn, the s variable receives the first file name from stringArray
        if (stringArray.length > 0) {
            for (String s : stringArray) {
                return s;
            }
        } else {
            log.info("read method - no file found");
            return null;
        }
        return null;
    }
}
CustomItemProcessor
public class CustomItemProcessor implements ItemProcessor<String, String> {

    @Override
    public String process(String singleFileToProcess) throws Exception {
        log.info("process method: " + singleFileToProcess);
        return singleFileToProcess;
    }
}
CustomItemWriter
public class CustomItemWriter implements ItemWriter<String> {

    private static final Logger log = LoggerFactory
            .getLogger(CustomItemWriter.class);

    @Override
    public void write(List<? extends String> processedFiles) throws Exception {
        processedFiles.stream().forEach(
                processedFile -> log.info("**** write method "
                        + processedFile.toString()));
        FileSystem fs = FileSystems.getDefault();
        for (String s : processedFiles) {
            Files.deleteIfExists(fs.getPath(s));
        }
    }
}
Configuration
@Configuration
@ComponentScan(...)
@EnableBatchProcessing
@EnableScheduling
@PropertySource(...)
public class BatchConfig {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JobRepository jobRepository;

    @Bean
    public TaskExecutor getTaskExecutor() {
        return new TaskExecutor() {
            @Override
            public void execute(Runnable task) {
            }
        };
    }

    // I can see the number in chunk reflects how many times customReader is
    // triggered before customProcessor is triggered
    @Bean
    public Step step1(ItemReader<String> reader,
            ItemProcessor<String, String> processor, ItemWriter<String> writer) {
        return stepBuilderFactory.get("step1").<String, String>chunk(2)
                .reader(reader).processor(processor).writer(writer)
                .allowStartIfComplete(true).build();
    }

    @Bean
    @Scope
    public ItemReader<String> reader() {
        return new CustomItemReader();
    }

    @Bean
    public ItemProcessor<String, String> processor() {
        return new CustomItemProcessor();
    }

    @Bean
    public ItemWriter<String> writer() {
        return new CustomItemWriter();
    }

    @Bean
    public Job job(Step step1) throws Exception {
        return jobBuilderFactory.get("job1").incrementer(new RunIdIncrementer()).start(step1).build();
    }
}
Scheduler
@Component
public class QueueScheduler {

    private static final Logger log = LoggerFactory
            .getLogger(QueueScheduler.class);

    private Job job;
    private JobLauncher jobLauncher;

    @Autowired
    public QueueScheduler(JobLauncher jobLauncher, @Qualifier("job") Job job) {
        this.job = job;
        this.jobLauncher = jobLauncher;
    }

    @Scheduled(fixedRate = 60000)
    public void runJob() {
        try {
            jobLauncher.run(job, new JobParameters());
        } catch (Exception ex) {
            log.info(ex.getMessage());
        }
    }
}
Your issue is that you are relying on an internal loop to iterate over the items instead of letting Spring Batch do it for you by calling ItemReader#read multiple times.
What I'd recommend is changing your reader to something like the following:
public class JimsItemReader implements ItemStreamReader<String> {

    private String[] items;
    private int curIndex = -1;

    @Override
    public void open(ExecutionContext ec) {
        curIndex = ec.getInt("curIndex", -1);
        try (Stream<Path> stream = Files.list(Paths.get(env.getProperty("my.path")))) {
            items = stream.map(String::valueOf)
                    .filter(path -> path.endsWith("out"))
                    .toArray(size -> new String[size]);
        } catch (IOException e) {
            throw new ItemStreamException("could not list input files", e);
        }
    }

    @Override
    public void update(ExecutionContext ec) {
        ec.putInt("curIndex", curIndex);
    }

    @Override
    public String read() {
        curIndex++;
        return curIndex < items.length ? items[curIndex] : null;
    }
}
The above example should loop through the items of your array as they are read. It should also be restartable: we store the index in the ExecutionContext, so if the job is restarted after a failure, you'll resume where you left off.
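One wiring note (an assumption, since the step configuration above exposes the reader only behind the ItemReader interface): if the reader bean ends up proxied so that the step cannot see its ItemStream methods, open()/update() are never called. Registering the reader as a stream explicitly is the safe option:

@Bean
public Step step1(ItemStreamReader<String> reader,
        ItemProcessor<String, String> processor, ItemWriter<String> writer) {
    return stepBuilderFactory.get("step1").<String, String>chunk(2)
            .reader(reader)
            .stream(reader) // ensures open()/update()/close() are invoked
            .processor(processor).writer(writer)
            .allowStartIfComplete(true).build();
}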
