Order of processing REST API calls - spring

I have a strange (for me) question to ask. I have created a synchronized service which is called by a controller:
@Controller
public class WebAppApiController {

    private final WebAppService webAppService;

    @Autowired
    WebAppApiController(WebAppService webAppService) {
        this.webAppService = webAppService;
    }

    @Transactional
    @PreAuthorize("hasAuthority('ROLE_API')")
    @PostMapping(value = "/api/webapp/{projectId}")
    public ResponseEntity<Status> getWebApp(@PathVariable(value = "projectId") Long id,
                                            @RequestBody WebAppScanModel req) {
        return webAppService.processScanWebAppRequest(id, req);
    }
}
The service layer just checks that the request is not a duplicate and stores it in the database. Because the client using this endpoint makes MANY requests continuously, it happened that before one request was validated against duplicates, an identical one was put in the database; that is why I am trying to use a synchronized block.
@Service
public class WebAppService {

    private final static String UUID_PATTERN_TO = "[a-zA-Z0-9]{8}-[a-zA-Z0-9]{4}-[a-zA-Z0-9]{4}-[a-zA-Z0-9]{4}-[a-zA-Z0-9]{12}";

    private final WebAppRepository waRepository;

    @Autowired
    public WebAppService(WebAppRepository waRepository) {
        this.waRepository = waRepository;
    }

    @Transactional(rollbackOn = Exception.class)
    public ResponseEntity<Status> processScanWebAppRequest(Long id, WebAppScanModel webAppScanModel) {
        try {
            synchronized (this) {
                Optional<WebApp> webApp = verifyForDuplicates(webAppScanModel);
                if (!webApp.isPresent()) {
                    WebApp newWebApp = new WebApp(webAppScanModel.getUrl());
                    newWebApp = waRepository.save(newWebApp);
                    processPropertiesOfWebApp(newWebApp);
                    return new ResponseEntity<>(HttpStatus.CREATED);
                }
                return new ResponseEntity<>(HttpStatus.CONFLICT);
            }
        } catch (NonUniqueResultException ex) {
            return new ResponseEntity<>(HttpStatus.PRECONDITION_FAILED);
        } catch (IncorrectResultSizeDataAccessException ex) {
            return new ResponseEntity<>(HttpStatus.PRECONDITION_FAILED);
        }
    }

    Optional<WebApp> verifyForDuplicates(WebAppScanModel webAppScanModel) {
        return waRepository.getWebAppByRegex(webAppScanModel.getUrl().replaceAll(UUID_PATTERN_TO, UUID_PATTERN_TO) + "$");
    }
}
And the JPA method:
@Query(value = "select * from webapp wa where wa.url ~ :url", nativeQuery = true)
Optional<WebApp> getWebAppByRegex(@Param("url") String url);
The processPropertiesOfWebApp method does further processing for the given web app, which at this point should be unique.
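To make the duplicate check concrete: replaceAll substitutes the UUID pattern for every literal UUID in the URL, so the result is a regex matching any URL that differs only in the UUID part. A small illustration (my example, not from the original post):

String url = "https://testdomain.com/user/7e1c44e4-821b-4d05-bdc3-ebd43dfeae5f";
// Replace the concrete UUID with the UUID regex itself, then anchor the end.
String regex = url.replaceAll(UUID_PATTERN_TO, UUID_PATTERN_TO) + "$";
// regex now matches "https://testdomain.com/user/<any UUID>" and is passed
// to the native query's "wa.url ~ :url" regex comparison.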
The intended behaviour is: when the client's POST requests contain multiple URLs such as
https://testdomain.com/user/7e1c44e4-821b-4d05-bdc3-ebd43dfeae5f
https://testdomain.com/user/d398316e-fd60-45a3-b036-6d55049b44d8
https://testdomain.com/user/c604b551-101f-44c4-9eeb-d9adca2b2fe9
only the first one should be stored in the database, but at the moment that is not what is happening. A select from my database shows:
select inserted,url from webapp where url ~ 'https://testdomain.com/users/[a-zA-Z0-9]{8}-[a-zA-Z0-9]{4}-[a-zA-Z0-9]{4}-[a-zA-Z0-9]{4}-[a-zA-Z0-9]{12}$';
2019-11-07 08:53:05 | https://testdomain.com/users/d398316e-fd60-45a3-b036-6d55049b44d8
2019-11-07 08:53:05 | https://testdomain.com/users/d398316e-fd60-45a3-b036-6d55049b44d8
2019-11-07 08:53:05 | https://testdomain.com/users/d398316e-fd60-45a3-b036-6d55049b44d8
(3 rows)
I will try to add a unique constraint on the url column, but I can't see how this would solve the problem: since the UUID changes, every new URL will be unique.
Could anyone give me a hint as to what I am doing wrong?
This question is related to one I asked before, for which I found no proper solution, so I simplified my method, but still with no success.
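One workable direction (a sketch, not from the original question) is to let the database serialize the writes: persist a normalized form of the URL, with the UUID replaced by a fixed placeholder, in a column carrying a unique constraint. The constraint then fires even though each raw URL is distinct, and it holds across threads and across application instances, which synchronized(this) cannot guarantee. The normalizedUrl field, its column, and saveAndFlush (from JpaRepository) are assumptions for illustration:

String normalized = webAppScanModel.getUrl().replaceAll(UUID_PATTERN_TO, "{uuid}");
WebApp newWebApp = new WebApp(webAppScanModel.getUrl());
newWebApp.setNormalizedUrl(normalized); // hypothetical column with a UNIQUE constraint
try {
    waRepository.saveAndFlush(newWebApp); // flush so a violation surfaces inside the try
    return new ResponseEntity<>(HttpStatus.CREATED);
} catch (DataIntegrityViolationException e) {
    // A concurrent request already inserted the same normalized URL.
    return new ResponseEntity<>(HttpStatus.CONFLICT);
}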

Related

Webflux Controller 'return Object instead of Mono'

Hello, I am new to WebFlux. I am following a tutorial for building reactive microservices, and in my project I faced the following problem.
I want to create a CRUD API for the product service; the following is the create method:
@Override
public Product createProduct(Product product) {
    Optional<ProductEntity> productEntity = Optional.ofNullable(repository.findByProductId(product.getProductId()).block());
    productEntity.ifPresent(prod -> {
        throw new InvalidInputException("Duplicate key, Product Id: " + product.getProductId());
    });
    ProductEntity entity = mapper.apiToEntity(product);
    Mono<Product> newProduct = repository.save(entity)
            .log()
            .map(mapper::entityToApi);
    return newProduct.block();
}
The problem is that when I call this method from Postman I get the error
"block()/blockFirst()/blockLast() are blocking, which is not supported in thread reactor-http-nio-3", but when I use a StreamListener the call works fine. The StreamListener gets events from a RabbitMQ channel.
StreamListener
@EnableBinding(Sink.class)
public class MessageProcessor {

    private final ProductService productService;

    public MessageProcessor(ProductService productService) {
        this.productService = productService;
    }

    @StreamListener(target = Sink.INPUT)
    public void process(Event<Integer, Product> event) {
        switch (event.getEventType()) {
        case CREATE:
            Product product = event.getData();
            LOG.info("Create product with ID: {}", product.getProductId());
            productService.createProduct(product);
            break;
        default:
            String errorMessage = "Incorrect event type: " + event.getEventType() + ", expected a CREATE or DELETE event";
            LOG.warn(errorMessage);
            throw new EventProcessingException(errorMessage);
        }
    }
}
I have two questions:
Why does this work with the StreamListener but not with a plain HTTP request?
Is there a proper way in WebFlux to return the object inside the Mono, or do we always have to return the Mono itself?
Your create method should look more like the following, and your controller should return a Mono<Product> rather than the object alone:
public Mono<Product> createProduct(Product product) {
    return repository.findByProductId(product.getProductId())
            .switchIfEmpty(Mono.just(mapper.apiToEntity(product)))
            .flatMap(repository::save)
            .map(mapper::entityToApi);
}
As @Thomas commented, you are breaking some of the fundamentals of reactive coding, and losing its benefits, by calling block(); you should read up on it more. For example, the reactive Mongo repository you are using returns a Mono, which has its own methods for handling the empty case without resorting to an Optional, as shown above.
EDIT: to map to an error if the entity already exists, and otherwise save:
public Mono<Product> createProduct(Product product) {
    return repository.findByProductId(product.getProductId())
            .hasElement()
            .filter(exists -> exists)
            .flatMap(exists -> Mono.error(new Exception("my exception")))
            .then(Mono.just(mapper.apiToEntity(product)))
            .flatMap(repository::save)
            .map(mapper::entityToApi);
}

Cache Kafka Records using Caffeine Cache Springboot

I am trying to cache Kafka records for an interval of 3 minutes, after which they expire and are removed from the cache.
Each incoming record, fetched by a Kafka consumer written in Spring Boot, must be put into the cache first; then, if the next record matches one already in the cache, that duplicate must be discarded.
I have tried using Caffeine cache as below,
@Configuration
@EnableCaching
public class AppCacheManagerConfig {

    @Bean
    public CacheManager cacheManager(Ticker ticker) {
        CaffeineCache bookCache = buildCache("declineRecords", ticker, 3);
        SimpleCacheManager cacheManager = new SimpleCacheManager();
        cacheManager.setCaches(Collections.singletonList(bookCache));
        return cacheManager;
    }

    private CaffeineCache buildCache(String name, Ticker ticker, int minutesToExpire) {
        return new CaffeineCache(name, Caffeine.newBuilder()
                .expireAfterWrite(minutesToExpire, TimeUnit.MINUTES)
                .maximumSize(100)
                .ticker(ticker)
                .build());
    }

    @Bean
    public Ticker ticker() {
        return Ticker.systemTicker();
    }
}
and my Kafka Consumer is as below,
@Autowired
CachingServiceImpl cachingService;

@KafkaListener(topics = "#{'${spring.kafka.consumer.topic}'}", concurrency = "#{'${spring.kafka.consumer.concurrentConsumers}'}", errorHandler = "#{'${spring.kafka.consumer.errorHandler}'}")
public void consume(Message<?> message, Acknowledgment acknowledgment,
        @Header(KafkaHeaders.RECEIVED_TIMESTAMP) long createTime) {
    logger.info("Received Message: " + message.getPayload());
    try {
        boolean approveTopic = false;
        boolean duplicateRecord = false;
        if (cachingService.isDuplicateCheck(declineRecord)) {
            // do something with the record
        } else {
            // do something with the record
        }
        cachingService.putInCache(xmlJSONObj, declineRecord, time);
and my caching service is as below,
@Component
public class CachingServiceImpl {

    private static final Logger logger = LoggerFactory.getLogger(CachingServiceImpl.class);

    @Autowired
    CacheManager cacheManager;

    @Cacheable(value = "declineRecords", key = "#declineRecord", sync = true)
    public String putInCache(JSONObject xmlJSONObj, String declineRecord, String time) {
        logger.info("Record is cached for the 3 minute interval check: {}", declineRecord);
        cacheManager.getCache("declineRecords").put(declineRecord, time);
        return declineRecord;
    }

    public boolean isDuplicateCheck(String declineRecord) {
        if (null != cacheManager.getCache("declineRecords").get(declineRecord)) {
            return true;
        }
        return false;
    }
}
But each time a record arrives at the consumer, my cache is empty; it is not holding the records.
Modifications done:
After going through the suggestions and some further R&D, I added the configuration below and removed some of the earlier logic. The caching now works as expected, but the duplicate check fails when all three consumers receive the same record.
@Configuration
public class AppCacheManagerConfig {

    public static Cache<String, Object> jsonCache =
            Caffeine.newBuilder().expireAfterWrite(3, TimeUnit.MINUTES)
                    .maximumSize(10000).recordStats().build();

    @Bean
    public CacheLoader<Object, Object> cacheLoader() {
        CacheLoader<Object, Object> cacheLoader = new CacheLoader<Object, Object>() {
            @Override
            public Object load(Object key) throws Exception {
                return null;
            }

            @Override
            public Object reload(Object key, Object oldValue) throws Exception {
                return oldValue;
            }
        };
        return cacheLoader;
    }
}
Now I am using the above cache with manual put and get.
I guess you're trying to implement record deduplication for Kafka.
Here is a similar discussion:
https://github.com/spring-projects/spring-kafka/issues/80
Here is the current abstract class which you may extend to achieve the necessary result:
https://github.com/spring-projects/spring-kafka/blob/master/spring-kafka/src/main/java/org/springframework/kafka/listener/adapter/AbstractFilteringMessageListener.java
Your caching service is definitely incorrect: the @Cacheable annotation marks data getters and setters so that caching is added through AOP, whereas in your code you clearly implement low-level cache-updating logic of your own.
At least the following changes may help you:
Remove @Cacheable. You don't need it, because you work with the cache manually, so it may be a source of conflicts (especially since you use sync = true). If it helps, remove @EnableCaching as well: it enables support for cache-related Spring annotations, which you don't need here. A sketch of the manual approach follows this list.
Try removing the Ticker bean (and the corresponding parameters on the other beans). It should not be harmful as per your configuration, but it is usually only helpful in tests; there is no need to define it otherwise.
Double-check what declineRecord is. If it's a serialized object, ensure that serialization works properly.
Add recordStats() to the cache and write stats() to the log for further analysis.
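As an illustration of the manual approach (a sketch; names and the key type are assumptions, not from the original code): a separate isDuplicateCheck followed by putInCache leaves a window in which several consumers all see the key as absent, so making the check and the insert one atomic operation closes that window:

// Shared, manually managed Caffeine cache; entries expire 3 minutes after write.
private static final Cache<String, Long> seenRecords = Caffeine.newBuilder()
        .expireAfterWrite(3, TimeUnit.MINUTES)
        .maximumSize(10_000)
        .recordStats()
        .build();

public boolean isDuplicate(String declineRecord) {
    // putIfAbsent on the asMap() view is atomic, so when three consumers
    // race on the same key, exactly one of them sees "not a duplicate".
    return seenRecords.asMap().putIfAbsent(declineRecord, System.currentTimeMillis()) != null;
}

Note that this only deduplicates within a single JVM; if the consumers run in separate processes, a shared store would be needed instead.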

Getting a fixed number of records using Spring Data JPA

I am trying to create a Spring Boot application where I need to fetch records from the database and make a call to a REST API for each record fetched. But instead of fetching all the records at once, I want to retrieve them in batches of, say, 10, make the REST calls for those, then fetch another 10, and so on until the last record. I am using spring-data-jpa. How can I achieve that?
P.S.: it's a multi-threaded call, and the DB is Amazon DynamoDB.
My code till now:
Controller:
// Multi-threaded call to the REST API
@GetMapping("/myApp-multithread")
public String getQuoteOnSepThread() throws InterruptedException, ExecutionException {
    System.out.println("#################################################Multi Threaded Post Call######################");
    ExecutorService executor = Executors.newFixedThreadPool(10);
    List<Future<String>> myFutureList = new ArrayList<Future<String>>();
    long startTime = System.currentTimeMillis() / 1000;

    // Here, instead of fetching everything and calling MyCallable for each
    // record, I want to do it in batches of 10.
    Iterable<Customer> customerIterable = repo.findAll();
    List<Customer> customers = new ArrayList<Customer>();
    customerIterable.forEach(customers::add);

    for (Customer c : customers) {
        MyCallable myCallable = new MyCallable(restTemplate, c);
        Future<String> future = executor.submit(myCallable);
        myFutureList.add(future);
    }
    for (Future<String> fut : myFutureList) {
        fut.get();
    }
    executor.shutdown();

    long timeElapsed = (System.currentTimeMillis() / 1000) - startTime;
    System.out.println("->>>>>>>>>>>>>>>Time Elapsed In Multi Threaded Post Call<<<<<<<<<<<<<<<-" + timeElapsed);
    return "Success";
}
My Callable:
#Scope("prototype")
#Component
public class MyCallable implements Callable<String>{
private RestTemplate restTemplate;
private Customer c;
public MyCallable(RestTemplate rt, Customer cust) {
this.restTemplate = rt;
this.c = cust;
}
#Override
public String call() throws Exception {
System.out.println("Customer no"+ c.getId() +"On thread Number"+Thread.currentThread().getId());
restTemplate.postForObject("http://localhost:3000/save", c, String.class);
return "Done";
}
}
How can I achieve that?
Spring Data JPA offers Page and Slice to handle this case (see PagingAndSortingRepository and Pageable):
public interface PagingAndSortingRepository<T, ID extends Serializable>
        extends CrudRepository<T, ID> {

    Iterable<T> findAll(Sort sort);

    Page<T> findAll(Pageable pageable);
}
You can create a Pageable request as:
Pageable firstPageWithTwoElements = PageRequest.of(0, 2);
and pass it to your custom repository (which should extend PagingAndSortingRepository):
Page<T> pageResult = customRepository.findAll(firstPageWithTwoElements);
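Applied to the controller above, the batching loop could look roughly like this (a sketch; it assumes repo is changed to extend PagingAndSortingRepository<Customer, Long>, and note that the DynamoDB Spring Data module may page differently; this shows the plain spring-data-jpa style the answer describes):

int page = 0;
Page<Customer> batch;
do {
    // Fetch the next 10 customers and submit a REST call task for each one.
    batch = repo.findAll(PageRequest.of(page++, 10));
    for (Customer c : batch.getContent()) {
        myFutureList.add(executor.submit(new MyCallable(restTemplate, c)));
    }
} while (batch.hasNext());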

Spring Transactional method not working properly (not saving db)

I have spent day after day trying to find a solution to my problem with transactional methods. The logic is like this:
The controller receives a request, calls queueService, and puts it in a PriorityBlockingQueue; another thread processes the data (finds cards, updates their status, assigns them to the current game, returns the data).
Controller:
#RequestMapping("/queue")
public DeferredResult<List<Card>> queueRequest(#Params...){
queueService.put(result, size, terminal, time)
result.onCompletion(() -> assignmentService.assignCards(result, game,room, cliente));
}
QueueService:
@Service
public class QueueService {

    private BlockingQueue<RequestQueue> queue = new PriorityBlockingQueue<>();

    @Autowired
    GameRepository gameRepository;
    @Autowired
    TerminalRepository terminalRepository;
    @Autowired
    RoomRepository roomRepository;

    private long requestId = 0;

    public void put(DeferredResult<List<Card>> result, int size, String client, LocalDateTime time_order) {
        requestId++;
        // omitted code (find entities: game, terminal, room)
        try {
            RequestQueue request = new RequestQueue(requestId, size, terminal, time_order, result);
            queue.put(request);
        } catch (InterruptedException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
CardService:
@Transactional
public class CardService {

    @Autowired
    EntityManager em;
    @Autowired
    CardRepository cardRepository;
    @Autowired
    AsignService asignacionService;

    public List<Card> processRequest(int size, BigDecimal value) {
        List<Card> carton_query = em.createNativeQuery("{call cards_available(?,?,?)}",
                Card.class)
                .setParameter(1, false)
                .setParameter(2, value)
                .setParameter(3, size).getResultList();

        List<String> ids = new ArrayList<String>();
        carton_query.forEach(action -> ids.add(action.getId_card()));

        String update_query = "UPDATE card SET available=true WHERE id_card IN :ids";
        em.createNativeQuery(update_query).setParameter("ids", ids).executeUpdate();

        return carton_query;
    }
}
QueueExecutor (Consumer)
@Component
public class QueueExecute {

    @Autowired
    QueueService queueRequest;
    @Autowired
    AsignService asignService;
    @Autowired
    CardService cardService;

    @PostConstruct
    public void init() {
        new Thread(this::execute).start();
    }

    private void execute() {
        while (true) {
            try {
                RequestQueue request = queueRequest.take();
                if (request != null) {
                    List<Card> cards = cardService.processRequest(request.getSize(), new BigDecimal("1.0"));
                    request.getCards().setResult((ArrayList<Card>) cards);
                }
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}
AssignService:
@Transactional
public void assignCards(DeferredResult<List<Card>> cards, Game game, Room room, Terminal terminal) {
    game = em.merge(game);
    room = em.merge(room);
    terminal = em.merge(terminal);

    Order order = new Order();
    LocalDateTime datetime = LocalDateTime.now();
    BigDecimal total = new BigDecimal("0.0");
    order.setTime(datetime);
    order.setRoom(room);
    order.setGame(game);
    order.setId_terminal(terminal);

    for (Card card : (List<Card>) cards.getResult()) {
        card = em.merge(card);
        System.out.println("CARD STATUS " + card.getStatus());
        // --> this shows the OLD value of the card (not the updated one)
        card.setOrder(order);
        order.getOrder().add(card);
    }
    game.setOrder(order);
    // gameRepository.save(game)
}
With this code, the new Card status is not saved to the DB, but Game, Terminal and Room are saved OK (more or less...). If I remove assignService, CardService saves the new status to the DB correctly.
I have tried flushing manually, saving through the repository, and so on, but the result is almost the same. Could anybody help me?
I think I found a solution (probably not the optimal one), but it's more related to the logic of my program.
One of the main problems was the update of the Card status property, because it was not reflected in the entity object. When the assignCards method is called, it receives the old Card value, because it's not possible to share information between threads/transactions (as far as I know). This is normal within transactions, because em.executeUpdate() writes only to the database (not to the persistence context); so if I want the updated entity, I need to refresh it with em.refresh(entity), but that caused performance to go down.
In the end I changed the logic: first create the orders (transactional) and then assign the cards to the orders (transactional). This way it works correctly.
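For reference, the em.refresh approach mentioned above looks roughly like this (a sketch; cardId is a placeholder, and the extra round trip per entity is what made performance drop):

// The native UPDATE went straight to the database, so the managed entity
// held by this persistence context is stale.
Card card = em.find(Card.class, cardId); // still shows the old status
em.refresh(card);                        // re-reads the row from the database
System.out.println(card.getStatus());    // now shows the updated value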

DeferredResult in spring mvc

I have one class that extends DeferredResult and implements Runnable, as shown below:
public class EventDeferredObject<T> extends DeferredResult<Boolean> implements Runnable {

    private Long customerId;
    private String email;

    @Override
    public void run() {
        RestTemplate restTemplate = new RestTemplate();
        EmailMessageDTO emailMessageDTO = new EmailMessageDTO("dineshshe@gmail.com", "Hi There");
        Boolean result = restTemplate.postForObject("http://localhost:9080/asycn/sendEmail", emailMessageDTO, Boolean.class);
        this.setResult(result);
    }

    // Constructor and getters and setters
}
Now I have a controller that returns an object of the above class. Whenever a new request comes to the controller, we check whether that request is present in a HashMap (which stores the unprocessed requests at that instant). If it is not present, we create an EventDeferredObject, store it in the HashMap, and start a thread for it. If such a request is already present, we return the one from the HashMap. On completion of a request, we delete it from the HashMap.
#RequestMapping(value="/sendVerificationDetails")
public class SendVerificationDetailsController {
private ConcurrentMap<String , EventDeferredObject<Boolean>> requestMap=new ConcurrentHashMap<String , EventDeferredObject<Boolean>>();
#RequestMapping(value="/sendEmail",method=RequestMethod.POST)
public EventDeferredObject<Boolean> sendEmail(#RequestBody EmailDTO emailDTO)
{
EventDeferredObject<Boolean> eventDeferredObject = null;
System.out.println("Size:"+requestMap.size());
if(!requestMap.containsKey(emailDTO.getEmail()))
{
eventDeferredObject=new EventDeferredObject<Boolean>(emailDTO.getCustomerId(), emailDTO.getEmail());
requestMap.put(emailDTO.getEmail(), eventDeferredObject);
Thread t1=new Thread(eventDeferredObject);
t1.start();
}
else
{
eventDeferredObject=requestMap.get(emailDTO.getEmail());
}
eventDeferredObject.onCompletion(new Runnable() {
#Override
public void run() {
if(requestMap.containsKey(emailDTO.getEmail()))
{
requestMap.remove(emailDTO.getEmail());
}
}
});
return eventDeferredObject;
}
}
Now this code works fine as long as no request arrives that is identical to one already stored in the HashMap. If we send a number of different requests at the same time, the code works fine.
Well, I do not know if I understood correctly, but I think you might have race conditions in the code, for example here:
if (!requestMap.containsKey(emailDTO.getEmail())) {
    eventDeferredObject = new EventDeferredObject<Boolean>(emailDTO.getCustomerId(), emailDTO.getEmail());
    requestMap.put(emailDTO.getEmail(), eventDeferredObject);
    Thread t1 = new Thread(eventDeferredObject);
    t1.start();
} else {
    eventDeferredObject = requestMap.get(emailDTO.getEmail());
}
Think of a scenario in which you have two requests with the same key emailDTO.getEmail():
Request 1 checks whether there is a key in the map, does not find it, and puts one in.
Request 2 comes some time later, checks whether there is a key in the map, finds it, and
goes to fetch it; however, just before that, the thread started by request 1 finishes, and another thread, started by the onCompletion event, removes the key from the map. At this point,
requestMap.get(emailDTO.getEmail())
will return null, and as a result you will get a NullPointerException.
Now, this does look like a rare scenario, so I do not know if it is the problem you are seeing.
I would try to modify the code as follows (I did not run it myself, so it may contain errors):
public class EventDeferredObject<T> extends DeferredResult<Boolean> implements Runnable {

    private Long customerId;
    private String email;
    private ConcurrentMap ourConcurrentMap;

    @Override
    public void run() {
        ...
        this.setResult(result);
        ourConcurrentMap.remove(this.email);
    }

    // Constructor and getters and setters
}
So the DeferredResult implementation has the responsibility of removing itself from the concurrent map. Moreover, I do not use onCompletion to set a callback, as that seems to me an unnecessary complication. To avoid the race conditions described above, one needs to combine the check for the presence of an entry and its retrieval into one atomic operation; this is what the putIfAbsent method of ConcurrentMap does. Therefore I change the controller to:
#RequestMapping(value="/sendVerificationDetails")
public class SendVerificationDetailsController {
private ConcurrentMap<String , EventDeferredObject<Boolean>> requestMap=new ConcurrentHashMap<String , EventDeferredObject<Boolean>>();
#RequestMapping(value="/sendEmail",method=RequestMethod.POST)
public EventDeferredObject<Boolean> sendEmail(#RequestBody EmailDTO emailDTO)
{
EventDeferredObject<Boolean> eventDeferredObject = new EventDeferredObject<Boolean>(emailDTO.getCustomerId(), emailDTO.getEmail(), requestMap);
EventDeferredObject<Boolean> oldEventDeferredObject = requestMap.putIfAbsent(emailDTO.getEmail(), eventDeferredObject );
if(oldEventDeferredObject == null)
{
//if no value was present before
Thread t1=new Thread(eventDeferredObject);
t1.start();
return eventDeferredObject;
}
else
{
return oldEventDeferredObject;
}
}
}
If this does not solve your problem, I hope it at least gives you some ideas.
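A possible variation (my addition, not part of the original answer): ConcurrentHashMap.computeIfAbsent also makes the check-and-insert atomic, and it only constructs the EventDeferredObject when the key is genuinely absent:

// Sketch: create and register atomically; start the worker thread only
// if this call actually created the entry.
AtomicBoolean created = new AtomicBoolean(false);
EventDeferredObject<Boolean> edo = requestMap.computeIfAbsent(emailDTO.getEmail(), key -> {
    created.set(true);
    return new EventDeferredObject<Boolean>(emailDTO.getCustomerId(), key, requestMap);
});
if (created.get()) {
    new Thread(edo).start();
}
return edo;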
