How to implement caching for an API that returns the detail of an object

I have 2 services:
Service A: manages entity A (class A) with CRUD operations.
Service B: needs to read the data of instances of class A.
How should I implement caching for the API that returns the detail of an object A? I see two options:
Method 1:
Service A:
Create/Update/Delete writes to the DB and syncs the change to Redis.
The API that returns the detail of an object reads from the DB.
Service B: reads from Redis first; on a cache miss, it calls Service A's detail API.
Method 2:
Service A:
Create/Update/Delete writes to the DB and syncs the change to Redis.
The API that returns the detail of an object reads from Redis first and, on a cache miss, falls back to the DB (a sketch of this read path follows below).
Service B: always calls Service A's detail API.
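For reference, here is a minimal sketch of Method 2's read path inside Service A, assuming Spring Data Redis and a JPA repository; ObjectA, ObjectARepository, the "objectA:" key prefix and the 10-minute TTL are all hypothetical choices, not part of the question:

import java.time.Duration;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.stereotype.Service;

@Service
public class ObjectAService {

    private final ObjectARepository repository;           // JPA repository (assumed)
    private final RedisTemplate<String, ObjectA> redis;   // configured with a JSON serializer (assumed)

    public ObjectAService(ObjectARepository repository, RedisTemplate<String, ObjectA> redis) {
        this.repository = repository;
        this.redis = redis;
    }

    // Read path: Redis first, DB on a cache miss, then repopulate the cache.
    public ObjectA getDetail(long id) {
        String key = "objectA:" + id;
        ObjectA cached = redis.opsForValue().get(key);
        if (cached != null) {
            return cached;
        }
        ObjectA fromDb = repository.findById(id)
                .orElseThrow(() -> new IllegalArgumentException("ObjectA " + id + " not found"));
        redis.opsForValue().set(key, fromDb, Duration.ofMinutes(10));
        return fromDb;
    }

    // Write path: update the DB and keep Redis in sync, as in both methods above.
    public ObjectA update(ObjectA objectA) {
        ObjectA saved = repository.save(objectA);
        redis.opsForValue().set("objectA:" + saved.getId(), saved, Duration.ofMinutes(10));
        return saved;
    }
}

With this in place, Service B can stay a plain HTTP client of Service A (Method 2), or read the same Redis keys directly and fall back to the API on a miss (Method 1).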

Related

Caching results from calls to microservices

Let's say I have a microservice A that is a REST service. The user can send it a request that performs some kind of math calculation based on the parameters sent. Microservice A has to call a microservice B to get the calculation, and microservice A has two steps: first, a validation service that performs several validations on the request, including checking that the value of the calculation falls within some range; and second, a service that stores the calculation obtained from microservice B in a table.
So, in order to validate this calculation and save it, I have the following options:
Option 1: In the validation service of microservice A, microservice A calls microservice B to get the calculation and validates it. Then, if validation passes, the service of microservice A that stores the calculation calls microservice B again to get the calculation and stores it in the database table.
Option 2: In the validation service of microservice A, microservice A calls microservice B to get the calculation and validates it. Then, if validation passes, the calculation is passed along in the request to the service of microservice A that stores it. This way, the storing service does not have to call microservice B again to get the calculation.
I am using option 1 to separate concerns, because I think the validation service should just validate and nothing else; by that I mean it should not "store" the calculation in the request as is done in option 2. But with option 1 you have to call microservice B twice, and I don't know if that is a bad design.
Thanks
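One way to keep option 1's separation of concerns without paying for two remote calls is to cache microservice B's response for a given set of parameters, so the validation step and the storing step can each ask for the calculation but only the first request goes over the wire. A hedged sketch using Spring's caching abstraction; CalculationClient, CalculationRequest, CalculationResult, the cache name and the URL are all hypothetical names:

import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestTemplate;

@Service
public class CalculationClient {

    private final RestTemplate restTemplate;

    public CalculationClient(RestTemplate restTemplate) {
        this.restTemplate = restTemplate;
    }

    // The result is cached by the request's equals()/hashCode(), so the second
    // call (from the storing service) is served from the cache, not from B.
    // Requires @EnableCaching and, ideally, a short TTL on the "calculations" cache.
    @Cacheable(cacheNames = "calculations", key = "#request")
    public CalculationResult getCalculation(CalculationRequest request) {
        return restTemplate.postForObject(
                "http://microservice-b/calculations", request, CalculationResult.class);
    }
}

Both the validation service and the storing service then depend on CalculationClient; whether you prefer this or simply passing the validated calculation forward (option 2) is mostly a question of how independent you want the two steps to stay.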

WebFlux - handle each item asynchronously before returning

I am fairly new to WebFlux and I am looking for what seems to be a pretty common usage pattern. Basically, I have a Spring controller which returns a Flux<A> (where A is a row fetched from the DB using R2DBC). I want to perform an async operation on each emitted object; for instance, I want to send a push notification for each object, which also requires a DB call to fetch the user's push token before sending the push. These operations should run asynchronously, so the API end-users receive their data with no delay. Is there an existing pattern for this?
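One common approach is to treat the push as a side effect of the stream and subscribe to it on its own scheduler, so it never delays the response. A minimal sketch, assuming Reactor and R2DBC; RowRepository, PushTokenRepository, PushService and their methods are hypothetical names:

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;
import reactor.core.scheduler.Schedulers;

@RestController
public class RowController {

    private final RowRepository rows;          // R2DBC repository returning Flux<A> (assumed)
    private final PushTokenRepository tokens;  // looks up the user's push token, returns a Mono (assumed)
    private final PushService push;            // sends the notification, returns a Mono (assumed)

    public RowController(RowRepository rows, PushTokenRepository tokens, PushService push) {
        this.rows = rows;
        this.tokens = tokens;
        this.push = push;
    }

    @GetMapping("/rows")
    public Flux<A> getRows() {
        return rows.findAll()
                // For each emitted row, start an independent pipeline for the push
                // notification and subscribe to it on a separate scheduler, so the
                // HTTP response is not delayed by the extra DB call or the push.
                .doOnNext(row -> tokens.findByUserId(row.getUserId())
                        .flatMap(token -> push.send(token, row))
                        .subscribeOn(Schedulers.boundedElastic())
                        .subscribe());
    }
}

The fire-and-forget subscribe() means errors in the push pipeline are not reported to the caller; if you need delivery guarantees, publishing an event to a queue instead of subscribing inline is the usual next step.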

How to send an entire Entity Framework Core context from server to client

I have an ASP.NET Core server application that uses Entity Framework Core to provide data from its SQL Server database.
There are clients that can consume the data via REST API calls.
At the beginning of the communication, the clients need to have all the data, but using the existing REST calls this takes minutes, as the context contains thousands of entities.
So I looked for ideas and tried the following.
The most promising idea was some kind of serialization, so I created the following method:
public byte[] GetData()
{
    // Serialize the context's ChangeTracker to JSON (ignoring reference loops),
    // then GZip-compress the result before sending it to the client.
    string data = Newtonsoft.Json.JsonConvert.SerializeObject(
        this.ChangeTracker,
        new Newtonsoft.Json.JsonSerializerSettings
        {
            ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore
        });
    return CompressAsGZip(data);
}
The results:
Serializing (and then compressing) the ChangeTracker of the context: initially the ChangeTracker is empty, so I can't do this unless I query all the data first.
Serializing (and then compressing) the entire DbContext: it has so many objects that at about 20% I got an OutOfMemoryException.
Should I create a database backup and send the compressed .bak file? I guess I couldn't restore it anywhere, as the client's database provider is different (SQLite).
What would be the best way to send all the data to the client as fast as possible?

Method caching with Spring Boot and Hazelcast. How and where do I specify my refresh/reload intervals?

I realise the @Cacheable annotation helps me cache the result of a particular method call, and subsequent calls are returned from the cache if there are no changes to the arguments etc.
I have a requirement where I'm trying to minimise the number of calls to the DB, and hence I load the entire table. However, I would like to reload this data, say, every day, just to ensure that my cache is not out of sync with the underlying data in the database.
How can I specify such reload/refresh intervals?
I'm trying to use Spring Boot and Hazelcast. All the examples I have seen talk about specifying LRU/LFU etc. policies in the config file for maps, but nothing at a method level.
I can't go with the LRU/LFU eviction policies as I intend to reload the entire table data every x hours or x days.
Kindly help or point me to any such implementation or docs.
Spring's @Cacheable doesn't support this kind of policy at the method level. See, for example, the code for CacheableOperation.
If you are using Hazelcast as your cache provider for Spring, you can explicitly evict elements or load data by using the corresponding IMap from your HazelcastInstance.
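Two common ways to get a time-based refresh, sketched below under the assumption that the cache is backed by a Hazelcast map named "tableCache" (a hypothetical name): give the backing map a time-to-live so entries expire and are reloaded on the next call, or evict the whole cache on a schedule.

import java.util.concurrent.TimeUnit;
import com.hazelcast.config.Config;
import com.hazelcast.config.MapConfig;
import org.springframework.cache.annotation.CacheEvict;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

// Option 1: entries in the map backing the "tableCache" cache expire after a day,
// so the next @Cacheable call reloads them from the database.
@Configuration
class HazelcastTtlConfig {

    @Bean
    Config hazelcastConfig() {
        MapConfig mapConfig = new MapConfig("tableCache");
        mapConfig.setTimeToLiveSeconds((int) TimeUnit.DAYS.toSeconds(1));
        Config config = new Config();
        config.addMapConfig(mapConfig);
        return config;
    }
}

// Option 2: clear the whole cache on a fixed schedule (requires @EnableScheduling);
// the next call to the cached method repopulates it.
@Component
class TableCacheRefresher {

    @Scheduled(cron = "0 0 2 * * *") // every day at 02:00
    @CacheEvict(cacheNames = "tableCache", allEntries = true)
    public void evictTableCache() {
        // Intentionally empty: @CacheEvict does the work.
    }
}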

Domain and service methods for banking transaction

I am trying to learn the Spring framework and I see that many of the examples use domain and service objects (domain-driven design?), but I am not able to really understand how to arrive at them. For example, I am trying to model a simple banking application which has customers, accounts and transactions. Below is the draft model I have:
Domain objects:
Customer
id
userName
password
Account:
id
customerId
balance
Transaction:
id
accountId
amount
transactionDate
transactionType
Service objects:
AccountService:
create(Account)
update(Account)
debit(Account,amount,date,remarks)
credit(Account,amount,date,remarks)
transfer(fromAccount,toAccount,amount,remarks)
findAccountsByCustomerId(customerId)?
CustomerService:
findCustomerByName()
findAccountsByCustomerId(customerId)?
Should the CustomerService or the AccountService have the method findAccountsByCustomerId(customerId)?
Which domain/service object should represent the debit/credit transaction? Should the debit() and credit() methods be defined on the Account domain object or on the service object? I would like to persist the transactions rather than just update the balance.
Should all the business logic be in the service layer? I see that most of the Spring examples out there are written this way.
Since the idea here is to retrieve Accounts, it should be in AccountService.
Both methods look fine in AccountService to me, because you're operating on accounts. If you want to persist the transaction, you could have a TransactionDao handle this for you, and you'd call it from your AccountService each time you need it. Doing both in your AccountService methods allows you to be transactional: you don't want to persist the transaction object if updating the balance raised an exception.
Services are useful when you have business logic which does not belong in your DAO layer. The latter is supposed to query your database and give you back the appropriate domain objects, whereas services are mostly used for additional treatment such as handling transactions or DTO mapping.
You should take a look at the official Spring sample app PetClinic to give you an idea.
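A minimal sketch of what that could look like, assuming the model above plus hypothetical AccountDao and TransactionDao interfaces and a hypothetical Transaction constructor; it is only meant to show the shape of a transactional debit() in the service layer:

import java.math.BigDecimal;
import java.time.LocalDate;
import java.util.List;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class AccountService {

    private final AccountDao accountDao;          // persists/loads Account (assumed)
    private final TransactionDao transactionDao;  // persists Transaction rows (assumed)

    public AccountService(AccountDao accountDao, TransactionDao transactionDao) {
        this.accountDao = accountDao;
        this.transactionDao = transactionDao;
    }

    // The balance update and the Transaction row are written in the same DB
    // transaction: if either fails, both are rolled back.
    @Transactional
    public void debit(Account account, BigDecimal amount, LocalDate date, String remarks) {
        account.setBalance(account.getBalance().subtract(amount));
        accountDao.update(account);
        transactionDao.save(new Transaction(account.getId(), amount.negate(), date, "DEBIT", remarks));
    }

    public List<Account> findAccountsByCustomerId(long customerId) {
        return accountDao.findByCustomerId(customerId);
    }
}

credit() follows the same pattern, and transfer() can be another @Transactional service method that adjusts both accounts and records two Transaction rows.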
