Update currency rates in database or file - Laravel

I'm using Laravel 5.7.
I need to fetch exchange rates, and I want to create a scheduler that runs every 5 minutes.
Now, do you think it's possible to put the currency information in a config file, or save it to the database? And I assume a cache could help here too, right?

In your situation the data changes only every 5 minutes. So when you fetch the rates, store them in the database, then cache the new values and serve users from the cache. When the next 5 minutes have passed and you fetch the newest rates, clear the existing cache, store the new values in the database, load them from the database into the cache, and again serve users from the fresh cached content.
If your database grows day by day, you must add extra logic so you are not pushing millions of records into the cache every 5 minutes. I advise against storing highly dynamic content in the cache. If you do want to cache, you can use Redis, Memcached, etc., but keep in mind you must clear the cache and store the new content on each cycle.
If you only fetch daily rates every 5 minutes and don't store anything in the database, you can use the logic above without the database step: save the records directly into the cache and clear them when new rates arrive.
Also, add an index to the database table so reads are faster.
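A minimal sketch of the refresh cycle described above, using Python dicts as stand-ins for the cache and database (in Laravel this would be a scheduled command using the Cache and DB facades; `fetch_rates` and the key name are hypothetical):

```python
# In-memory stand-ins for the cache and the database.
cache = {}
database = {}

RATES_KEY = "exchange_rates"

def fetch_rates():
    """Placeholder for the external exchange-rate API call."""
    return {"USD": 1.0, "EUR": 0.92}

def refresh_rates():
    """Scheduled job, run every 5 minutes."""
    rates = fetch_rates()
    database[RATES_KEY] = rates      # persist the new rates
    cache.pop(RATES_KEY, None)       # clear the stale cache entry
    cache[RATES_KEY] = dict(rates)   # re-prime the cache with fresh values

def get_rates():
    """Request path: serve from cache, fall back to the database."""
    if RATES_KEY in cache:
        return cache[RATES_KEY]
    rates = database.get(RATES_KEY, {})
    cache[RATES_KEY] = dict(rates)
    return rates
```

Users always read from the cache; only the scheduled job touches the external API and the database.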

How would Redis get to know if it has to return cached data or fresh data from DB

Say I'm fetching thousands of records from the DB using some long-running task and caching them in Redis. The next day somebody changes a few records in the DB.
Next time, how would Redis know whether it should return the cached data or revisit all those thousands of records in the DB?
How is this synchronisation achieved?
Redis has no idea whether the data in DB has been updated.
Normally, we use Redis to cache data as follows:
Client checks if the data, e.g. key-value pair, exists in Redis.
If the key exists, client gets the corresponding value from Redis.
Otherwise, it gets data from DB, and sets it to Redis. Also client sets an expiration, say 5 minutes, for the key-value pair in Redis.
Then any subsequent requests for the same key will be served by Redis. Although the data in Redis might be out-of-date.
However, after 5 minutes, this key will be removed from Redis automatically.
Go to step 1.
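The steps above are the classic cache-aside pattern with a TTL. A minimal Python sketch, simulating Redis key expiration with timestamps (the dict stand-ins, key names, and sample data are hypothetical):

```python
import time

redis = {}                 # stand-in for Redis: key -> (value, expires_at)
db = {"user:1": "Alice"}   # stand-in for the database
TTL = 300                  # 5 minutes, as in the steps above

def get(key):
    entry = redis.get(key)
    if entry is not None:
        value, expires_at = entry
        if time.time() < expires_at:          # step 2: cache hit
            return value
        del redis[key]                         # expired; real Redis evicts this itself
    value = db.get(key)                        # step 3: miss, read from DB
    redis[key] = (value, time.time() + TTL)    # set value with expiration
    return value
```

Note that a write to the DB inside the TTL window is not visible until the key expires, which is exactly the staleness trade-off discussed below.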
So in order to keep your data in Redis up to date, you can set a short expiration time. However, your DB then has to serve lots of requests.
If you want to greatly decrease requests to the DB, you can set a long expiration time, so that most of the time Redis can serve requests with possibly stale data.
You should consider carefully the trade-off between performance and stale data.
Since the source of truth resides in your database and you push data from the DB to Redis, you always have to update from the DB to Redis; at the very least, create another process to sync the data.
My suggestion is to run a first full load from the DB into Redis and then use a sync process that, every time you notice an update/creation/deletion operation in your database, pushes the change to Redis.
I don't know which Redis structure you are using to store database records, but I guess it could be a hash, probably keyed by your table's primary key, so the sync operation is immediate: if a record is created in your database you issue an HSET, if deleted an HDEL, and so on.
You could even omit the first full sync from DB to Redis, and just clear Redis and start the sync process.
If you cannot do the above for some reason, you can create a syncer daemon which constantly reads data from the database and compares it with the data stored in Redis; if the two differ you update, and if an entry exists on only one side you delete or create it in Redis accordingly.
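The hash-based sync idea can be sketched as follows, with a plain dict standing in for the Redis hash (with a real client such as redis-py, the commented HSET/HDEL calls would be used instead; the handler names are hypothetical):

```python
# Stand-in for a Redis hash keyed by the table's primary key.
redis_hash = {}

def on_row_created(row_id, row):
    redis_hash[row_id] = row       # HSET table_hash <row_id> <row>

def on_row_updated(row_id, row):
    redis_hash[row_id] = row       # HSET overwrites the existing field

def on_row_deleted(row_id):
    redis_hash.pop(row_id, None)   # HDEL table_hash <row_id>
```

These handlers would be invoked by whatever mechanism notices DB changes (triggers, application-level events, or a change-data-capture stream).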
My solution is:
Whenever you update, delete or add data in the database, delete all the data in Redis. In your GET route, check whether the data exists in Redis; if not, store all the data from the DB into Redis.
You may use @CacheEvict on any update/delete applied to the DB. That will clear the corresponding values from the cache, so the next query will fetch from the DB.

Update Ignite cache with timestamped data

My issue is: how do I update the cache with new entries from the database table?
Suppose my cache holds my Cassandra table data as of 3 p.m.
By that time the user has purchased 3 items, so my cache has 3 entries associated with that user.
But what if, some time later (say 30 minutes), the user purchases 2 more items?
Since I already have 3 entries in the cache, it won't query the database, so how do I get those 2 new entries when calculating the final bill?
One option I have is to call cache.loadCache(null, null) every 15 minutes, but calling it every time is not really feasible.
The better option here is to insert data not directly into Cassandra, but through Ignite. That way the cache always has up-to-date data without running any additional synchronization with the DB.
But if you choose to run loadCache each time, you can add a timestamp to your objects in the DB and implement your own CacheStore with an additional method that loads only new data from the DB. The Ignite documentation on implementing a custom CacheStore will help you here.

Redis cache strategy

I'm developing a website to display articles, much like Stack Overflow.
Each article contains a title, a description, and some frequently changed fields (like view_count).
The website also supports cursor paging (max_id, since_id) and filtering (by category and tags).
I want to add a cache layer using Redis. There are some choices in my mind:
Option 1: Use a zset to store the top 1000 article ids. Each filter has its own zset, like articles:category:{category_id} and articles:tag:{tag_id}. Each zset is updated when a new article is published.
Use a hash to store each article, like article:{article_id}.
Update view_count directly in the cache on every view, and sync it to the DB at some point.
Implement cursor paging using ZRANGEBYSCORE (the score is the publishing timestamp).
Pros: the cache never needs to expire; new articles are shown immediately.
Cons: difficult to implement and may be error-prone; needs some messaging mechanism like RabbitMQ.
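The zset-based cursor paging can be sketched like this, with a sorted Python list standing in for the sorted set (against real Redis you would call ZADD and ZREVRANGEBYSCORE instead; the sample data is hypothetical):

```python
import bisect

# Stand-in for a Redis sorted set: (score, member) pairs kept sorted by score.
# score = publishing timestamp, member = article id.
zset = []

def zadd(score, member):
    bisect.insort(zset, (score, member))

def page_before(max_score, limit):
    """Cursor paging: newest articles strictly older than max_score.
    Mirrors ZREVRANGEBYSCORE key (max_score -inf LIMIT 0 limit."""
    idx = bisect.bisect_left(zset, (max_score, ""))
    older = zset[:idx]
    return [member for _, member in reversed(older)][:limit]
```

The client passes the timestamp of the last article it saw as the next cursor, so paging stays stable even as new articles are inserted.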
Option 2: Use the database for filtering and paging, return only the id list, and cache that id list in Redis with a short TTL (10 seconds). Still use a hash to cache each article, so view_count can be updated immediately.
Pros: easy to implement; no messaging mechanism needed; only the id list needs to be re-queried when the cache expires.
Cons: the database is queried for a new id list every 10 seconds, and a new article shows up only after 10 seconds, though view_count is updated immediately.
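That second option (short-TTL id list plus per-article hashes with live view counts) might look roughly like this; `db_query_ids`, the TTL, and the sample data are placeholders:

```python
import time

id_cache = {}   # query signature -> (id list, expires_at)
articles = {}   # article_id -> article hash; view_count updated in place
ID_TTL = 10     # seconds, as in the option above

def db_query_ids(category):
    """Placeholder for the SQL filter/paging query returning ids only."""
    return [3, 2, 1]

def get_article_ids(category):
    entry = id_cache.get(category)
    if entry and time.time() < entry[1]:
        return entry[0]                       # id list still fresh
    ids = db_query_ids(category)              # at most one DB hit per TTL window
    id_cache[category] = (ids, time.time() + ID_TTL)
    return ids

def bump_view_count(article_id):
    articles[article_id]["view_count"] += 1   # written straight to the cache
```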
Option 3: Use Redis to cache the article list for each query, serialized to JSON, with a TTL.
Pros: easiest to implement (use Spring's @Cacheable).
Cons: the cache expires frequently and the database must be queried again; view_count is only updated when the cache expires.
I don't know which option is better, for performance and stability. Thank you for your help.

Caching expensive SQL query in memory or in the database?

Let me start by describing the scenario. I have an MVC 3 application with SQL Server 2008. In one of the pages we display a list of Products that is returned from the database and is UNIQUE per logged in user.
The SQL query (actually a VIEW) used to return the list of products is VERY expensive.
It is based on very complex business requirements which cannot be changed at this stage.
The database schema cannot be changed or redesigned as it is used by other applications.
There are 50k products and 5k users (each user may have access to 1 up to 50k products).
In order to display the Products page for the logged in user we use:
SELECT TOP X * FROM [VIEW] WHERE UserID = @UserId -- where 'X' is the size of the page
The query above returns a maximum of 50 rows (maximum page size). The WHERE clause restricts the number of rows to a maximum of 50k (products that the user has access to).
The page is taking about 5 to 7 seconds to load and that is exactly the time the SQL query above takes to run in SQL.
Problem:
The user goes to the Products page and very likely uses paging, re-sorts the results, goes to the details page, etc and then goes back to the list. And every time it takes 5-7s to display the results.
That is unacceptable, but at the same time the business team has accepted that the first time the Products page is loaded it can take 5-7s. Therefore, we thought about CACHING.
We now have two options to choose from, the most "obvious" one, at least to me, is using .Net Caching (in memory / in proc). (Please note that Distributed Cache is not allowed at the moment for technical constraints with our provider / hosting partner).
But I'm not very comfortable with this. We could end up with lots of products in memory (when there are 50 or 100 users logged in simultaneously) which could cause other issues on the server, like .Net constantly removing cache items to free up space while our code inserts new items.
The SECOND option:
The main problem here is that it is very EXPENSIVE to generate the User x Product x Access view, so we thought we could create a flat table (or in other words a CACHE of all products x users in the database). This table would be exactly the result of the view.
However the results can change at any time if new products are added, user permissions are changed, etc. So we would need to constantly refresh the table (which could take a few seconds) and this started to get a little bit complex.
Similarly, we thought we could implement some sort of cache provider: upon a request from a user, we would run the original SQL query against the view (5-7s, acceptable only once) and save the result in a flat table called ProductUserAccessCache in SQL. On the next request, we would get the values from this cached table (as we could easily identify that the results were cached for that particular user) with a fast query, without calculations in SQL.
Any time a product was added or a permission changed, we would truncate the cached-table and upon a new request the table would be repopulated for the requested user.
It doesn't seem too complex to me, but what we are doing here basically is creating a NEW cache "provider".
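A rough sketch of the described cached-table provider, with a dict standing in for the ProductUserAccessCache table (the real version would run SQL against SQL Server; `run_expensive_view` is a placeholder for the 5-7s view query):

```python
# Stand-in for the ProductUserAccessCache table: user_id -> product rows.
product_cache = {}

def run_expensive_view(user_id):
    """Placeholder for the expensive SELECT against the view (5-7s)."""
    return [{"product_id": p, "user_id": user_id} for p in (10, 20)]

def get_products(user_id):
    if user_id not in product_cache:                  # first hit: slow path
        product_cache[user_id] = run_expensive_view(user_id)
    return product_cache[user_id]                     # later hits: fast path

def invalidate_all():
    """Called whenever a product is added or a permission changes
    (equivalent to TRUNCATE TABLE ProductUserAccessCache)."""
    product_cache.clear()
```

After an invalidation, each user pays the 5-7s cost once more on their next request, which matches what the business team accepted for the first load.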
Does any one have any experience with this kind of issue?
Would it be better to use .Net Caching (in proc)?
Any suggestions?
We were facing a similar issue some time ago, and we were thinking of using EF caching in order to avoid the delay in retrieving the information. Our problem was a 1-2 second delay. There is some material out there on how to cache a table by extending EF that might help. One of the drawbacks of caching is how fresh the information needs to be, so you set your cache expiration accordingly. Depending on that expiration, users might have to wait longer than they would like for fresh info, but if your users can accept that they might be seeing outdated info in order to avoid the delay, then the trade-off is worth it.
In our scenario, we decided it was better to have fresh info than quick info, but as I said before, our waiting period wasn't that long.
Hope it helps.

CodeIgniter web page caching and database caching both?

I am still new to the CodeIgniter framework. Today I read about Database Caching (http://codeigniter.com/user_guide/database/caching.html) and Web Page Caching (http://codeigniter.com/user_guide/general/caching.html).
I am a bit confused whether database caching makes much sense once the page view is already cached. If the page is in the cache, it won't hit the database anyway.
The only point I see is in the following scenario:
If I load 30 results from the DB, then use PHP to shuffle them and pull 10 results from the array. Next time, when the page cache is deleted, I will still have the 30 DB results in cache, but the shuffle will produce a different selection from those 30.
Am I missing something? Is there any other scenario where having a database cache would bring any benefit when also using page caching?
Database caching can benefit you even when using page caching, if your page is generated by several database queries where some data is constant while other data changes frequently.
In that case you will want to set the page caching to a short time period, retrieving the frequently changing data from the database each time, while reusing the same constant data without querying the database.
Example: let's say your frequent data needs to be refreshed every 5 minutes while the constant data changes every 24 hours. In that case you set the page caching to 5 minutes. Over a period of 24 hours you have queried the database 288 times for the frequent data but only once for the constant data. That totals 289 queries instead of the 576 you would have made without database caching.
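The arithmetic in that example can be checked directly: 288 page-cache rebuilds per day means 288 queries for the frequent data plus 1 for the constant data with database caching, versus 2 x 288 without it:

```python
HOURS = 24
PAGE_TTL_MIN = 5

page_regens = HOURS * 60 // PAGE_TTL_MIN   # 288 page-cache rebuilds per day
with_db_cache = page_regens + 1            # frequent data 288x, constant data 1x
without_db_cache = page_regens * 2         # both queried on every rebuild
```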
