Codeigniter Web Page Caching and Database Caching both? - codeigniter-2

I am still new to the CodeIgniter framework. Today I read about Database Caching http://codeigniter.com/user_guide/database/caching.html and Web Page Caching http://codeigniter.com/user_guide/general/caching.html.
I am a bit confused about whether database caching makes much sense once the page view is already cached: if the page is in the cache, it won't go to the database anyway.
The only benefit I can see is in the following scenario:
I load 30 results from the database, then use PHP to shuffle them and pull 10 results from the array. The next time the page cache is deleted, I will still have the 30 database results in the cache, but the shuffle will produce a different selection from those 30 results.
Am I missing something? Is there any other scenario where the database cache brings a benefit when page caching is also used?

Database caching can still benefit you when page caching is used: namely, when your page is generated by several database queries where some data is constant while other data changes frequently.
In this case you will want to set the page cache to a short period and retrieve the fresh data from the database each time the page is regenerated, while serving the constant data from the database cache without querying the database.
Example: let's say your frequently changing data needs to be refreshed every 5 minutes, while the constant data changes every 24 hours. In that case you set the page cache to 5 minutes. Over a period of 24 hours you query the database 288 times for the frequent data but only once for the constant data. That totals 289 queries instead of the 576 you would run without database caching.
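For illustration, here is a minimal sketch of how the two caches could be combined in a CodeIgniter 2 controller. The controller, table and view names are made up, and the database cache directory is assumed to be configured and writable (see application/config/database.php).

```php
<?php
// Sketch only: combines web page caching (5 minutes) with database caching
// for the query whose result rarely changes. All names are illustrative.
class News extends CI_Controller {

    public function index()
    {
        // Web page caching: the fully rendered page is served from the cache
        // for 5 minutes, so nothing below runs during that window.
        $this->output->cache(5);

        // Database caching: wrap only the "constant" query, so each time the
        // page cache expires and the page is rebuilt, this result is read
        // from the cache directory instead of hitting the database.
        $this->db->cache_on();
        $categories = $this->db->get('categories')->result();
        $this->db->cache_off();

        // The frequently changing data is queried normally, i.e. once per
        // page regeneration (every 5 minutes).
        $latest = $this->db->order_by('created', 'desc')
                           ->limit(30)
                           ->get('articles')
                           ->result();

        $this->load->view('news/index', array(
            'categories' => $categories,
            'latest'     => $latest,
        ));
    }
}
```

When the constant data does change (once every 24 hours in the example above), the stale file can be removed with $this->db->cache_delete('news', 'index'); or $this->db->cache_delete_all();.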

Related

How to stop ABP from querying the empty AbpUserOrganizationUnits table

We use ABP with an MSSQL database hosted on Azure.
As part of cost optimization we need to make as few requests to the database as possible.
During my investigation I found out that ABP makes around 50 million requests per month to the AbpUserOrganizationUnits table, and we don't use this table at all.
I would like to disable all calls to this table.
Making 50 million requests to the database just to receive an empty result set is not what a normal product should do.
I would like to know if there is a way to stop these requests, redirect them to a Redis cache, or stop them inside the API.

Update currency rates in database or file

I'm using Laravel 5.7.
I need to fetch exchange rates, and I want to create a scheduler that runs every 5 minutes.
Do you think it's possible to put the currency information in a config file or save it to the database? And of course, I think a cache could be used as well, right?
In your situation the data changes only every 5 minutes. So when you fetch the rates, store them in the database, then cache the new values and serve the rates to the user from the cached copy. When the next 5 minutes have passed and you fetch the newest rates, clear the existing cache, store the new values in the database, cache them again, and once more serve the rates from the new cached content.
If your database grows day by day, you must add extra logic so you are not pushing millions of records into the cache every 5 minutes. I advise against storing highly dynamic content in the cache. If you do want to cache, you can use Redis, Memcached, etc., but remember to clear the cache and store the new content each cycle.
If you only fetch the daily rates every 5 minutes and don't store anything in the database, you can use the same logic without the database step: save the records directly into the cache and clear it when new rates are fetched.
Also add an index to the database table so reads are faster.
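As a rough sketch of that flow in Laravel 5.7, an artisan command scheduled every five minutes could fetch the rates, persist them and replace the cached copy. The provider URL, the Rate model and the 'rates' cache key below are assumptions for illustration.

```php
<?php

namespace App\Console\Commands;

use App\Rate;                       // hypothetical Eloquent model
use Illuminate\Console\Command;
use Illuminate\Support\Facades\Cache;

class FetchCurrencyRates extends Command
{
    protected $signature = 'rates:fetch';
    protected $description = 'Fetch exchange rates, store them and refresh the cache';

    public function handle()
    {
        // 1. Get the newest rates from the external provider (placeholder URL).
        $rates = json_decode(file_get_contents('https://example.com/api/rates'), true);

        // 2. Persist them in the database.
        foreach ($rates as $currency => $value) {
            Rate::updateOrCreate(['currency' => $currency], ['value' => $value]);
        }

        // 3. Replace the cached copy; readers elsewhere call Cache::get('rates').
        Cache::forget('rates');
        Cache::put('rates', Rate::all(), 5); // TTL is in minutes in Laravel 5.7
    }
}
```

The command would then be registered in app/Console/Kernel.php with $schedule->command('rates:fetch')->everyFiveMinutes();.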

Drupal 7.0 Views 7.x-3.14 Time-based cache is not functioning as expected

I must admit some of this caching stuff is over my head, so this may be a misunderstanding on my part of how this is supposed to function.
But essentially I have a View with the following traits:
pulling six different fields (2 text, 2 date, 1 boolean, 1 image - although that's just a text string too, right?)
four filters with two exposed to users
full pager showing 15 items per page (30+ items was making MySQL go away)
sorted by date and sticky
time-based cache is turned on. Settings: query results: never, rendered output: 5 mins
Amount of data being pulled is huge: over 4,700 records
Only other caching solution on the site is stock Drupal page caching for anonymous and blocks, both enabled.
Cron is running every day although I suspect it fails sometimes.
The default filters are supposed to show all events from "time - now" until there are no more future event nodes.
Problem is, the cache sometimes shows "Now" as being yesterday or two days ago.
Shouldn't the cache refresh every five minutes? Am I misunderstanding how this setting works?
Shouldn't the View show up-to-date data even if cron doesn't run, or is cache expiration dependent on the cron running successfully? Or is the stock Drupal page cache overriding the Views cache for anonymous users?
Thanks!
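One way to check whether the stock page cache, rather than the Views time-based cache, is serving the stale "Now" is to flush the relevant bins programmatically and see if the problem disappears. A minimal diagnostic sketch for a small custom module (the module name is hypothetical; the bin names are the stock Drupal 7 / Views 3 ones):

```php
<?php
// mymodule.module - hypothetical module, diagnostic sketch only.

/**
 * Implements hook_cron().
 *
 * Flushes the Views result/output caches and the anonymous page cache so a
 * stale "now" value cannot outlive a cron run.
 */
function mymodule_cron() {
  // Views 3 stores query results and rendered output in these two bins.
  cache_clear_all('*', 'cache_views_data', TRUE);
  cache_clear_all('*', 'cache_views', TRUE);

  // Stock Drupal 7 page cache used for anonymous visitors.
  cache_clear_all('*', 'cache_page', TRUE);
}
```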

Will Caching be useful when we need multiple items in one go

We are working on an e-commerce site where an admin can store some configuration for a Product-Category-Manufacturer or Product-Category combination.
We have some reports that can return 10,000 product transactions (with 100-1000 unique Product-Category-Manufacturer combinations).
In these reports we also need to use the configuration.
One option could be to fetch the configurations for all unique Product-Category-Manufacturer combinations from the same stored procedure.
Another option could be to cache all these combinations in an out-of-process cache (like Redis). Once the transaction data is fetched from the stored procedure, the system would pull the data from the cache for all 1000 Product-Category-Manufacturer combinations. But in this case we would have to request the cache 1000 times, and if some keys are not found in the cache, we would have to hit the database.
In fact, there can be combinations for which no data exists in the database. If we request those combinations, the system will not find them in the cache and will hit the database every time. To resolve this, we would have to maintain a set of all the Product-Category-Manufacturer combinations for which data is available in the cache.
Could anybody suggest whether a cache would be useful in this case?
We use caching mainly on 2 occasions:
To reduce latency: the cache is closer to the client, so the resource takes less time to reach the client.
To reduce network traffic: often a resource is reusable but is fetched from the original source every time, which is costly and generates unnecessary traffic. Adding a cache layer solves this.
So to answer your question, "Will caching be useful when we need multiple items in one go?", you have to think about the above 2 points: how much you are reusing (the cache hit percentage), and the cost difference between a cache call and a call to the original source.
If your issue is getting 1000 items at once, Redis has no problem providing that, and it will be much faster than the transactional DB. You can also keep a set of all the Product-Category-Manufacturer combinations, which is better since there will be no cache misses. However, think about the size of the Redis DB before you proceed.
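As a rough sketch of the batched lookup (shown with the phpredis extension for illustration; the key layout, the $combinations input and the loadFromDatabase() helper are assumptions):

```php
<?php
// Sketch only: fetch the configuration for many product-category-manufacturer
// combinations with a single MGET, then fall back to the database for misses.

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// One cache key per unique combination.
$keys = array();
foreach ($combinations as $c) {
    $keys[] = "config:{$c['product']}:{$c['category']}:{$c['manufacturer']}";
}

// One round trip for all 1000 keys instead of 1000 separate GETs.
$values = $redis->mGet($keys);

$configs = array();
$misses  = array();
foreach ($values as $i => $value) {
    if ($value === false) {
        $misses[] = $keys[$i];                 // not cached yet (or no data at all)
    } else {
        $configs[$keys[$i]] = json_decode($value, true);
    }
}

// Only the misses hit the database. Caching an explicit empty value for
// combinations that have no configuration avoids the repeated misses
// mentioned in the question.
foreach ($misses as $key) {
    $row = loadFromDatabase($key);             // hypothetical DB lookup
    $redis->setEx($key, 3600, json_encode($row ?: array()));
    $configs[$key] = $row ?: array();
}
```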

Caching expensive SQL query in memory or in the database?

Let me start by describing the scenario. I have an MVC 3 application with SQL Server 2008. In one of the pages we display a list of Products that is returned from the database and is UNIQUE per logged in user.
The SQL query (actually a VIEW) used to return the list of products is VERY expensive.
It is based on very complex business requirements which cannot be changed at this stage.
The database schema cannot be changed or redesigned as it is used by other applications.
There are 50k products and 5k users (each user may have access to 1 up to 50k products).
In order to display the Products page for the logged in user we use:
SELECT TOP X * FROM [VIEW] WHERE UserID = #UserId -- where 'X' is the size of the page
The query above returns a maximum of 50 rows (maximum page size). The WHERE clause restricts the number of rows to a maximum of 50k (products that the user has access to).
The page is taking about 5 to 7 seconds to load and that is exactly the time the SQL query above takes to run in SQL.
Problem:
The user goes to the Products page and very likely uses paging, re-sorts the results, goes to the details page, etc and then goes back to the list. And every time it takes 5-7s to display the results.
That is unacceptable, but at the same time the business team has accepted that the first time the Products page is loaded it can take 5-7s. Therefore, we thought about CACHING.
We now have two options to choose from, the most "obvious" one, at least to me, is using .Net Caching (in memory / in proc). (Please note that Distributed Cache is not allowed at the moment for technical constraints with our provider / hosting partner).
But I'm not very comfortable with this. We could end up with lots of products in memory (when there are 50 or 100 users logged in simultaneously) which could cause other issues on the server, like .Net constantly removing cache items to free up space while our code inserts new items.
The SECOND option:
The main problem here is that it is very EXPENSIVE to generate the User x Product x Access view, so we thought we could create a flat table (or in other words a CACHE of all products x users in the database). This table would be exactly the result of the view.
However the results can change at any time if new products are added, user permissions are changed, etc. So we would need to constantly refresh the table (which could take a few seconds) and this started to get a little bit complex.
Similarly, we thought we could implement some sort of Cache Provider and, upon request from a user, we would run the original SQL query and select the products from the view (5-7s, acceptable only once) and save that result in a flat table called ProductUserAccessCache in SQL. On the next request, we would get the values from this cached table (as we could easily identify the results cached for that particular user) with a fast query with no calculations in SQL.
Any time a product was added or a permission changed, we would truncate the cached-table and upon a new request the table would be repopulated for the requested user.
It doesn't seem too complex to me, but what we are doing here basically is creating a NEW cache "provider".
Does any one have any experience with this kind of issue?
Would it be better to use .Net Caching (in proc)?
Any suggestions?
We were facing a similar issue some time ago, and we were thinking of using EF caching in order to avoid the delay in retrieving the information. Our problem was a 1-2 second delay. Here is some info that might help on how to cache a table by extending EF. One of the drawbacks of caching is how fresh the information needs to be, so you set your cache expiration accordingly. Depending on that expiration, users might have to wait longer than they would like for fresh info, but if your users can accept that they might be seeing outdated info in order to avoid the delay, then the trade-off is worth it.
In our scenario we decided it was better to have fresh info than fast info, but as I said before, our waiting period wasn't that long.
Hope it helps.
