How to Clear Magento's Caching of its DB Schema?

I'm developing a Magento extension that introduces a new table to the DB. Whenever I release a new version of the extension that modifies the table schema, users are forced to manually click the "Flush Cache Storage" button under System > Cache Management.
I'd like my extension to clear the cache automatically upon installation. I know how to trigger the same action as the button programmatically, but I'd prefer not to, because it deletes the entire Magento cache folder and negatively impacts performance.
Does anyone know how to write code that clears the caching of my table's schema, and does so as specifically as possible, leaving unrelated cached data unharmed?
Update: I've found the file containing my table's schema cache here: /var/cache/mage-f/mage---d07_DB_PDO_MYSQL_DDL_<table_name>_1d . Now how do I target it in code? :)

This is what I've been able to come up with:
$app = Mage::app();
if ($app !== null) {
    $cache = $app->getCache();
    if ($cache !== null) {
        // 'matchingTag' is Zend_Cache::CLEANING_MODE_MATCHING_TAG:
        // remove only the entries carrying the given tag(s)
        $cache->clean('matchingTag', array('DB_PDO_MYSQL_DDL'));
    }
}
This will delete only the cache entries and metadata files that hold information about DB schemas.
Note that it will delete these entries for all tables. There's no simple way to clear a specific table's cached schema and leave the rest untouched.
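As a usage sketch, the cleanup can be run from the extension's setup/upgrade script so it happens automatically when the new version is installed. This is a minimal sketch assuming a standard Magento 1 setup script; the file path in the comment is a hypothetical example:

// Hypothetical upgrade script, e.g.
// app/code/local/My/Module/sql/mymodule_setup/mysql4-upgrade-1.0.0-1.0.1.php
$installer = $this;
$installer->startSetup();

// ... ALTER TABLE statements for the new schema version go here ...

// Flush only the DDL cache entries so the new schema is picked up
$cache = Mage::app()->getCache();
if ($cache !== null) {
    $cache->clean('matchingTag', array('DB_PDO_MYSQL_DDL'));
}

$installer->endSetup();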

Related

How to check if a database table exists in TYPO3 using doctrine?

Although the TYPO3 core takes good care of creating all tables, there might be situations where you need to check whether a table exists.
The situation at hand is an Update Wizard which interacts with another extension, where the other extension has a migration changing table names.
So: how do you check whether a table exists in current TYPO3, using Doctrine and possibly even multiple database connections?
This works at least for 10LTS and 11LTS (and, as of now, probably 12LTS too):
return GeneralUtility::makeInstance(ConnectionPool::class)
->getConnectionForTable($tablename)
->getSchemaManager()
->tablesExist([$tablename]);
This works because, if no connection is mapped for the table (e.g. because the table doesn't exist), the default connection is used and the check can be done there.
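For context, a minimal sketch of how this could be wrapped for use from an Update Wizard; the class and method names here are hypothetical:

use TYPO3\CMS\Core\Database\ConnectionPool;
use TYPO3\CMS\Core\Utility\GeneralUtility;

// Hypothetical helper an Update Wizard could call before touching
// a table owned by another extension
class TableChecker
{
    public function tableExists(string $tablename): bool
    {
        return GeneralUtility::makeInstance(ConnectionPool::class)
            ->getConnectionForTable($tablename)
            ->getSchemaManager()
            ->tablesExist([$tablename]);
    }
}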

CachedDbAuthManager Clearing Cache for particular role of user

Using Yii's auth manager, I used CachedDbAuthManager. Once the SQL for a specific role executes against a user, the result is cached, and the next time the records are fetched from the cache. Now, once an admin deletes the role for a particular user, it still remains in the cache.
What is the solution to this problem?
Have a look at Yii's cache dependency implementation.
You could, e.g., invalidate a cache entry when the admin edits an auth table; see also the database cache dependency. Often this is done just by looking at the latest modified_at time, but that column is not part of the standard auth tables.
From the database cache man page:
CDbCacheDependency represents a dependency based on the query result of a SQL statement.
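As a rough usage sketch (Yii 1.x): cache a user's roles and let the entry invalidate whenever the auth assignments change. The COUNT(*)-based dependency query, the cache key, and the example user id are assumptions to adapt to your own auth tables:

// Rebuild the cached role list whenever AuthAssignment changes
$userId = 42; // example user id
$dependency = new CDbCacheDependency('SELECT COUNT(*) FROM AuthAssignment');

$cacheKey = 'auth_roles_' . $userId;
$roles = Yii::app()->cache->get($cacheKey);
if ($roles === false) {
    // Cache miss or invalidated entry: query the auth manager and re-cache
    $roles = Yii::app()->authManager->getRoles($userId);
    Yii::app()->cache->set($cacheKey, $roles, 3600, $dependency);
}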
There is another extension, SingleDbAuthManager, which does nearly the same thing: it reads the whole auth tree at once and caches it.
The performance of SingleDbAuthManager and CachedDbAuthManager varies; CachedDbAuthManager takes less time, but fails to update its cache in my case.

Updating Solr Index when product data has changed

We are working on implementing Solr on an e-commerce site. The site is continuously updated with new data, either by updates to existing product information or by adding new products altogether.
We are using it in an ASP.NET MVC 3 application with SolrNet.
We are facing an issue with indexing. We currently commit using the following:
private static ISolrOperations<ProductSolr> solrWorker;

public void ProductIndex()
{
    // Initialize the connection instance on first use
    if (solrWorker == null)
    {
        Startup.Init<ProductSolr>("http://localhost:8983/solr/");
        solrWorker = ServiceLocator.Current.GetInstance<ISolrOperations<ProductSolr>>();
    }
    var products = GetProductIdandName();
    solrWorker.Add(products);
    solrWorker.Commit();
}
This is just a simple test application where we have inserted only the product name and id into the Solr index. Every time it runs, the new products get updated all at once and are available when we search. I think this creates a new data index in Solr every time it runs? Correct me if I'm wrong.
My Question is:
Does this recreate the Solr index data as a whole, or just update the data that is changed/new? How? Even if it only updates changed/new data, how does it know which data has changed? With a large data set, this must have some issues.
What is the alternative way to track what has changed since the last commit, and is there any way to add only those products to the Solr index that have changed?
What happens when we update an existing record in Solr? Does it delete the old data, insert the new, and recreate the whole index? Is this resource intensive?
How do big e-commerce retailers do this with millions of products?
What is the best strategy to solve this problem?
When you do an update, only that record is deleted and re-inserted; Solr does not update records in place. The other records are untouched. When you commit the data, new segments are created with the new data. On optimize, the data is merged into a single segment.
You can use an incremental build technique to add/update records after the last build. The DataImportHandler (DIH) provides this out of the box; if you are handling it manually through jobs, you can maintain a timestamp and run delta builds.
Solr does not have an update operation. It performs a delete and an add, so you have to send the complete document again, not just the updated fields. This is not resource intensive; usually only commit and optimize are.
Solr can handle any amount of data. You can use sharding if your data grows beyond the handling capacity of a single machine.
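To make the timestamp-based delta idea concrete, here is a minimal sketch (shown in PHP against Solr's plain JSON update endpoint, since the pattern is client-agnostic and translates directly to SolrNet). The database credentials, products table, updated_at column, and timestamp file are all hypothetical:

// Re-index only products changed since the last run
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$lastRun = trim(file_get_contents('last_index_run.txt')); // e.g. '2012-01-01 00:00:00'

$stmt = $pdo->prepare('SELECT id, name FROM products WHERE updated_at > ?');
$stmt->execute(array($lastRun));
$docs = $stmt->fetchAll(PDO::FETCH_ASSOC);

// POST the changed documents to Solr's JSON update handler and commit
$ch = curl_init('http://localhost:8983/solr/update/json?commit=true');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($docs));
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);

// Remember where this build stopped
file_put_contents('last_index_run.txt', date('Y-m-d H:i:s'));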

Magento index and cache. Do I need both?

I am developing an import module which updates product data. To speed up the process, I set the indexes to manual mode.
// Put every index process into manual mode before the import
$processes = Mage::getSingleton('index/indexer')->getProcessesCollection();
$processes->walk('setMode', array(Mage_Index_Model_Process::MODE_MANUAL));
$processes->walk('save');
and after the import is finished, I reindex the data and put the index mode back to automatic:
// Rebuild all indexes, then restore real-time (update on save) mode
$processes = Mage::getSingleton('index/indexer')->getProcessesCollection();
$processes->walk('reindexAll');
$processes->walk('setMode', array(Mage_Index_Model_Process::MODE_REAL_TIME));
$processes->walk('save');
But I am not sure whether I also need to clear the cache. So my question is: how are index and cache related? For example, if I clear the cache, does it also reindex all data? And on the other side, if I reindex all data, does it clear the cache? Or do I need to trigger both processes every time if the index mode is set to manual? I am not quite sure about this; I hope somebody can confirm it for sure.
Thank you
Magento's System -> Cache Management and System -> Index Management are both stand-alone features. If you rebuild an index, no matter whether through the backend or directly using reindexAll(), Magento will not automatically refresh any cache, and vice versa.
The answer to Do I need both? (caches and indexes) is: it depends.
If you're running Magento with the COLLECTION_DATA and/or EAV caches enabled, you should refresh those caches after importing and reindexing product data (see the sketch below).
The refresh is necessary because your importer has updated/inserted product data which the caches are not aware of, not because you've reindexed.
If you're running Magento with all caches disabled, you don't need both. Technically, there is no need to refresh a disabled cache. Magento would be slower, of course, but still fully functional.
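A minimal sketch of that targeted refresh, assuming Magento 1's cache type codes 'collections' (COLLECTION_DATA) and 'eav'; adjust the list to whatever caches you actually have enabled:

// Refresh only the cache types the product import affects
foreach (array('collections', 'eav') as $type) {
    Mage::app()->getCacheInstance()->cleanType($type);
}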

Save simple data in Magento's DB w/o Model

I'm looking to save some data in the Magento database without hassling with creating a new EAV object (or even a DB table, if I can avoid it). Is there any place you all know about where Magento will let you store serialized data?
If it matters, the data is a serialized set of SKUs that I need to retrieve. I know that I could create a new model, or possibly even create an attribute as a flag on each product, but those are both really overkill for my purposes.
Thanks,
Joe
First, it's possible to set up a simple, non-EAV model with Magento. You still need to do some configuration and setup, but it's much less complicated than a full-on EAV store.
Second, if you're storing information specific to users, you can throw it in a session object (a quick sketch follows). I can't recall the exact syntax right now (will update later), but search through your codebase for ::getModel followed by the phrase "session".
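A minimal sketch of the session approach, assuming Magento 1's standard core/session model ('my_skus' is a hypothetical key):

// Stash the SKU list in the visitor's session
$session = Mage::getSingleton('core/session');
$session->setData('my_skus', array('SKU-1', 'SKU-2'));

// ...later in the same session...
$skus = $session->getData('my_skus');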
Third, you still have access to all the old PHP tools you'd normally have. Writing/reading a file or a memcached space (or bringing in a third-party model library) is another option.
If you just want to run some database queries directly, you can do so with the underlying Zend Db abstractions.
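A minimal sketch of that last option, assuming a small custom table (my_flat_storage, with code and value columns, is hypothetical; you'd create it in a setup script or by hand):

// Get read/write adapters from Magento's resource singleton
$resource = Mage::getSingleton('core/resource');
$write = $resource->getConnection('core_write');
$read  = $resource->getConnection('core_read');

$table = $resource->getTableName('my_flat_storage'); // hypothetical table

// Store a serialized set of SKUs under a key
$write->insert($table, array(
    'code'  => 'featured_skus',
    'value' => serialize(array('SKU-1', 'SKU-2')),
));

// ...and read it back later
$row  = $read->fetchRow("SELECT * FROM {$table} WHERE code = ?", 'featured_skus');
$skus = unserialize($row['value']);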
