Can DBI statement handles use cached calls to execute()? - performance

I have an application where the database rarely changes, but which requires many reads from the database, and this slows performance quite significantly. Many of these reads are exactly the same, so I want DBI to cache the results of a database read.
For example,
$sth = $dbh->prepare('SELECT a, b FROM a_table WHERE c = ?');
$sth->execute(5);
$sth->execute(2);
$sth->execute(5); # this call loads the cached result set
I first thought this was what prepare_cached does, but I realised that it only caches the statement handle itself, not the actual executions of the statement handle.
I suppose I could achieve what I want by wrapping the statement execution inside a memoized sub. But I'm just seeing if there is a shortcut within DBI itself.

As you said, prepare_cached only caches the statement handle; you need to cache the results of the execution. Memoize is good, but you will probably need to invalidate the cache from time to time and re-execute the query to get a fresh copy from the database.
I'd use the Cache (http://search.cpan.org/perldoc?Cache) module. I've just copied this fragment from the introduction:
use Cache::File;

my $cache = Cache::File->new( cache_root => '/tmp/cacheroot' );
my $customer = $cache->get( $name );
unless ($customer) {
    $customer = get_customer_from_db( $name );
    $cache->set( $name, $customer, '10 minutes' );
}
return $customer;
You can use an in-memory cache instead of a file-based one. This example uses the $customer value from the cache if it exists and is still valid; otherwise it fetches a fresh value from the database and stores it in the cache with a lifetime of 10 minutes.
Hope this helps.

Related

Make a method that generates its result once (the first time) and returns the stored result on subsequent calls

I am new to CodeIgniter and have a small problem. I have a function that is called many times and runs a complex query. It should generate the result only the first time it is called, store that result in a variable, and return the stored result whenever it is called again.
Suppose I have a method like
public function add_sum() {
    $a = 2;
    $b = 3;
    return $a + $b;
}
This function will compute the result the first time it is called. When I call it again after the first time,
$a = 2;
$b = 3;
return $a + $b;
this block should not be executed; without running it, the function should still return 5. I hope you guys get my point. The calling function is:
public function test() {
    $this->add_sum();
}
Thank you...
Sessions aren't meant for storing large amounts of data; in fact, depending on your system, doing so may cause issues. Cookies are similar.
If your query is expensive but returns something small, such as the number in your example, then it is perfectly fine to store the result in a session variable. If it returns something large, like a big array, then you could consider a proper caching mechanism. CodeIgniter has a built-in one for database caching, whose methodology I don't necessarily believe in, but this is generally how you use it if you just want to cache a few queries: https://www.codeigniter.com/user_guide/database/caching.html#function-reference. Otherwise you can turn DB caching on globally in database.php (don't do this unless you thoroughly understand the consequences).
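For example, a minimal sketch of that per-query caching, assuming the cache directory has been configured in database.php (the table name and the 'welcome'/'index' URI segments passed to cache_delete are only placeholders):
// Enable query caching for just this query, then turn it back off
$this->db->cache_on();
$query = $this->db->get('some_table'); // the expensive query to cache
$this->db->cache_off();

// When the underlying data changes, delete the cache files generated for a
// given controller/method pair, or wipe all cached queries
$this->db->cache_delete('welcome', 'index');
$this->db->cache_delete_all();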
You could also consider writing/reading the results to/from a file, but you would have to benchmark it to see whether it is actually faster than just performing the SQL operation.
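A rough sketch of that file-based approach in plain PHP, assuming a writable cache directory under APPPATH (the file name and 10-minute expiry are only illustrative):
$cacheFile = APPPATH . 'cache/add_sum_result.ser';
$maxAge = 600; // seconds
if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
    // Cache hit: read the stored result back from disk
    $result = unserialize(file_get_contents($cacheFile));
} else {
    // Cache miss (or expired): run the expensive work and store the result
    $result = $this->add_sum();
    file_put_contents($cacheFile, serialize($result));
}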
Now to answer your specific question:
public function test() {
    if ($this->session->has_userdata('test_response')) {
        return $this->session->test_response;
    }
    $sum = $this->add_sum();
    $this->session->set_userdata('test_response', $sum);
    return $sum;
}
As you can see, on the first run the conditional evaluates to false, add_sum() runs, and its result is assigned to a session variable. On subsequent runs, the value is retrieved from the session immediately.
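If the underlying data can change, you will also want a way to invalidate the stored value; a minimal sketch (the refresh_sum() method name is just an illustration):
public function refresh_sum() {
    // Drop the cached value so the next call to test() recomputes it
    $this->session->unset_userdata('test_response');
    return $this->test();
}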

Improve Script performance by caching Spreadsheet values

I am trying to develop a webapp using Google Apps Script to be embedded into a Google Site which simply displays the contents of a Google Sheet and filters it using some simple parameters. For the time being, at least. I may add more features later.
I got a functional app, but found that filtering could often take a while as the client sometimes had to wait up to 5 seconds for a response from the server. I decided that this was most likely due to the fact that I was loading the spreadsheet by ID using the SpreadsheetApp class every time it was called.
I decided to cache the spreadsheet values in my doGet function using the CacheService and retrieve the data from the cache each time instead.
However, for some reason this has meant that what was a 2-dimensional array is now treated as a 1-dimensional array, so when displaying the data in an HTML table I end up with a single column in which each cell contains a single character.
This is how I have implemented the caching; as far as I can tell from the API reference I am not doing anything wrong:
function doGet() {
  CacheService.getScriptCache().put('data', SpreadsheetApp
      .openById('####')
      .getActiveSheet()
      .getDataRange()
      .getValues());
  return HtmlService
      .createTemplateFromFile('index')
      .evaluate()
      .setSandboxMode(HtmlService.SandboxMode.IFRAME);
}

function getData() {
  return CacheService.getScriptCache().get('data');
}
This is my first time developing a proper application using GAS (I have used it in Sheets before). Is there something very obvious I am missing? I didn't see any type restrictions on the CacheService reference page...
CacheService stores Strings, so objects such as your two-dimensional array will be coerced to Strings, which may not meet your needs.
Use the JSON utility to take control of the results.
myCache.put( 'tag', JSON.stringify( myObj ) );
...
var cachedObj = JSON.parse( myCache.get( 'tag' ) );
Cache entries expire. The put method without an expirationInSeconds parameter expires after 10 minutes. If you need your data to stay alive for more than 10 minutes, you need to specify expirationInSeconds, and the maximum is 6 hours. So, if you specifically do NOT want the data to expire, the cache might not be the best choice.
You can use Cache for something like controlling how long a user can be logged in.
You could also try using a global variable, which some people would tell you to never use. To declare a global variable, define the variable outside of any function.

How to save a collection for later use in Magento?

I want to write a cronjob in Magento that loads a product collection following certain parameters and saves it somewhere I can use in a cms/page.
My first approach was to use Magento's registry, but that doesn't work, i.e. a simple
Mage::register('label',$product_collection);
... doesn't work, as it seems "label" is not available in Mage::registry in my PHTML file...
Can someone point me in the right direction? Is this the correct approach? If so, how to make it work; if not, how to do it?
Thanks in advance!
Unfortunately, Mage::register will not get you where you want to go. The Mage registry keys are saved in the memory of the running PHP script, so they are scoped to the page request that is running the PHP code and therefore not shared between cron and your PHTML file.
In order to accomplish what you're looking for, you would need to cache the collection to persistent storage, such as hard-disk or Memcache. You may have to call the load() function specifically before caching, like so:
<?php
// ...
// ... Somewhere in your cron script
$product_collection = Mage::getModel('catalog/product')->getCollection()
    ->addFieldToFilter('some_field', 'some_value');
$product_collection->load(); // Magento kind of "lazy-loads" its data, so
                             // without this, you might not save yourself
                             // from executing MySQL code in the PHTML

// Serialize the data so it can be saved to cache as a string
$cacheData = serialize($product_collection);
$cacheKey = 'some sort of unique cache key';

// Save the serialized collection to the cache (defined in app/etc/local.xml)
Mage::app()->getCacheInstance()->save($cacheData, $cacheKey);
Then, in your PHTML file try:
<?php
// ...
$cacheKey = 'same unique cache key set in the cron script';

// Load the collection from cache
$product_collection = Mage::app()->getCacheInstance()->load($cacheKey);

// I'm not sure if Magento will auto-unserialize your object, so if
// the cache gives us a string, then we will do it ourselves
if (is_string($product_collection)) {
    $product_collection = unserialize($product_collection);
}
// ...
See http://www.magentocommerce.com/boards/viewthread/240836
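If the catalog data does change occasionally, it may also be worth saving the entry with a cache tag and a lifetime so the cron job's cache can expire or be cleared explicitly; a rough sketch, assuming the standard Mage_Core_Model_Cache API (the tag name and one-hour lifetime are only illustrative):
// In the cron script: tag the entry and let it expire after an hour
Mage::app()->getCacheInstance()->save($cacheData, $cacheKey, array('cms_product_collection'), 3600);

// Whenever the relevant products change, clear everything saved under that tag
Mage::app()->getCacheInstance()->clean(array('cms_product_collection'));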

Doctrine2 Caching of updated Elements

I have a problem with Doctrine. I like the caching, but if I update an entity and flush, shouldn't Doctrine2 be able to clear its cache?
Otherwise the cache is of very little use to me, since this project has a lot of interaction and I would literally always have to disable the cache for every query.
The users wouldn't see their interaction if the cache always showed them the old, cached version.
Is there a way around it?
Are you talking about saving and fetching a new Entity within the same runtime (request)? If so then you need to refresh the entity.
$entity = new Entity();
$em->persist($entity);
$em->flush();
$em->refresh($entity);
If the entity is managed and you make changes, these will be applied to Entity object but only persisted to your database when calling $em->flush().
If your cache is returning an old dataset for a fresh request (despite it being updated successfully in the DB), then it sounds like you've discovered a bug, which you can file here: http://www.doctrine-project.org/jira/secure/Dashboard.jspa
Doctrine2 no longer has those delete methods such as deleteByPrefix; they existed at some point (about three years ago) and were removed because they caused more trouble than they were worth.
The page http://docs.doctrine-project.org/projects/doctrine-orm/en/latest/reference/caching.html#deleting is outdated (the next version of the Doctrine2 documentation will have those methods removed). The only thing you can do now is manage the cache manually: find the id and delete the entry yourself after each update.
More advanced doctrine caching is WIP: https://github.com/doctrine/doctrine2/pull/580.
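A minimal sketch of that manual approach, assuming the result cache driver is taken from the entity manager's configuration and you give the query a result-cache id of your own choosing (user_list_cache below is only an illustration):
// Give the query a known result-cache id so the entry can be found later
$query = $em->createQuery('SELECT u FROM \Entities\User u');
$query->useResultCache(true, 3600, 'user_list_cache');
$users = $query->getResult();

// ... later, after updating and flushing a User, delete that entry so the
// next request reads fresh data from the database
$em->getConfiguration()->getResultCacheImpl()->delete('user_list_cache');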
This is from the Doctrine2 documentation on how to clear the cache. I'm not even sure this is what you want, but I guess it is something to try.
Doctrine2's cache drivers have different levels for deleting cached entries.
You can delete by the exact id, by a regex, by suffix, by prefix, or plainly delete all values in the cache.
So to delete all you'd do:
$deleted = $cacheDriver->deleteAll();
And to delete by prefix, you'd do:
$deleted = $cacheDriver->deleteByPrefix('users_');
I'm not sure how Doctrine2 names its cache ids, though, so you'd have to dig for that.
Information on deleting cache is found here: http://docs.doctrine-project.org/projects/doctrine-orm/en/latest/reference/caching.html#deleting
To get the cache driver, you can do the following. It wasn't described in the docs, so I just traced through the code a little.
I'm assuming you have an entity manager instance in this example:
$config = $em->getConfiguration(); //Get an instance of the configuration
$queryCacheDriver = $config->getQueryCacheImpl(); //Gets Query Cache Driver
$metadataCacheDriver = $config->getMetadataCacheImpl(); //You probably don't need this one unless the schema changed
Alternatively, I guess you could save the cache driver instance in some kind of Registry class and retrieve it that way, but that depends on your preference. Personally I try not to depend on registries too much.
Another thing you can do is tell the query you're executing not to use the result cache. Again, I don't think this is what you want, but I'm just throwing it out there. Mostly it seems you might as well turn the result cache off altogether, unless it's only a few specific queries where you don't want to use the cache.
This example is from the docs: http://docs.doctrine-project.org/projects/doctrine-orm/en/latest/reference/caching.html#result-cache
$query = $em->createQuery('select u from \Entities\User u');
$query->useResultCache(false); // Don't use the result cache for this query
$results = $query->getResult();

Codeigniter pre_system hook for DB driven dynamic controller selection - best approach?

Although I can tentatively see a solution to this, I was wondering if there may be a glaringly obvious simpler approach.
My aim is to use the first segment of a given URI to query the DB as to which controller should be run.
I assume I would have to reform the URI with the resultant controller name in segment 1, then allow the system to continue processing as normal (hence a pre_system hook).
Although not essential, I would also like to hold a couple of other variables from the same DB request to be used later in the call stack, and I assume this would have to be done using global variables?
Any better suggestions would be gladly received.
Thanks.
Should it be of use to anyone else, here is the code to achieve the desired result. It does not, however, take care of passing the additional variables along, because I can live without them.
function set_controller()
{
    include_once APPPATH.'config/database.php'; // Gather the DB connection settings

    // Connect to the DB server and select the DB
    $link = mysql_connect($db[$active_group]['hostname'], $db[$active_group]['username'], $db[$active_group]['password']) or die('Could not connect to server.');
    mysql_select_db($db[$active_group]['database'], $link) or die('Could not select database.');

    $URI = explode('/', key($_GET)); // Break apart the URL variable
    $segment = mysql_real_escape_string($URI[1], $link); // Escape the segment before using it in the query
    $query = 'SELECT * FROM theDomainTable WHERE domainName = "'.$segment.'"'; // Query the DB with the URI segment

    if ($results = mysql_fetch_array(mysql_query($query, $link))) { // Only deal with controller requests that exist in the database
        $URI[1] = $results['controllerName']; // Replace the controller segment
        $_GET = array(implode('/', $URI) => NULL); // Reconstruct and replace the GET variable
    }

    mysql_close($link); // Close the DB link
}
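For completeness, the hook itself is wired up in application/config/hooks.php along these lines (the file name and path are assumptions about where the function above was saved, and hooks must be enabled via $config['enable_hooks'] in config.php):
// application/config/hooks.php
$hook['pre_system'] = array(
    'class'    => '',
    'function' => 'set_controller',
    'filename' => 'set_controller.php',
    'filepath' => 'hooks'
);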
I wouldn't use global variables; I'd prefer to store the data in a library for retrieval later if possible. Global variables are kind of messy in the context of CI.
That said, at pre_system only the benchmark and hooks classes have been loaded, which means you're pretty much stuck with global variables unless you can find a way to select the controller at pre_controller instead, where all the base classes are loaded and you can put the data somewhere more logical.
