I am trying to get a cache working in my plugin.
In ext_localconf.php:
if (!is_array($GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension'])) {
    $GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension'] = [];
}
if (!isset($GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['frontend'])) {
    $GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['frontend'] = 'TYPO3\\CMS\\Core\\Cache\\Frontend\\StringFrontend';
}
if (!isset($GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['options'])) {
    $GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['options'] = ['defaultLifetime' => 0];
}
if (!isset($GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['groups'])) {
    $GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['groups'] = ['pages'];
}
In my controller action:
$cacheIdentifier = 'topic' . $topic->getUid();
$cache = GeneralUtility::makeInstance(CacheManager::class)->getCache('myextension');
$result = unserialize($cache->get($cacheIdentifier));
if ($result !== false) {
    DebuggerUtility::var_dump($result);
} else {
    $result = $this->postRepository->findByTopic($topic->getUid(), $page, $itemsPerPage);
    $cache->set($cacheIdentifier, serialize($result), ['tag1', 'tag2']);
    DebuggerUtility::var_dump($result);
}
The first time the page with the action gets loaded, all is OK and the entry has been made in the database (cf_myextension and cf_myextension_tags). But the second time, when the cache entry is loaded, I get an error. Even DebuggerUtility::var_dump($result); does not work:
Call to a member function map() on null
in ../typo3/sysext/extbase/Classes/Persistence/Generic/QueryResult.php line 96

protected function initialize()
{
    if (!is_array($this->queryResult)) {
        $this->queryResult = $this->dataMapper->map($this->query->getType(), $this->persistenceManager->getObjectDataByQuery($this->query));
    }
}
A normal var_dump works and spits out the cache entry. What is the problem? Am I forgetting something? Can a QueryResult, together with some other variables, not be stored as an array in the cache? I also tried the VariableFrontend cache, which produced the same error.
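For illustration, here is a sketch of caching plain data instead of the QueryResult itself, which sidesteps the null dataMapper after unserialize() (assuming findByTopic() returns an Extbase QueryResult; toArray() materializes it into plain domain objects, which must themselves be serializable):

$cacheIdentifier = 'topic' . $topic->getUid();
$cache = GeneralUtility::makeInstance(CacheManager::class)->getCache('myextension');

$raw = $cache->get($cacheIdentifier);
if ($raw !== false) {
    $result = unserialize($raw);
} else {
    // toArray() detaches the rows from the query machinery, so the cached
    // value no longer references the dataMapper or the query object.
    $result = $this->postRepository
        ->findByTopic($topic->getUid(), $page, $itemsPerPage)
        ->toArray();
    $cache->set($cacheIdentifier, serialize($result), ['tag1', 'tag2']);
}
DebuggerUtility::var_dump($result);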
If caching is turned off, everything works correctly. If caching is enabled, the plugin works correctly at first, but after 5 minutes it stops processing the tag. It doesn't matter whether the caching is normal or progressive. When you delete the cache, processing works again, and again after 5 minutes it stops.
Here is the complete plugin code. What could be the reason?
The tag inserted into an article looks like this: {robokassa 5}
class plgContentRobokassa extends JPlugin
{
    public $cont = '';

    public function onContentPrepare($context, &$row, &$params, $page = 0)
    {
        $doc = JFactory::getDocument();
        $doc->addStyleSheet(JURI::root(true) . '/plugins/content/robokassa/css/robokassa.css');
        $this->cont = $context;
    }

    public function onAfterRender()
    {
        $is_test   = '0';
        $mrh_pass1 = '*****';
        $mrh_login = '******';
        $app = JFactory::getApplication();
        if ($app->getName() != 'site') {
            return true;
        }
        // Get the code word from the parameters
        $varname = 'robokassa';
        // Get the site body
        $html = $app->getBody();
        // If there are no tags, nothing to do
        if (strpos($html, $varname) === false)
        {
            return true;
        }
        $bodyPos = stripos($html, '<body');
        $preContent = '';
        if ($bodyPos !== false)
        {
            $preContent = substr($html, 0, $bodyPos);
            $html = substr($html, $bodyPos);
        }
        // Define the search pattern
        $pattern = '#\{' . $varname . ' ([0-9]+)\}#i';
        // Collect all matched tags into an array
        if (preg_match_all($pattern, $html, $matches))
        {
            $db = JFactory::getDbo();
            $query = $db->getQuery(true);
            foreach ($matches[0] as $i => $match)
            {
                // *replace code here*
            }
            // ($script_alert is presumably built inside the elided code above)
            $html = $preContent . $html . $script_alert;
            // Put everything back into the body
            $app->setBody($html);
        }
    }
}
The plugin events are invoked in the order the plugins are listed in the Plugin administration.
Cache is one of the System plugins, so when its cache is valid the response is retrieved from the cache, and the plugins which come before it are not executed.
You may be able to solve this by moving your plugin after Cache (or last).
If the plugin is of a different kind, i.e. Content, there is no way to achieve this by reordering.
You can change it into a System plugin (all events from any plugin category are also available to System plugins).
As another alternative, you could put your code in a module and prevent caching in the module; but this will not work with the page cache.
Finally, and I don't know why I didn't write it first: make an Ajax call to a page which you will NOT cache (by excluding it in the page cache plugin configuration); that way the page will be cached and each time it will retrieve the current data.
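For example, the System-plugin variant might look like this (a sketch only; the class and file names are illustrative, and the body rewrite itself stays exactly as in the Content plugin above):

<?php
// Sketch of the System-plugin option. System plugins receive onAfterRender
// too, so the {robokassa N} replacement can be moved here unchanged;
// ordering it after Cache is done in the Plugin administration.
defined('_JEXEC') or die;

class plgSystemRobokassa extends JPlugin
{
    public function onAfterRender()
    {
        $app = JFactory::getApplication();
        if ($app->getName() != 'site') {
            return true;
        }

        $html = $app->getBody();

        // ... same {robokassa N} search-and-replace as in the Content plugin ...

        $app->setBody($html);
        return true;
    }
}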
I am facing a problem. I have installed Vodes 1.5 on a website. In the admin panel, when I click on the save button I get the error:
Fatal error: Call to a member function loadByOption() on a non-object in /var/www/Joomla/administrator/components/com_vodes/models/config.php on line 58
This is the code we are using:
function save()
{
    // initialize variables.
    $table  = JTable::getInstance('component');
    $params = JRequest::getVar('params', array(), 'post', 'array');
    $row = array();
    $row['option'] = 'com_vodes';
    $row['params'] = $params;
    // load the component data for com_vodes
    if (!$table->loadByOption('com_vodes')) {
        $this->setError($table->getError());
        return false;
    }
    // bind the new values
    $table->bind($row);
    // check the row.
    if (!$table->check()) {
        $this->setError($table->getError());
        return false;
    }
    // store the row.
    if (!$table->store()) {
        $this->setError($table->getError());
        return false;
    }
    return true;
}
Please help me sort it out.
Just had a look at the developer site for Vodes. The extension is only compatible with Joomla 1.0 and 1.5, with a possibility of running on Joomla 2.5, but most definitely not Joomla 3.x.
There are extreme coding differences between Joomla 1.5 and 3.x, therefore extensions will throw errors all over the place or, worse, not work at all.
You will have to find an alternative to Vodes or make the whole extension Joomla 3.x compatible, which, I can assure you, will not be a 5 minute job.
After integrating the Fine Uploader script into a new CodeIgniter installation, I have the following problem: in IE 9, $this->input->is_ajax_request() returns false, so the "if (!$this->input->is_ajax_request())" check kills the request.
class Welcome extends CI_Controller {

    public function index()
    {
        $this->load->view('upload');
    }

    public function upload()
    {
        error_reporting(E_ALL | E_STRICT);
        if (!$this->input->is_ajax_request())
        {
            die('No direct script access allowed');
        }
        $this->load->helper("qqFileUploader.class");
        $uploader = new qqFileUploader('uploads');
        // Specify the list of valid extensions, ex. array("jpeg", "xml", "bmp")
        $uploader->allowedExtensions = array();
        // Specify max file size in bytes.
        $uploader->sizeLimit = 900 * 1024 * 1024; // 900 Megabytes
        // Specify the input name set in the javascript.
        $uploader->inputName = 'qqfile';
        $uploader->prefix = 'test_';
        // If you want to use the resume feature, specify the folder to save parts.
        $uploader->chunksFolder = 'chunks';
        $result = $uploader->handleUpload('uploads');
        $result['uploadName'] = $uploader->getUploadName();
        header("Content-Type: text/plain");
        echo json_encode($result);
    }
}
Why?
This is completely expected. Fine Uploader does not use XHR to upload files in IE 9 and older (since this is not possible). Instead, we build a form and submit it, targeting an iframe.
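If you still want some guard on the endpoint, one option is to relax the check (a sketch, assuming the iframe POST carries the file field configured as qqfile above, just without the X-Requested-With header that is_ajax_request() looks for):

// Accept either a real XHR or Fine Uploader's iframe form POST (IE 9 and
// older), which arrives as a regular multipart request with the "qqfile"
// field but no X-Requested-With header.
if (!$this->input->is_ajax_request() && !isset($_FILES['qqfile']))
{
    die('No direct script access allowed');
}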
I am trying to get the new PDO driver running in CodeIgniter 2.1.1 in (to start with) the local (Mac OS 10.7) copy of my app.
I initially coded it using Active Record for all db operations, and I am now thinking I want to use PDO prepared statements in my model files, going forward.
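For context, the style of query I want in the models going forward looks like this; a plain-PDO sketch with made-up table and column names:

// Plain PDO prepared statement, the style I want in my models.
// The DSN, credentials, and the "posts" table are placeholders.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=mydbname', 'root', '');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare('SELECT id, title FROM posts WHERE status = :status');
$stmt->execute(array(':status' => 'published'));
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);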
I modified 'application/config/database.php' like so:
(note a couple of minor embedded questions)
[snip]
$active_group = 'local_dev';
$active_record = TRUE;//<---BTW, will this need to stay TRUE to make CI sessions work? For better security, don't we want db-based CI sessions to use PDO too?
//http://codeigniter.com/user_guide/database/configuration.html:
//Note: that some CodeIgniter classes such as Sessions require Active Records be enabled to access certain functionality.
//this is the config setting that I am guessing (?) is my main problem:
$db['local_dev']['hostname'] = 'localhost:/tmp/mysql.sock';
// 1.) if $db['local_dev']['dbdriver']='mysql', then here ^^^ 'localhost:/tmp/mysql.sock' works, 2.) but if $db['local_dev']['dbdriver']='pdo', then it fails with error msg. shown below.
$db['local_dev']['username'] = 'root';
$db['local_dev']['password'] = '';
$db['local_dev']['database'] = 'mydbname';
$db['local_dev']['dbdriver'] = 'pdo';
$db['local_dev']['dbprefix'] = '';
$db['local_dev']['pconnect'] = TRUE;
$db['local_dev']['db_debug'] = TRUE;//TRUE
$db['local_dev']['cache_on'] = FALSE;
$db['local_dev']['cachedir'] = '';
$db['local_dev']['char_set'] = 'utf8';
$db['local_dev']['dbcollat'] = 'utf8_general_ci';
$db['local_dev']['swap_pre'] = '';
$db['local_dev']['autoinit'] = TRUE;
$db['local_dev']['stricton'] = FALSE;
[snip]
With the above config., as soon as I load a controller, I get this error message:
Fatal error: Uncaught exception 'PDOException' with message 'could not find driver' in
/Library/WebServer/Documents/system/database/drivers/pdo/pdo_driver.php:114
Stack trace:
#0 /Library/WebServer/Documents/system/database/drivers/pdo/pdo_driver.php(114): PDO->__construct('localhost:/tmp/...', 'root', '', Array)
#1 /Library/WebServer/Documents/system/database/DB_driver.php(115): CI_DB_pdo_driver->db_pconnect()
#2 /Library/WebServer/Documents/system/database/DB.php(148): CI_DB_driver->initialize()
#3 /Library/WebServer/Documents/system/core/Loader.php(346): DB('', NULL)
#4 /Library/WebServer/Documents/system/core/Loader.php(1171): CI_Loader->database()
#5 /Library/WebServer/Documents/system/core/Loader.php(152): CI_Loader->_ci_autoloader()
#6 /Library/WebServer/Documents/system/core/Con in /Library/WebServer/Documents/system/database/drivers/pdo/pdo_driver.php on line 114
I tried swapping out the 'pdo_driver.php' file for the one on GitHub, as per this:
http://codeigniter.com/forums/viewthread/206124/
...but that just generates other errors, not to mention it is unsettling for a newbie who would rather not touch the system files at all.
This thread also seems to imply the need to be hacking the 'pdo_driver.php' system file:
CodeIgniter PDO database driver not working
It seems odd to me, though, that a hack to a system file would be needed to make PDO work in CI v2.1.1.
Thanks for any suggestions I can try.
I don't know if this might be helpful for you since you already started using the CI functions, but I made my own library for PDO with SQLite and just autoload it. My needs were simple, so it serves its purpose.
<?php if ( ! defined('BASEPATH')) exit('No direct script access allowed');
/**
 * CodeIgniter PDO Library
 *
 * @author Michael Cruz
 * @version 1.0
 */
class Sqlite_pdo
{
    var $DB;

    public function connect($path) {
        try {
            $this->DB = new PDO('sqlite:' . $path);
        }
        catch (PDOException $e) {
            print "Error: " . $e->getMessage();
            die();
        }
    }

    public function simple_query($SQL) {
        $results = $this->DB->query($SQL)
            or die('SQL Error: ' . print_r($this->DB->errorInfo(), true));
        return $results;
    }

    public function prepared_query($SQL, $bind = array()) {
        $q = $this->DB->prepare($SQL)
            or die('Prepare Error: ' . print_r($this->DB->errorInfo(), true));
        $q->execute($bind)
            or die('Execute Error: ' . print_r($this->DB->errorInfo(), true));
        $q->setFetchMode(PDO::FETCH_BOTH);
        return $q;
    }

    public function my_prepare($SQL) {
        $q = $this->DB->prepare($SQL)
            or die('Error: ' . print_r($this->DB->errorInfo(), true));
        return $q;
    }

    public function my_execute($q, $bind) {
        $q->execute($bind)
            or die('Error: ' . print_r($this->DB->errorInfo(), true));
        $q->setFetchMode(PDO::FETCH_BOTH);
        return $q;
    }

    public function last_insert_id() {
        return $this->DB->lastInsertId();
    }
}
/* End of file Sqlite_pdo.php */
Thanks to the noob thread http://codeigniter.com/forums/viewthread/180277/ (InsiteFX's answer), I figured out that the below seems to work (I need to test more to be 100% sure, but at least the error messages are gone):
$db['local_dev']['hostname'] = 'mysql:host=127.0.0.1';
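So for anyone else hitting this, the relevant lines end up roughly as below (the unix_socket variant is standard PDO DSN syntax; whether dbname also has to go into the DSN may depend on the CI version, so treat this as a sketch):

$db['local_dev']['dbdriver'] = 'pdo';
$db['local_dev']['hostname'] = 'mysql:host=127.0.0.1';
// or, over the local socket instead of TCP (path is machine-specific):
// $db['local_dev']['hostname'] = 'mysql:unix_socket=/tmp/mysql.sock';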
EDIT
I tried debugging this with Xdebug and NetBeans. It's weird that the exports will work during the debug session if I put in some breakpoints. However, with no breakpoints (a more realistic environment), the exports don't work.
I've tried adding sleeps into some parts of the code.
I think that maybe PHP is ending before the Redis commit is completed. Maybe the Redis connections are being done asynchronously, but I checked Predis and the default is a synchronous connection.
I am working on a reporting tool.
Here is the basic issue.
We store a report in the session object, but on later requests, when we try to get to the report in the session object, it's gone.
Here is a more detailed version.
I store a 'report' object into the session like so
$_SESSION['report_name_unixtimestamp'] = gzcompress( serialize( $reportObject ) );
The user sees the report in some table form and then if they want they can export it. The report could change so the idea behind storing it in the session like this is that when the user exports it to PDF, Excel, etc, they'll be getting a report identical to the one they are viewing.
The user clicks on an export button and on the PHP side it will go into the session, fetch the report via the key provided as a get parameter (uncompresses and unserializes it), create the export and send it to the user for download.
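Roughly, the export request does the reverse of the store above (the key name and GET parameter here are illustrative):

// Export side: fetch the exact report the user was viewing, using the
// session key passed as a GET parameter (names here are illustrative).
$key = $_GET['report_key']; // e.g. 'report_name_unixtimestamp'
$reportObject = unserialize(gzuncompress($_SESSION[$key]));
// ... render $reportObject to PDF/Excel and stream it to the user ...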
This has worked well up until the point that we tried to introduce the Redis caching server as a tool for better session management.
What happens now is the following:
The first time we run the report it will get stored into the cache and the export will work successfully.
We will run the report again, with the same user account in the same session. This changes the unix timestamp, so there should be two entries in $_SESSION ($_SESSION['report_name_oldertimestamp'] and $_SESSION['report_name_newertimestamp']). When we click on the export button again, we get an error saying that the file doesn't exist (because it hasn't been sent by the server).
If we check the Redis server for the newer version of the report, it isn't there, but the one with the old timestamp is still there.
Now, this worked with the file-based session management but not with Redis. We've tried the redis module for PHP as well as the pure PHP client Predis.
Does anyone have any ideas?
Here are a few more details:
Redis has NOT run out of memory. We've checked this many times.
We already know that to unserialize the report object in the session, the report class has to be included already. (Remember, the first export works fine but anything after that fails.)
If we check the PHP session object during the request that the report is running on, it WILL contain the newer report, but it never makes it to Redis.
Below is the save handler that is being used with Predis.
redis_session_init is the function I call right before session_start() so that the handler gets registered. I'm not sure how the redis_session_write function works, though, so maybe someone can help me with that.
<?php
namespace RedisSession
{
    $redisTargetPrefix = "PHPREDIS_SESSION:";
    $unpackItems = array();
    $redisServer = "tcp://cache.emcweb.com";

    function redis_session_init( $unpack = null, $server = null, $prefix = null )
    {
        global $unpackItems, $redisServer, $redisTargetPrefix;
        if( $unpack !== null )
        {
            $unpackItems = $unpack;
        }
        if( $server !== null )
        {
            $redisServer = $server;
        }
        if( $prefix !== null )
        {
            $redisTargetPrefix = $prefix;
        }
        session_set_save_handler( 'RedisSession\redis_session_open', 'RedisSession\redis_session_close', 'RedisSession\redis_session_read', 'RedisSession\redis_session_write', 'RedisSession\redis_session_destroy', 'RedisSession\redis_session_gc' );
    }

    function redis_session_read( $id )
    {
        global $redisServer, $redisTargetPrefix;
        $redisConnection = new \Predis\Client( $redisServer );
        return base64_decode( $redisConnection->get( $redisTargetPrefix . $id ) );
    }

    function redis_session_write( $id, $data )
    {
        global $unpackItems, $redisServer, $redisTargetPrefix;
        $redisConnection = new \Predis\Client( $redisServer );
        $ttl = ini_get( "session.gc_maxlifetime" );
        $redisConnection->pipeline( function ($r) use (&$id, &$data, &$redisTargetPrefix, &$ttl, &$unpackItems)
        {
            // Store the serialized session data under the prefixed session id
            $r->setex( $redisTargetPrefix . $id, $ttl, base64_encode( $data ) );
            // Mirror selected session entries into their own keys
            foreach( $unpackItems as $item )
            {
                $keyname = $redisTargetPrefix . $id . ":" . $item;
                if( isset( $_SESSION[ $item ] ) )
                {
                    $r->setex( $keyname, $ttl, $_SESSION[ $item ] );
                }
                else
                {
                    $r->del( $keyname );
                }
            }
        } );
        return true;
    }

    function redis_session_destroy( $id )
    {
        global $redisServer, $redisTargetPrefix;
        $redisConnection = new \Predis\Client( $redisServer );
        $redisConnection->del( $redisTargetPrefix . $id );
        $unpacked = $redisConnection->keys( $redisTargetPrefix . $id . ":*" );
        foreach( $unpacked as $unp )
        {
            $redisConnection->del( $unp );
        }
        return true;
    }

    // These functions are all noops for various reasons... opening has no practical meaning in
    // terms of non-shared Redis connections, the same for closing. Garbage collection is handled
    // by Redis (key TTLs) anyway. They return true so session_start() treats them as successful.
    function redis_session_open( $path, $name )
    {
        return true;
    }

    function redis_session_close()
    {
        return true;
    }

    function redis_session_gc( $age )
    {
        return true;
    }
}
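For completeness, the handler gets wired up like this, matching the description above:

// Register the Redis save handler, then start the session, as described above.
RedisSession\redis_session_init();
session_start();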
The issue was solved, and it was much dumber than I thought.
The save handler doesn't implement locking in any way. On the report pages there are multiple requests being made to the server via AJAX and the like. One of the AJAX requests starts before the report gets saved to session space; it reads the session at the start and then writes the session back at the end.
Since the report executes faster each time, the new report would get cached to the session in Redis but would then be overwritten by the other script, which had an older version of the session.
I had help from one of my co-workers. Ugh! This was a headache I'm glad to be over.
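For anyone hitting the same thing, here is a minimal sketch of the kind of per-session locking that would have prevented it (illustrative only: the key names, timeout, and retry loop are assumptions, not our production code, and SET with EX/NX needs a reasonably recent Redis server). The idea is to take the lock before the session read and release it after the write, so concurrent AJAX requests cannot interleave their read/write cycles:

<?php
// Per-session lock sketch using Predis. SET ... EX ... NX is atomic, so only
// one request can hold the lock; the others poll until it is released or
// they give up at the deadline.
function redis_session_lock( \Predis\Client $redis, $id, $timeoutSec = 30 )
{
    $lockKey = "PHPREDIS_SESSION_LOCK:" . $id;
    $deadline = microtime( true ) + $timeoutSec;
    while( microtime( true ) < $deadline )
    {
        if( $redis->set( $lockKey, getmypid(), 'EX', $timeoutSec, 'NX' ) )
        {
            return true; // lock acquired
        }
        usleep( 100000 ); // wait 100 ms and retry
    }
    return false; // caller decides how to handle a lock timeout
}

function redis_session_unlock( \Predis\Client $redis, $id )
{
    $redis->del( "PHPREDIS_SESSION_LOCK:" . $id );
}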