PHP Redis Session not saving - debugging

EDIT
I tried debugging this with Xdebug and NetBeans. Oddly, the exports work during a debug session if I set some breakpoints. However, with no breakpoints, which is a more realistic environment, the exports don't work.
I've tried adding sleeps into some parts of the code.
I think that maybe PHP is ending before the Redis commit completes. Perhaps the Redis connections are being made asynchronously, but I checked Predis and the default is a synchronous connection.
I am working on a reporting tool.
Here is the basic issue.
We store a report in the session object, but on later requests, when we try to get to the report in the session object, it's gone.
Here is a more detailed version.
I store a 'report' object into the session like so:
$_SESSION['report_name_unixtimestamp'] = gzcompress( serialize( $reportObject ) );
The user sees the report in some table form, and then, if they want, they can export it. The report could change, so the idea behind storing it in the session like this is that when the user exports it to PDF, Excel, etc., they'll get a report identical to the one they are viewing.
The user clicks on an export button, and on the PHP side the code goes into the session, fetches the report via the key provided as a GET parameter (uncompressing and unserializing it), creates the export, and sends it to the user for download.
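A sketch of that export-side lookup (the GET parameter name here is illustrative, not from the actual code):

// Fetch the stored report back out of the session using the key from the query string.
$key = $_GET['report_key']; // e.g. 'report_name_1357924680'
if( isset( $_SESSION[ $key ] ) )
{
    $reportObject = unserialize( gzuncompress( $_SESSION[ $key ] ) );
    // ... build the PDF/Excel/etc. export from $reportObject and send it for download ...
}
else
{
    // The report is missing from the session -- this is the failure described below.
}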
This has worked well up until the point that we tried to introduce the Redis caching server as a tool for better session management.
What happens now is the following:
The first time we run the report, it gets stored in the cache and the export works successfully.
We run the report again, with the same user account in the same session. This changes the unixtimestamp, so there should be two entries in $_SESSION ( $_SESSION['report_name_oldertimestamp'] and $_SESSION['report_name_newertimestamp'] ). When we click on the export button again, we get an error saying that the file doesn't exist (because it hasn't been sent by the server).
If we check the Redis server for the newer version of the report, it isn't there, but the entry with the older timestamp is still there.
Now, this worked with file-based session management but not with Redis. We've tried the Redis extension for PHP as well as the pure PHP client Predis.
Does anyone have any ideas?
Here are a few more details:
Redis has NOT run out of memory. We've checked this many times.
We already know that to unserialize the report object from the session, the report class has to be included already. (Remember, the first export works fine but anything after that fails.)
If we check the PHP session object during the request that runs the report, it WILL contain the newer report, but the report never makes it to Redis.
Below is the save handler that is being used with Predis.
redis_session_init() is the function I call right before session_start() so that the handler gets registered. I'm not sure how the redis_session_write() function behaves, though, so maybe someone can help me with that.
<?php
namespace RedisSession
{
    $redisTargetPrefix = "PHPREDIS_SESSION:";
    $unpackItems = array();
    $redisServer = "tcp://cache.emcweb.com";

    function redis_session_init( $unpack = null, $server = null, $prefix = null )
    {
        global $unpackItems, $redisServer, $redisTargetPrefix;

        if( $unpack !== null )
        {
            $unpackItems = $unpack;
        }
        if( $server !== null )
        {
            $redisServer = $server;
        }
        if( $prefix !== null )
        {
            $redisTargetPrefix = $prefix;
        }

        session_set_save_handler( 'RedisSession\redis_session_open', 'RedisSession\redis_session_close', 'RedisSession\redis_session_read', 'RedisSession\redis_session_write', 'RedisSession\redis_session_destroy', 'RedisSession\redis_session_gc' );
    }

    function redis_session_read( $id )
    {
        global $redisServer, $redisTargetPrefix;

        $redisConnection = new \Predis\Client( $redisServer );

        // The read handler must return a string (empty string for a new session).
        return base64_decode( $redisConnection->get( $redisTargetPrefix . $id ) );
    }

    function redis_session_write( $id, $data )
    {
        global $unpackItems, $redisServer, $redisTargetPrefix;

        $redisConnection = new \Predis\Client( $redisServer );
        $ttl = ini_get( "session.gc_maxlifetime" );

        $redisConnection->pipeline( function ($r) use (&$id, &$data, &$redisTargetPrefix, &$ttl, &$unpackItems)
        {
            // Store the serialized session data with the configured lifetime.
            $r->setex( $redisTargetPrefix . $id, $ttl, base64_encode( $data ) );

            // Mirror selected session keys into their own Redis entries.
            foreach( $unpackItems as $item )
            {
                $keyname = $redisTargetPrefix . $id . ":" . $item;

                if( isset( $_SESSION[ $item ] ) )
                {
                    $r->setex( $keyname, $ttl, $_SESSION[ $item ] );
                }
                else
                {
                    $r->del( $keyname );
                }
            }
        } );

        // The write handler must return true on success.
        return true;
    }

    function redis_session_destroy( $id )
    {
        global $redisServer, $redisTargetPrefix;

        $redisConnection = new \Predis\Client( $redisServer );
        $redisConnection->del( $redisTargetPrefix . $id );

        // Remove any mirrored keys as well.
        $unpacked = $redisConnection->keys( $redisTargetPrefix . $id . ":*" );
        foreach( $unpacked as $unp )
        {
            $redisConnection->del( $unp );
        }

        return true;
    }

    // These functions are all no-ops for various reasons... opening has no practical meaning in
    // terms of non-shared Redis connections, and the same goes for closing. Garbage collection is
    // handled by Redis itself via the TTLs set above. Each still returns true, since PHP treats
    // anything else as a handler failure.
    function redis_session_open( $path, $name )
    {
        return true;
    }

    function redis_session_close()
    {
        return true;
    }

    function redis_session_gc( $age )
    {
        return true;
    }
}
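For reference, wiring it up looks like this, per the description above (a minimal sketch using the handler's defaults):

// Register the save handler, then start the session (server URL as in the defaults above).
RedisSession\redis_session_init( null, 'tcp://cache.emcweb.com' );
session_start();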

The issue was solved, and it was much dumber than I thought.
The save handler doesn't implement locking in any way. On the report pages, multiple requests are made to the server via AJAX and the like. One of the AJAX requests starts before the report gets saved to session space; it reads the session at the start and writes it back at the end.
Since the report executes faster each time, the report would get cached to the session in Redis but would then be overwritten by the other script, which was holding an older version of the session.
I had help from one of my co-workers. Ugh! This was a headache I'm glad to be over.
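For anyone hitting the same race condition: the file save handler avoids this because PHP locks the session file for the whole request, serializing concurrent requests. A Redis-backed handler has to implement its own locking. Below is a minimal sketch of the idea, using a Predis lock key acquired with SET NX EX before the session is read and released after it is written; the key name, timeout, and polling interval are illustrative, not from the original code.

// Acquire a per-session lock before reading the session; SET with NX and EX
// is atomic, so only one request at a time "owns" the session.
function redis_session_lock( \Predis\Client $redis, $id, $timeout = 30 )
{
    $lockKey = "PHPREDIS_SESSION_LOCK:" . $id;
    for( $i = 0; $i < $timeout * 10; $i++ )
    {
        if( $redis->set( $lockKey, 1, 'EX', $timeout, 'NX' ) )
        {
            return true;
        }
        usleep( 100000 ); // lock is held by another request; wait 100 ms and retry
    }
    return false; // could not acquire the lock within the timeout
}

// Release the lock after the session has been written back.
function redis_session_unlock( \Predis\Client $redis, $id )
{
    $redis->del( "PHPREDIS_SESSION_LOCK:" . $id );
}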

Related

TYPO3 9.5 Extbase plugin cache implementation

I am trying to get a cache working in my plugin.
In ext_localconf.php:
if (!is_array($GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension'])) {
    $GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension'] = [];
}
if (!isset($GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['frontend'])) {
    $GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['frontend'] = 'TYPO3\\CMS\\Core\\Cache\\Frontend\\StringFrontend';
}
if (!isset($GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['options'])) {
    $GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['options'] = ['defaultLifetime' => 0];
}
if (!isset($GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['groups'])) {
    $GLOBALS['TYPO3_CONF_VARS']['SYS']['caching']['cacheConfigurations']['myextension']['groups'] = ['pages'];
}
In my controller action:
$cacheIdentifier = 'topic' . $topic->getUid();
$cache = GeneralUtility::makeInstance(CacheManager::class)->getCache('myextension');
$result = unserialize($cache->get($cacheIdentifier));
if ($result !== false) {
    DebuggerUtility::var_dump($result);
} else {
    $result = $this->postRepository->findByTopic($topic->getUid(), $page, $itemsPerPage);
    $cache->set($cacheIdentifier, serialize($result), ['tag1', 'tag2']);
    DebuggerUtility::var_dump($result);
}
The first time the page with the action gets loaded, all is OK and the entry is made in the database (cf_myextension and cf_myextension_tags). But the second time, the cache gets loaded and I get an error. Even DebuggerUtility::var_dump($result); does not work:
Call to a member function map() on null
in ../typo3/sysext/extbase/Classes/Persistence/Generic/QueryResult.php line 96

The relevant excerpt from QueryResult.php:

protected function initialize()
{
    if (!is_array($this->queryResult)) {
        $this->queryResult = $this->dataMapper->map($this->query->getType(), $this->persistenceManager->getObjectDataByQuery($this->query));
    }
}
A normal var_dump works and spits out the cache entry. What is the problem? Am I forgetting something? Can't a QueryResult, together with some other variables, be stored as an array in the cache? I also tried the VariableFrontend cache, which produced the same error.
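The trace above suggests what is going on: after unserialize(), the QueryResult's injected DataMapper is null, so the lazily initialized result can no longer map its rows. A common workaround (my assumption, not confirmed in this thread) is to cache the materialized objects instead of the lazy QueryResult, e.g. via toArray():

$cacheIdentifier = 'topic' . $topic->getUid();
$cache = GeneralUtility::makeInstance(CacheManager::class)->getCache('myextension');

$cached = $cache->get($cacheIdentifier);
if ($cached !== false) {
    $result = unserialize($cached);
} else {
    $queryResult = $this->postRepository->findByTopic($topic->getUid(), $page, $itemsPerPage);
    $result = $queryResult->toArray(); // forces initialize() while the DataMapper is still present
    $cache->set($cacheIdentifier, serialize($result), ['tag1', 'tag2']);
}
DebuggerUtility::var_dump($result);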

Laravel Collective SSH results

I am performing SSH in Laravel, whereby I connect to another server and download a file. I am using Laravel Collective: https://laravelcollective.com/docs/5.4/ssh
So, the suggested way to do this is something like this:
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path);

if($result) {
    return $path;
} else {
    return 401;
}
Now, that successfully downloads the file and moves it to my local server. However, I am always returned 401, because $result seems to be null.
I can't find much about getting the result back from SSH. I have also tried:
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path, function($line){
    dd( $line.PHP_EOL );
});
But that never gets into the inner function.
Is there any way I can get the result back from the SSH? I just want to handle it properly if there is an error.
Thanks
Rather than rely on $result to give you true / false / error, you can check if the file was downloaded successfully in another way:
// download the file
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path);

// see if the downloaded file exists
if ( file_exists($path) ) {
    return $path;
} else {
    return 401;
}
You need to pass the file name too, like this, in the get and put methods:
$fileName = "example.txt";
$get = \SSH::into('scripts')->get('/remote/somelocation/'.$fileName, base_path($fileName));
To upload, use the put method:
$put = \SSH::into('scripts')->put(base_path($fileName), '/remote/location/'.$fileName);
And to list a directory:
$command = \SSH::into('scripts')->run(['ls -lsa'], function($output) {
    dd($output);
});

Laravel session suddenly not being saved

Yesterday it was working fine, but somehow today it's not saving the array to the session.
if(Session::get('header')) {
    $array['header'] = Session::get('header');
} else {
    $header = $this->loadElements(2, false);
    $array['header'] = $header;
    Session::put('header', $header);
}
When I look in the storage\framework\sessions folder, the header is in the latest session file.
But what looks very weird to me is that, with the browser window still open, a new session file is created on each page reload.
I did some further testing, and it has nothing to do with my code, but everything to do with sessions being recreated after each page load. In another project it works fine, so it must be related to the recreation.
Why is Laravel all of a sudden not reading the session file and thus recreating it?
In config\session.php I changed 'driver' => env('SESSION_DRIVER', 'file') to database and back, and then it works again.
Smh
Rewrite your code to use Session::has in your condition rather than get.
if(Session::has('header')) {
    $array['header'] = Session::get('header');
} else {
    $header = $this->loadElements(2, false);
    $array['header'] = $header;
    Session::put('header', $header);
}
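The difference matters because Session::get() returns the stored value (null if missing), while Session::has() checks that the key exists with a non-null value. If the cached header can itself be falsy, the get()-based condition keeps regenerating it. A quick illustration (hypothetical value, not from the question):

// Suppose loadElements() legitimately returned an empty array.
Session::put('header', []);

var_dump(Session::get('header')); // []   -- falsy, so if (Session::get('header')) would rebuild it
var_dump(Session::has('header')); // true -- the cached value gets reused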

Writing Logs from Console Shell

I have been using CakePHP 2.4.4 to build the interactive web part of my app and that is going very well. CakePHP is awesome.
I am now doing some supporting background processes. The Console and Shell seem to be the way to do it, as they have access to the Models.
I have written the code and have it working, but I am trying to write to the same log that I use for the Models. In the models I have an afterSave function to log all the database changes, and I just use $this->log("$model $logEntry", 'info'); to write to the log.
That doesn't work in the Shell, so I thought CakeLog::write('info', "$model $logEntry"); might work, but it doesn't either.
Do I need to initialise the CakeLog to point to the correct log files?
<?php
App::uses('CakeTime', 'Utility');
App::uses('CakeLog', 'Log'); // CakeLog lives in the Log package in 2.x, not Utility

class ProcessRequestShell extends AppShell {

    // Need to access the request and monitor tables
    public $uses = array('Request');

    private function updateRequest($data) {
        $model = 'Request';
        $result = $this->Request->save($data);

        $logEntry = "UPDATE ProcessRequestShell ";
        foreach ($data[$model] as $k => $v) {
            $logEntry .= "$k='$v' ";
        }

        if ($result) {
            //$this->log("$model $logEntry", 'info');
            CakeLog::write('info', "$model $logEntry");
        } else {
            //$this->log("$model FAILED $logEntry", 'error');
            CakeLog::write('error', "$model FAILED $logEntry");
        }

        return $result;
    }

    public function main() {
        $options = array('conditions' => array('state' => 0, 'next_state' => 1));
        $this->Request->recursive = 0;
        $requests = $this->Request->find('all', $options);

        // See if the apply_changes_on date/time is past
        foreach ($requests as $request) {
            $this->out("Updating request " . $request['Request']['id'], 1, Shell::NORMAL);

            // Update the next_state to "ready"
            $request['Request']['state'] = 1;
            $request['Request']['next_state'] = 2;
            $this->updateRequest($request);
        }
    }
}
Did you set up a default listener/scope for those log types?
If not, they will not get logged.
// Set up a 'default' log configuration for use in the application.
CakeLog::config('default', array('engine' => 'File'));
In your bootstrap.php, for example.
See http://book.cakephp.org/2.0/en/appendices/2-2-migration-guide.html#log
You need to set up a default log stream writing to a file, if necessary in app/Config/bootstrap.php.
CakeLog does not auto-configure itself anymore. As a result, log files will not be auto-created anymore if no stream is listening. Make sure you have at least one default stream set up if you want to listen to all types and levels. Usually, you can just set the core FileLog class to output into app/tmp/logs/:
CakeLog::config('default', array(
    'engine' => 'File'
));
See Logging → Writing to logs section of the CookBook 2.x
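For example, to route the shell's 'info' and 'error' entries into their own files under app/tmp/logs/, something like this in app/Config/bootstrap.php should do (a sketch; the stream names and file names are illustrative):

App::uses('CakeLog', 'Log');

// Send 'info' messages to app/tmp/logs/info.log
CakeLog::config('info_log', array(
    'engine' => 'File',
    'types' => array('info'),
    'file' => 'info',
));

// Send 'error' and 'warning' messages to app/tmp/logs/error.log
CakeLog::config('error_log', array(
    'engine' => 'File',
    'types' => array('error', 'warning'),
    'file' => 'error',
));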

Yii time caching not working?

My version of Yii: 1.1.12... scratch that, I upgraded to version 1.1.13, and it still doesn't work.
I tried this:
Yii::app()->cache->set('someKey', $auctions);
$data = Yii::app()->cache->get('someKey');
print_r($data);
And I see the data that I stored! However, if I try this:
Yii::app()->cache->set('someKey', $auctions, 10);
$data = Yii::app()->cache->get('someKey');
print_r( $data );
I see nothing. Why is Yii ignoring my time interval? What am I missing?
EDIT
My caching is defined in config as:
'cache' => array(
    'class' => 'system.caching.CMemCache',
    'useMemcached' => false,
    'servers' => array(
        array('host' => '127.0.0.1', 'port' => 11211, 'weight' => 60),
        //array('host' => 'server2', 'port' => 11211, 'weight' => 40),
    ),
),
I know Memcache is working, because I've tested it with this example outside of the Yii framework:
$memcache = new Memcache;
$memcache->connect("localhost", 11211);
$tmp_object = new stdClass;
$tmp_object->str_attr = "test";
$memcache->set("mysupertest", $tmp_object, false, 5);
var_dump($memcache->get("mysupertest"));
This works and the item is cached for 5 seconds...
It seems like it's a bug in CMemCache.php. There is this function:
protected function setValue($key, $value, $expire)
{
    if ($expire > 0)
        $expire += time();
    else
        $expire = 0;

    return $this->useMemcached ? $this->_cache->set($key, $value, $expire) : $this->_cache->set($key, $value, 0, $expire);
}
Memcache does not want the time added here (Memcache::set() already accepts a relative offset of up to 30 days, i.e. 2592000 seconds; anything larger is interpreted as an absolute unix timestamp, so an added time() only works if the PHP and memcached clocks agree), so my quick fix is:
protected function setValue($key, $value, $expire)
{
    return $this->useMemcached ? $this->_cache->set($key, $value, $expire) : $this->_cache->set($key, $value, 0, $expire);
}
Well, make sure $auctions is well defined.
Yii::app()->cache->set('someKey', array('someValue'), 120); // 120 means 2 minutes
print_r(Yii::app()->cache->get('someKey')); // you should see the array with the single value, I do see it when I try to run it
Make sure the config is OK and you are not using CDummyCache. Mine looks like this:
'components' => array(
    ...
    // Add a cache component to store data.
    // For the demo we are using CFileCache; you can use any
    // type your server is configured for. This is the simplest, as it
    // requires no configuration or setup on the server.
    'cache' => array(
        'class' => 'system.caching.CFileCache',
    ),
    ...
),
