I am new to CodeIgniter and I want to use file-based caching, but I don't know if I've understood it correctly.
1. Declare the following in the parent controller: $this->load->driver('cache');
2. $this->cache->file->save('foo', 'bar', 10); is used to save to the cache, but I don't know what the parameters of this function are or how to put it all together so that caching actually works.
Please help
http://codeigniter.com/user_guide/libraries/caching.html#example_usage
The manual has it - but it's a bit hidden in the example:
if ( ! $foo = $this->cache->get('foo'))
{
    echo 'Saving to the cache!<br />';
    $foo = 'foobarbaz!';

    // Save into the cache for 5 minutes
    $this->cache->save('foo', $foo, 300);
}
'foo' -> the key under which the item is stored in the cache
$foo -> the data to cache; it can be anything
300 -> lifetime in seconds (60 * 5); set to 0 for no expiry
So if the lookup misses (get() returns FALSE), the cache entry is recreated; otherwise $foo already holds the cached data.
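Tying it back to your question, a minimal controller method using the file adapter could look like this (a sketch against the CodeIgniter 3 cache driver; the 'news' key and the placeholder for the expensive work are made up for illustration):

class Blog extends CI_Controller {

    public function index()
    {
        // Load the cache driver and tell it to use the file adapter
        $this->load->driver('cache', array('adapter' => 'file'));

        // get() returns FALSE on a cache miss
        if ( ! $data = $this->cache->get('news'))
        {
            // Do the expensive work (DB query, API call, ...) only on a miss
            $data = 'result of some expensive operation';

            // save(id, data, ttl): cache under the id 'news' for 5 minutes
            $this->cache->save('news', $data, 300);
        }

        echo $data;
    }
}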
Further notes:
http://codeigniter.com/user_guide/general/caching.html
A more flexible alternative could be this sparks library:
http://getsparks.org/packages/cache/show
I use it and it fits my needs for file-based caching very well.
I am using Laravel 9 with the Redis cache driver. However, I have an issue where the internal standard_ref and forever_ref map that Laravel uses to manage tagged cache exceeds 10MB.
This map consists of numerous keys, 95% of which have already expired and no longer exist; the map keeps growing and has a TTL of -1 (never expires).
Other than "not using tags", has anyone else encountered and overcome this? I found this in the slow log of Redis Enterprise, which led me to realize this is happening:
I checked the key(s) via SCAN and can confirm it's a massive set of stale members pointing at keys that no longer exist. It seems highly inefficient and expensive to constantly transmit 10MB back and forth just to find one key within the map.
The following quickly and efficiently removes expired members from the SET data type that Laravel uses to manage tagged cache.
use Illuminate\Support\Facades\Cache;

function flushExpiredKeysFromSet(string $referenceKey) : void
{
    /** @var \Illuminate\Cache\RedisStore $store */
    $store = Cache::store()->getStore();

    $lua = <<<LUA
        local keys = redis.call('SMEMBERS', '%s')
        local expired = {}
        for i, key in ipairs(keys) do
            local ttl = redis.call('ttl', key)
            if ttl == -2 or ttl == -1 then
                table.insert(expired, key)
            end
        end
        if #expired > 0 then
            redis.call('SREM', '%s', unpack(expired))
        end
    LUA;

    // Both %s placeholders take the reference key passed into the function
    $store->connection()->eval(sprintf($lua, $referenceKey, $referenceKey), 1);
}
To show the calls that this Lua script generates, from the sample above:
10:32:19.392 [0 lua] "SMEMBERS" "63c0176959499233797039:standard_ref{0}"
10:32:19.392 [0 lua] "ttl" "i-dont-expire-for-an-hour"
10:32:19.392 [0 lua] "ttl" "aa9465100adaf4d7d0a1d12c8e4a5b255364442d:i-have-expired{1}"
10:32:19.392 [0 lua] "SREM" "63c0176959499233797039:standard_ref{0}" "aa9465100adaf4d7d0a1d12c8e4a5b255364442d:i-have-expired{1}"
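For reference, the cleanup above would be triggered for that set with a call like this (the reference key is taken from the log):

flushExpiredKeysFromSet('63c0176959499233797039:standard_ref{0}');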
I use a custom cache driver that wraps the RedisTaggedCache class; whenever cache is added to a tag, I dispatch a job that runs the PHP function above, but only once per period, by using a 24-hour cache lock.
Here is how I obtain the reference key that is later passed into the cleanup script.
public function dispatchTidyEvent(mixed $ttl)
{
    $referenceKeyType = $ttl === null ? self::REFERENCE_KEY_FOREVER : self::REFERENCE_KEY_STANDARD;
    $lock = Cache::lock('tidy:'.$referenceKeyType, 60 * 60 * 24);

    // if we were able to get a lock, then dispatch the event
    if ($lock->get()) {
        foreach (explode('|', $this->tags->getNamespace()) as $segment) {
            dispatch(new \App\Events\CacheTidyEvent($this->referenceKey($segment, $referenceKeyType)));
        }
    }

    // otherwise, we'll just let the lock live out its life to prevent repeating this numerous times per day
    return true;
}
Remember that a "cache lock" is simply a SET/GET, and Laravel already issues many of those on every request to manage its tags, so adding a lock to achieve this "once per day" behaviour only adds negligible overhead.
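For completeness, here is a minimal sketch of the queued job the dispatch above could point at; the class name comes from the snippet, while the namespace and property names are assumptions:

<?php

namespace App\Events;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

// Hypothetical queued job: it just runs the flushExpiredKeysFromSet()
// helper from above against a single reference key.
class CacheTidyEvent implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(private string $referenceKey)
    {
    }

    public function handle(): void
    {
        flushExpiredKeysFromSet($this->referenceKey);
    }
}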
I'm trying to disable all visitor cookies for my Joomla website.
I found some tutorials, but they are for Joomla 1.x.
Any suggestions?
The solution is very similar to the one used to remove cookies in Joomla 1.x and 2.x, so we will use the same condition and principle.
If you change these two files, something else may stop working. Change them only if you know what you are doing and have checked that everything else still works, because you can break the whole website!
You must edit two files: /libraries/src/Application/CMSApplication.php and libraries/joomla/session/handler/native.php.
In libraries/src/Application/CMSApplication.php, around line 166, wrap the whole body of the method in the condition if (substr($_SERVER['SCRIPT_NAME'], 0, 14) == "/administrator") {
public function checkSession()
{
    if (substr($_SERVER['SCRIPT_NAME'], 0, 14) == "/administrator") { // added condition
        $db      = \JFactory::getDbo();
        $session = \JFactory::getSession();
        $user    = \JFactory::getUser();

        // ... rest of code
    }
}
In libraries/joomla/session/handler/native.php, around line 229, wrap the whole body of the method in the same condition as in the previous file:
private function doSessionStart()
{
    if (substr($_SERVER['SCRIPT_NAME'], 0, 14) == "/administrator") { // added condition
        // Register our function as shutdown method, so we can manipulate it
        register_shutdown_function(array($this, 'save'));

        // ... rest of code
    }
}
This works in Joomla 3.8.2.
Note: after every Joomla update you must edit these two files again and test whether this solution still works.
Set the cookie path to "/administrator" in the Joomla admin settings (System => Global Configuration).
Then the session cookies are created only for the admin area.
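If you prefer to set it in code, the same option lives in configuration.php (a sketch; the rest of the class is omitted):

<?php
class JConfig
{
    // ... other settings ...

    // Restrict the session cookie to the admin area
    public $cookie_path   = '/administrator';
    public $cookie_domain = '';
}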
To avoid all cookies for normal visitors, you need to follow the steps below.
First of all: Deactivate site statistics! Global configuration -> Statistics -> Statistics: No. This will stop the "mosvisitor" cookie.
Don't use the Template Chooser module, because it uses a cookie named "jos_user_template".
Be careful with components: Some might start their own PHP session.
Now to the main point: comment out line 697 of /includes/joomla.php like this:
// setcookie( $sessionCookieName, '-', false, '/' );
Additional: Comment out line 25 in /offline.php:
// session_start();
This seems to be an artifact of old versions.
I am not sure why this loop is not working.
$orders = Mage::getSingleton('sales/order')->getCollection()
    ->addAttributeToSelect('*')
    ->addFieldToFilter('created_at', array('from' => $from, 'to' => $to))
    ->addAttributeToSort('increment_id', 'ASC')
;

foreach ($orders as $item) {
    $order_id = $item->increment_id;
    if (is_numeric($order_id)) {
        $order = Mage::getModel('sales/order')->loadByIncrementId($order_id);
    }
    if (is_object($order)) {
        echo "> O: " . $order_id . "<BR>";
        $items = $order->getAllItems();
        echo ">> O: " . $order_id . "<BR>";
    } else {
        die("DIE " . var_dump($order));
    }
}
die("<BR> DONE");
The output:
...
...
>> O: 100021819
> O: 100021820
>> O: 100021820
> O: 100021821
The loop never finishes, and it doesn't stop at the same order_id each time.
It always fails at $order->getAllItems()
These orders are either pending, processing or complete.
Is there something I should be checking for with $order->getAllItems(), since that's where it's failing?
Thanks.
Jon, I assume the problem you're talking about is your script ending unexpectedly, i.e. you see the output with a single >
> O: 100021821
but not the output with the double >>.
Because Magento is so customizable, it's impossible to accurately diagnose your problem with the information given. Something is happening in your system (a PHP error, an uncaught exception, etc.) that results in your script stopping. Turn on developer mode, set the PHP ini setting display_errors to 1 (ini_set('display_errors', 1);), and check your error log. Once you (or we) have the PHP error, it'll be a lot easier to help you.
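In Magento 1 that usually means something like the following near the top of index.php or your script (just a sketch; trim it back once you've found the error):

// Surface errors instead of failing silently (Magento 1)
Mage::setIsDeveloperMode(true);
ini_set('display_errors', 1);
error_reporting(E_ALL);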
My guess is you're running into a memory problem. The way PHP has implemented objects can lead to small memory leaks, because objects don't always clean up after themselves correctly. This means each time you go through the loop you slowly consume the total amount of memory that's allowed for a PHP request. For a system with a significant number of orders, I'd be surprised if the above code could get through everything before running out of memory.
If your problem is a memory problem, there's information on manually cleaning up after PHP's objects in this PDF. You should also consider splitting your work into multiple requests, i.e. the first request handles orders 1 - 100, the next 101 - 200, etc.
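A rough sketch of that batching idea against the order collection, with explicit cleanup between pages (page size and the echo are arbitrary; you could equally split the pages across separate requests):

$orders = Mage::getModel('sales/order')->getCollection()
    ->addAttributeToSelect('*')
    ->addFieldToFilter('created_at', array('from' => $from, 'to' => $to))
    ->addAttributeToSort('increment_id', 'ASC')
    ->setPageSize(100);

$lastPage = $orders->getLastPageNumber();

for ($page = 1; $page <= $lastPage; $page++) {
    $orders->setCurPage($page)->load();

    foreach ($orders as $order) {
        // The collection already holds full order objects,
        // so loadByIncrementId() is not needed here.
        echo $order->getIncrementId() . ': ' . count($order->getAllItems()) . " items<BR>";
    }

    $orders->clear();       // drop the loaded items so memory can be reclaimed
    gc_collect_cycles();    // collect circular references (PHP 5.3+)
}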
What do you mean it fails?
By the look of the output it doesn't fail there, as it outputs text on either side of the call to getAllItems().
change:
$items = $order->getAllItems();
to:
foreach ($order->getAllItems() as $orderItem) {
    echo $orderItem->getId() . "<br />";
}
and see what happens.
The script could be ending on a different order ID each time if you have a low memory limit set on the server and it quits when it runs out of resources.
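To check whether that's the case, you could print the memory usage inside the loop and compare it against the configured limit (purely diagnostic):

// Inside the order loop: current usage vs. the configured limit
echo 'Memory: ' . round(memory_get_usage(true) / 1048576, 1) . ' MB'
    . ' (limit ' . ini_get('memory_limit') . ')<br />';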
I simply want a REST API server that I can call to update a file via a URL, that's it.
Here is the file:
mytextfile:
key1 = value1
key2 = value2
On the client, a script will be run which sends a string or strings to the API server.
The API server will receive them, for example /update.script?string1="blah"&string2="fun" (pretend it's URL-encoded).
The server should then parse these strings and call an exec function, or even another script on the system, which runs some sed command to update a file.
Language or implementation doesn't matter.
Looking for fresh ideas.
All suggestions are appreciated.
I don't get it: What exactly is your problem/question?
My approach to the problem "modifying a file from inside a cgi script using url-encoded arguments" would be:
Pick a language you like and start coding, in my case with Perl.
#!/usr/bin/perl
use strict; use warnings;
Fetch all your arguments. I will use the CGI module of Perl here:
use CGI::Carp;
use CGI;

my $cgi = CGI->new;

# assuming we don't have multivalued fields:
my %arguments = $cgi->Vars; # handles (almost) *all* decoding and splitting

# validate arguments

# send back CGI header to acknowledge the request
# the server will make a HTTP header from that
Now either call a special subroutine / function with them …
updateHandler(%arguments);

...;

my $filename = 'path to yer file name.txt';

sub updateHandler {
    my %arguments = @_;

    # open yer file, loop over yer arguments, whatever

    # read in file
    open my $fileIn, '<', $filename or die "Can't open file for reading";
    my @lines = <$fileIn>;
    close $fileIn;

    # open the file for writing, completely ignoring concurrency issues:
    open my $fileOut, '>', $filename or die "Can't open file for writing";

    # loop over all lines, make substitutions, and print it out
    foreach my $line (@lines) {
        # assuming a file format with key-value pairs
        # keys start at the first column
        # and are separated from values by an '=',
        # surrounded by any number of whitespace characters
        my ($key, $value) = split /\s*=\s*/, $line, 2;
        $value = $arguments{$key} // $value;
        # you might want to make sure $value ends with a newline
        print $fileOut $key, " = ", $value;
    }
}
Please don't use this rather insecure and suboptimal code! I just wrote this as a demonstration that this isn't really complicated.
… or contrive a way to send your arguments to another script (although Perl is more than well suited for file manipulation tasks). Choose one of the qx{}, system or exec commands, depending on what output you need from your script, or decide to pipe your arguments to the script using the open my $fh, '|-', $command mode of open.
As for the server to run this script on: Apache looks fine to me, unless you have very special needs (your own protocol, single-threading, low security, low performance), in which case you might want to code your own server. Using the HTTP::Daemon module you might manage a simplistic server in under 50 lines.
When using Apache, I'd strongly suggest using mod_rewrite to put the /path into the PATH_INFO environment variable. When using one script to represent your whole REST API, you could use the PATH_INFO to choose one of many methods/subroutines/functions. This also eliminates the need to name the script in the URL.
For example, turn the URL
http://example.com/rest/modify/filename?key1=value1
into
/cgi-bin/rest-handler.pl/modify/filename?key1=value1
Inside the Perl script, we would then have $ENV{PATH_INFO} containing /modify/filename.
This is a bit Perl-centric, but just pick any language you are comfortable with and start coding, leveraging whatever module you can use on the way.
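Since you said language doesn't matter, here is roughly the same key = value update as a plain PHP script, purely for comparison (file name taken from your example; no locking, no authentication, parameters read straight from the query string):

<?php
// Read the key = value file, overwrite any key that appears in the query
// string, then write the file back.
$filename = 'mytextfile';

$out = array();
foreach (file($filename, FILE_IGNORE_NEW_LINES) as $line) {
    if (strpos($line, '=') === false) {
        $out[] = $line;            // keep lines that aren't key = value pairs
        continue;
    }
    list($key, $value) = array_map('trim', explode('=', $line, 2));
    if (isset($_GET[$key])) {
        $value = $_GET[$key];
    }
    $out[] = "$key = $value";
}

file_put_contents($filename, implode("\n", $out) . "\n");
echo "File updated\n";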
I would use a newer Perl framework, like Mojolicious. If I make a file (test.pl):
#!/usr/bin/env perl

use Mojolicious::Lite;
use Data::Dumper;

my $file = 'file.txt';

any '/' => sub {
    my $self = shift;
    my @params = $self->param;

    my $data = do $file;
    $data->{$_} = $self->param($_) for @params;

    open my $fh, '>', $file or die "Cannot open $file";
    local $Data::Dumper::Terse = 1;
    print $fh Dumper $data;

    $self->render( text => "File Updated\n" );
};

app->start;
Then run morbo test.pl
and visit http://localhost:3000/?hello=world (or run ./test.pl get /?hello=world)
then I get in file.txt:
{
'hello' => 'world'
}
and so on.
I am trying to add a second language to my webpage. I decided to use different files for different languages, told apart by path: language/pl/projects.ln contains Polish text, language/en/projects.ln English. The extension is only there to distinguish language files from other files; the content is plain PHP:
$lang["desc"]["fabrics"]["title"] = "MATERIAŁY";
$lang["desc"]["fabrics"]["short_text"] = "Jakiś tam tekst na temat materiałów";
$lang["desc"]["services"]["title"] = "USŁUGI";
$lang["desc"]["services"]["short_text"] = "Jakiś tam tekst na temat usłóg";
And then on the index page I use it like so:
session_start();

if (isset($_SESSION["lang"])) {
    $language = $_SESSION["lang"];
} else {
    $language = "pl";
}

include_once("language/$language/projects.ln");

print $lang["desc"]["fabrics"]["title"];
The problem is that if the session variable is not set, everything works fine and the array item is displayed, but once I set $_SESSION["lang"], nothing is displayed. I tested whether the include itself works by putting print "sth"; at the beginning of the projects.ln file, and that works fine both with $_SESSION["lang"] set and unset.
Please help.
Can you test the return value of session_start() - if it's false, it failed to start the session.
Is it being called before you output anything to the browser? If headers were already sent and your error_reporting level is too low, you won't even see the error message.
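For reference, a minimal ordering that checks the return value and avoids the headers-already-sent trap, reusing the paths from the question:

<?php
// session_start() must run before ANY output (echo, HTML, even whitespace
// before the opening tag), otherwise the session cookie cannot be sent.
if (!session_start()) {
    die('Could not start session');
}

$language = isset($_SESSION['lang']) ? $_SESSION['lang'] : 'pl';
include_once "language/$language/projects.ln";

print $lang['desc']['fabrics']['title'];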
Stupid question, but do you set $_SESSION['lang'] to a valid value like "en"? Does the English translation load correctly when you use it as the default value in the else block instead of "pl"?
"Jakiś tam tekst na temat usłóg" -> "usług" :)
Can you tell us what this one outputs:
if (session_start()) {
    echo SID, '<br/>';
    if (isset($_SESSION['lang'])) {
        echo 'lang = "', $_SESSION['lang'], '"';
    }
}
The session starts fine, and I accidentally managed to fix it.
I renamed $_SESSION['lang'] to $_SESSION['curr_lang'] and it now works all right. It seems it didn't like the session variable sharing a name with the $lang array (presumably register_globals or the legacy session-globals compatibility mode linking $_SESSION['lang'] to a global $lang, which then collides with the array from the language file).