I am using an API. The request-response time of the API is too long. Because of this, I want to store the response on the server, not in the browser, so that I can display results from the cache for similar searches. I know there are limitations on the accuracy of cached data, but that's a separate concern I don't want to cover here.
I am using the Laravel framework, and at the moment I am using this code, as shown in the Laravel documentation.
$expiresAt = Carbon::now()->addMinutes(10);
Cache::put('key', 'value', $expiresAt);
The problem with this code is that it seems to store the cache in the browser only, but I want to store it on the server. I have heard about Memcached but could not implement it. I have also heard of apc_store(), but I think that stores locally. So, how can I store the cache on the server?
Similar to Cache::put(), you can use Cache::pull() to check for data saved on the server. Keep in mind that pull() retrieves the item and then deletes it from the cache; use Cache::get() if the entry should survive the read.
// Check the cache for data (note: pull() also removes the entry it returns)
$cachedData = Cache::pull($key);

// Fetch fresh data if there was no cache hit
if (! $cachedData) {
    $newData = 'New Data';
    $expiresAt = Carbon::now()->addMinutes(10);

    // Save the new data to the cache; put() returns a boolean,
    // so keep a reference to the data itself
    Cache::put($key, $newData, $expiresAt);
    $cachedData = $newData;
}

echo $cachedData;
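Laravel also ships a one-step helper for this read-through pattern: Cache::remember() returns the cached value when present, and otherwise runs the closure, stores its result, and returns it. A minimal sketch, where the key format and callSlowApi() are hypothetical placeholders:

// 10 is minutes in Laravel 5.1-5.7 and seconds from 5.8 onwards
$result = Cache::remember('search:'.$query, 10, function () use ($query) {
    return callSlowApi($query); // hypothetical call to the slow upstream API
});

Note that where the cached data physically lives is decided by the configured cache driver (CACHE_DRIVER in .env: file, database, memcached, redis, and so on), and all of those stores sit on the server; Cache::put() never writes to the visitor's browser.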
I am currently working on performance improvements for a React-based SPA. Most of the more basic work is already done, so I started looking into more advanced things such as service workers.
The app makes quite a lot of requests on each page (most of the calls are not to REST endpoints but to an endpoint that basically runs different SQL queries against the database, hence the number of calls). The data in the DB is not updated very often, so we have a local cache for the responses, but it is obviously lost when a user refreshes the page. This is where I wanted to use a service worker: to keep the responses either in the cache store or in IndexedDB (I went with the second option). And, of course, a cache-first approach does not fit too well here, as there is still a chance the data may become stale. So I tried to implement the stale-while-revalidate strategy: fetch the data once, then, if the response for a given request is already in the cache, return it, but still make a real request and update the cache just in case.
I tried the approach from Jake Archibald's offline cookbook, but it seems the app still waits for the real requests to resolve even when there is a cache entry to return (I can see those responses in the Network tab).
Basically the sequence seems to be: request > cache entry found! > need to update the cache > only then show the data. Updating immediately is unnecessary in my case, so I was wondering if there is any way to delay it? Or, alternatively, not to wait for the "real" response to resolve?
Here's the code that I currently have (serializeRequest, cachePut and cacheMatch are helper functions I wrote to communicate with IndexedDB):
self.addEventListener('fetch', (event) => {
  // some checks to get out of the event handler if certain conditions don't match...
  event.respondWith(
    serializeRequest(event.request).then((serializedRequest) => {
      return cacheMatch(serializedRequest, db.post_cache).then((response) => {
        const fetchPromise = fetch(event.request).then((networkResponse) => {
          // cache the fresh network response, not the (possibly undefined) cached one
          cachePut(serializedRequest, networkResponse.clone(), db.post_cache);
          return networkResponse;
        });
        return response || fetchPromise;
      });
    })
  );
});
Thanks in advance!
EDIT: Could this be due to the fact that I put things into IndexedDB instead of the cache? I am sort of forced to use IndexedDB because those "magic endpoints" are POST instead of GET (they require a request body), and POST responses cannot be inserted into the cache...
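For what it's worth, one way to return the cached entry immediately and push the refresh into the background is event.waitUntil(), which keeps the service worker alive without delaying the response. A minimal sketch, reusing the question's own serializeRequest/cacheMatch/cachePut helpers:

self.addEventListener('fetch', (event) => {
  event.respondWith(
    serializeRequest(event.request).then((serializedRequest) =>
      cacheMatch(serializedRequest, db.post_cache).then((cachedResponse) => {
        // Start the network refresh regardless of a cache hit
        const fetchPromise = fetch(event.request).then((networkResponse) => {
          cachePut(serializedRequest, networkResponse.clone(), db.post_cache);
          return networkResponse;
        });
        if (cachedResponse) {
          // Let the refresh finish in the background instead of awaiting it
          event.waitUntil(fetchPromise.catch(() => {}));
          return cachedResponse;
        }
        return fetchPromise;
      })
    )
  );
});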
I use the cache DatabaseStore to manage mutex files for the Laravel scheduler.
I modified app/Console/Kernel.php a bit to make this work.
// Assumes these imports at the top of app/Console/Kernel.php:
// use Illuminate\Console\Scheduling\SchedulingMutex;
// use Illuminate\Console\Scheduling\CacheSchedulingMutex;
// use Illuminate\Container\Container;

protected function defineConsoleSchedule()
{
    // Bind the scheduling mutex to the cache-based implementation,
    // pointed at the shared 'database' cache store.
    $this->app->bind(SchedulingMutex::class, function () {
        $container = Container::getInstance();
        $mutex = $container->make(CacheSchedulingMutex::class);

        return $mutex->useStore('database');
    });

    parent::defineConsoleSchedule();
}
I need this in order to run the scheduler on multiple servers, which requires shared storage. Since every server has its own Redis instance, I decided to use the database cache store, which is provided out of the box.
Everything works fine, but the database table named cache, where everything is stored, does not get cleaned up even after the cache entries expire.
Here are some tuples from the table:
key value expiration
laravelframework/schedule-015655105069... b:1; 1539126032
laravelframework/schedule-015655105069... b:1; 1539126654
So the first one has expiration 1539126032 (2018-10-09 23:00:32), and the current time is 2018-10-10 08:09:45; I would expect it to have been cleaned up by now.
The question is: should I implement something to maintain this table, or should Laravel handle it? And if it's Laravel's job, what am I doing wrong?
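For context: the database cache store only deletes an expired row when that same key is read again (or when the whole store is flushed via cache:clear), so rows for one-off keys like these mutexes simply pile up; there is no background garbage collection, and pruning the table is up to you. A hedged sketch of one way to do it, assuming the default cache table with its unix-timestamp expiration column:

// A hypothetical cleanup task for the schedule() method of app/Console/Kernel.php
$schedule->call(function () {
    \DB::table('cache')
        ->where('expiration', '<=', \Carbon\Carbon::now()->getTimestamp())
        ->delete();
})->hourly();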
I have simple authentication for users. In UserController I have a function called postLogin().
public function postLogin()
{
    // attempt() lives on the Auth facade (the guard), not on the user model,
    // and Auth::user() is still null before login
    if (Auth::attempt($credentials))
    {
        return Redirect::intended('desk')->with('stream', "SomeData");
    }
}
With the above code I am able to log in successfully with the "SomeData" variable, which I retrieve with:
<?php
$class = Session::get('stream');
var_dump($class);
?>
The first time it goes to the "/desk" URL it dumps the value perfectly fine, that is, "SomeData", but once I refresh the page the session is reset and the value turns to null.
How do I keep this value until the user logs out?
From the official Laravel documentation:
Flash Data
Sometimes you may wish to store items in the session only for the next
request. You may do so using the flash method. Data stored in the
session using this method will only be available during the subsequent
HTTP request, and then will be deleted. Flash data is primarily useful
for short-lived status messages:
$request->session()->flash('status', 'Task was successful!');
If you need to keep your flash data around for even more requests, you
may use the reflash method, which will keep all of the flash data
around for an additional request. If you only need to keep specific
flash data around, you may use the keep method:
$request->session()->reflash();
$request->session()->keep(['username', 'email']);
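In short, with() flashes the value for a single subsequent request, which is why it disappears on refresh. To keep it until logout, write it to the session directly instead; a minimal sketch reusing the question's 'stream' key:

public function postLogin()
{
    if (Auth::attempt($credentials)) {
        // put() persists for the lifetime of the session,
        // unlike ->with('stream', ...), which flashes it for one request
        Session::put('stream', 'SomeData');

        return Redirect::intended('desk');
    }
}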
I need to cache a collection of objects that is mostly static (it might change once per day) and is available in my ASP.NET Web API OData service. This result set is used across calls (it is not specific to any one client), so it needs to be cached at the application level.
I did a bunch of searching on 'caching in Web API', but all of the results were about 'output caching'. That is not what I'm looking for here. I want to cache a 'People' collection to be reused on subsequent calls (perhaps with a sliding expiration).
My question is: since this is still just ASP.NET, do I use traditional application caching techniques to keep this collection in memory, or is there something else I need to do? The collection is not returned to the user directly; it is used behind the scenes as the source for OData queries made via API calls. There is no reason to go out to the database for exactly the same information on every call; expiring it hourly should suffice.
Does anyone know how to properly cache the data in this scenario?
The solution I ended up using involved MemoryCache in the System.Runtime.Caching namespace. Here is the code that worked for caching my collection:
// If the data exists in the cache, pull it from there;
// otherwise make a call to the database to get the data
ObjectCache cache = MemoryCache.Default;
var peopleData = cache.Get("PeopleData") as List<People>;
if (peopleData != null)
    return peopleData;

peopleData = GetAllPeople();
CacheItemPolicy policy = new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(30) };
cache.Add("PeopleData", peopleData, policy);
return peopleData;
Here is another way I found using Lazy<T> to take into account locking and concurrency. Total credit goes to this post: How to deal with costly building operations using MemoryCache?
private IEnumerable<TEntity> GetFromCache<TEntity>(string key, Func<IEnumerable<TEntity>> valueFactory) where TEntity : class
{
    ObjectCache cache = MemoryCache.Default;
    var newValue = new Lazy<IEnumerable<TEntity>>(valueFactory);
    CacheItemPolicy policy = new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(30) };
    // The line below returns the existing item, or adds the new value if it doesn't exist
    var value = cache.AddOrGetExisting(key, newValue, policy) as Lazy<IEnumerable<TEntity>>;
    return (value ?? newValue).Value; // Lazy<T> handles the locking itself
}
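For illustration, a hypothetical call site that wires this helper to the people collection from the first snippet:

// "PeopleData" and GetAllPeople() mirror the earlier example
var people = GetFromCache("PeopleData", () => GetAllPeople());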
Yes, output caching is not what you are looking for. You can cache the data in memory with MemoryCache, for example: http://msdn.microsoft.com/en-us/library/system.runtime.caching.memorycache.aspx . However, you will lose that data if the application pool gets recycled. Another option is to use a distributed cache such as AppFabric Cache or Memcached, to name a few.
I’m new to Sencha Touch and still not quite confident with its data handling patterns. The way I want to set up my application is something like this:
Retrieve the user’s data from the remote server via AJAX.
Save it in the local storage. Any modifications (editing, adding, deleting items) update the local data.
At some point (when the user clicks 'sync', when the user logs out, or something like that), the locally stored data is synced with the server, again through an AJAX request.
So what would the basic structure of my application be, to achieve this pattern? And also, while we are here, is there a way to use a local database (as opposed to local key-value storage) for a specified store in Sencha Touch?
First of all, Sencha.IO Sync provides the functionality that you're looking for. It's still in beta, but it will probably do exactly what you need, and you won't have to host the database yourself:
http://www.sencha.com/products/io
As for me, I've built apps that use the localstorage proxy to store data locally. It's super easy. Here are a couple of examples of using data storage:
http://www.sencha.com/learn/taking-sencha-touch-apps-offline/
http://data-that.blogspot.com/2011/01/local-storage-proxy-with-sencha-touch.html
http://davehiren.blogspot.com/2011/09/sencha-touch-working-with-models.html
http://www.sencha.com/learn/working-with-forms/
Later in the app I have an AJAX call that will take all of that local data and send it up to the server to generate some reports.
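To give a flavour of the localstorage proxy, here is a minimal sketch in Sencha Touch 2 syntax; the model name and fields are hypothetical:

// A model persisted through the localstorage proxy
Ext.define('App.model.Setting', {
    extend: 'Ext.data.Model',
    config: {
        fields: ['name', 'value'],
        proxy: {
            type: 'localstorage',
            id: 'settings' // key prefix used inside window.localStorage
        }
    }
});

var settings = Ext.create('Ext.data.Store', { model: 'App.model.Setting' });
settings.load();                                // read previously saved records
settings.add({ name: 'theme', value: 'dark' });
settings.sync();                                // persist changes to localStorage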
Once you have your stores and models set up correctly, it's easy to get the data back out of them. For example, I have a contactInfo store that only ever has one entry:
var myContactInfo = contactInfo.first().data;
I have another store called settings, which has many entries. I can easily retrieve them like this (though there may be a better way):
var settingsArr = [];
settings.each(function(record) {
    settingsArr.push(record.data);
});
I then can easily send this up to the server like so:
var data = {settings: settingsArr, contactInfo: myContactInfo};

Ext.Ajax.request({
    url: 'save.php',
    params: {json: Ext.encode(data)},
    success: function(response, opts) {
        // celebrate
    }
});
As with all things, a good look at the examples and the API should help you once you have the basics figured out:
http://dev.sencha.com/deploy/touch/docs/?class=Ext.data.Store