Best way to optionally cache responses in Laravel?

I'm building a Laravel API and I was wondering what the best method is for optionally caching responses.
Assuming I have the following UserController in my API:
public function index()
{
    return Cache::remember('users', 3600, function () {
        return new UserCollection(User::all());
    });
}
Now let's say that in my app's front-end I want to force a refresh, in case the users have been updated within the past hour, and I do not want to wait an hour for the cache to empty. How can I force a cache refresh and query the users from the database rather than the cache, without waiting for the cache to expire?
One thought I had was to pass an additional header and use middleware to retrieve the uncached resource, but I'm not sure about this idea.
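Roughly what I had in mind (just a sketch; X-Refresh-Cache is a made-up header name, and the cache key is the same 'users' key from above):
public function index(Request $request)
{
    // If the front-end sends the (hypothetical) refresh header,
    // drop the cached entry so it gets rebuilt below
    if ($request->header('X-Refresh-Cache')) {
        Cache::forget('users');
    }

    return Cache::remember('users', 3600, function () {
        return new UserCollection(User::all());
    });
}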
Is there a better way of doing this?
Thanks

Related

Is there a way to delay cache revalidation in service worker?

I am currently working on performance improvements for a React-based SPA. Most of the more basic stuff is already done so I started looking into more advanced stuff such as service workers.
The app makes quite a lot of requests on each page (most of the calls are not to REST endpoints but to an endpoint that basically makes different SQL queries to the database, hence the amount of calls). The data in the DB is not updated too often so we have a local cache for the responses, but it's obviously getting lost when a user refreshes a page. This is where I wanted to use the service worker - to keep the responses either in cache store or in IndexedDB (I went with the second option). And, of course, the cache-first approach does not fit here too well as there is still a chance that the data may become stale. So I tried to implement the stale-while-revalidate strategy: fetch the data once, then if the response for a given request is already in cache, return it, but make a real request and update the cache just in case.
I tried the approach from Jake Archibald's offline cookbook, but it seems like the app is still waiting for the real requests to resolve even when there is a cache entry to return (I see those responses in the Network tab).
Basically the sequence seems to be the following: request > cache entry found! > update the cache > only then show the data. Updating the cache immediately is unnecessary in my case, so I was wondering if there is any way to delay that? Or, alternatively, not to wait for the "real" response to resolve?
Here's the code that I currently have (serializeRequest, cachePut and cacheMatch are helper functions that I have to communicate with IndexedDB):
self.addEventListener('fetch', (event) => {
  // some checks to get out of the event handler if certain conditions don't match...
  event.respondWith(
    serializeRequest(request).then((serializedRequest) => {
      return cacheMatch(serializedRequest, db.post_cache).then((response) => {
        const fetchPromise = fetch(request).then((networkResponse) => {
          cachePut(serializedRequest, response.clone(), db.post_cache);
          return networkResponse;
        });
        return response || fetchPromise;
      });
    })
  );
});
Thanks in advance!
EDIT: Can this be due to the fact that I put things into IndexedDB instead of the cache? I am sort of forced to use IndexedDB instead of the Cache API because those "magic endpoints" are POST instead of GET (since they require a body), and POST requests cannot be inserted into the cache...

Laravel cache DatabaseStore clean up

I use cache DatabaseStore for managing mutex files with Laravel scheduler.
I modified app/Console/Kernel.php a bit to make this work.
protected function defineConsoleSchedule()
{
    $this->app->bind(SchedulingMutex::class, function () {
        $container = Container::getInstance();
        $mutex = $container->make(CacheSchedulingMutex::class);

        return $mutex->useStore('database');
    });

    parent::defineConsoleSchedule();
}
I need this so that I can run the scheduler on multiple servers, but that requires shared storage. Since every server has its own Redis instance, I decided to use the database cache store, which is provided out of the box.
It all works fine, but the db table named cache, where everything is stored, does not get cleaned up even after the cache entries have expired.
Here are some rows from the table:
key value expiration
laravelframework/schedule-015655105069... b:1; 1539126032
laravelframework/schedule-015655105069... b:1; 1539126654
The first one has expiration 1539126032 (2018-10-09 23:00:32) and the current time is 2018-10-10 08:09:45, so I would expect it to have been cleaned up by now.
The question is: should I implement something to maintain this table myself, or should Laravel handle it? And if it is Laravel's job, what am I doing wrong?
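If it turns out that I do need to maintain it myself, I guess a scheduled purge along these lines would do it (just a sketch on my side, assuming the default cache table name and the integer UNIX timestamps shown above):
// Somewhere in app/Console/Kernel.php, inside schedule():
$schedule->call(function () {
    // Delete every cache row whose expiration timestamp is in the past
    DB::table('cache')->where('expiration', '<=', time())->delete();
})->hourly();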

Redux Local Storage Workflow

I am using Redux and would like to persist some state to local storage.
I only want to store the token that I receive from the server; there are other things in the store that I don't want to persist.
The workflow I found on Google is to read from local storage when building the initial store state.
It then uses store.subscribe to update local storage at a regular interval.
That is fine if you are persisting the entire store, but in my case the token only changes when a user logs out or a new user logs in.
I think store.subscribe is overkill here.
I have also read that updating local storage in reducers is not the Redux way.
Currently, I am updating local storage in the action creator before the reducer runs.
Is this the correct flow, or is there a better way?
The example you found was likely about serializing the entire state tree into LocalStorage on every state change, so that users can close the tab at any time without the app having to worry about saving, since LocalStorage is always up to date.
However, it's clear that this isn't what you are looking for, as you are looking to cache specific priority data in LocalStorage, not the entire state tree.
You are also correct about updating LocalStorage as part of a reducer being an anti-pattern, as all side-effects are supposed to be localized to action creators.
Thus you should be reading from and writing to LocalStorage in your action creators.
For instance, your action creator for retrieving a token could look something like:
const TOKEN_STORAGE_KEY = 'TOKEN';

export function fetchToken() {
    // Assuming you are using redux-thunk for async actions
    // (isValidToken and doSignIn are placeholders for your own logic)
    return dispatch => {
        const token = localStorage.getItem(TOKEN_STORAGE_KEY);
        if (token && isValidToken(token)) {
            return dispatch(tokenRetrieved(token));
        }

        return doSignIn().then(token => {
            localStorage.setItem(TOKEN_STORAGE_KEY, token);
            dispatch(tokenRetrieved(token));
        });
    };
}

export function tokenRetrieved(token) {
    return {
        type: 'token.retrieved',
        payload: token
    };
}
And then somewhere early on in your application boot, such as in one of your root component's componentWillMount lifecycle methods, you dispatch the fetchToken action.
fetchToken takes care of both checking LocalStorage for a cached token as well as storing a new token there when one is retrieved.

keeping laravel session variable till user logs out

I have simple authentication for users. In UserController I have a function called postLogin().
public function postLogin()
{
    if (Auth::attempt($credentials))
    {
        return Redirect::intended('desk')->with('stream', "SomeData");
    }
}
With the above code I am able to log in successfully with the "SomeData" value, which I retrieve with:
<?php
$class = Session::get('stream');
var_dump($class);
?>
The first time it goes to the "/desk" URL it dumps the value perfectly fine, i.e. "SomeData", but once I refresh the page the session value is reset and it turns to null.
How do I keep this value until the user logs out?
From the official Laravel documentation:
Flash Data
Sometimes you may wish to store items in the session only for the next request. You may do so using the flash method. Data stored in the session using this method will only be available during the subsequent HTTP request, and then will be deleted. Flash data is primarily useful for short-lived status messages:
$request->session()->flash('status', 'Task was successful!');
If you need to keep your flash data around for even more requests, you may use the reflash method, which will keep all of the flash data around for an additional request. If you only need to keep specific flash data around, you may use the keep method:
$request->session()->reflash();
$request->session()->keep(['username', 'email']);
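So in your case, since the value should survive until the user logs out rather than just the next request, one option is to put it in the session explicitly instead of flashing it via with() on the redirect. A rough sketch, reusing the postLogin() from your question:
public function postLogin()
{
    if (Auth::attempt($credentials))
    {
        // put() keeps the value for the whole session,
        // unlike with(), which only flashes it for the next request
        Session::put('stream', 'SomeData');

        return Redirect::intended('desk');
    }
}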

store API response data on cache on server php

I am using an API. The request-response time of the API is too long, so I want to store the response on the server rather than in the browser. The reason is that I want to serve results from the cache for similar searches. I know there are limitations on the accuracy of cached data, but that's another issue I don't want to get into here.
I am using the Laravel framework and currently using this code, as in the Laravel documentation:
$expiresAt = Carbon::now()->addMinutes(10);
Cache::put('key', 'value', $expiresAt);
The problem with this code is that it seems to store the cache in the browser only, but I want to store it on the server. I have heard about Memcached but could not get it working. I have also heard of apc_store(), but I think that stores locally. So how can I store the cache on the server?
Similar to Cache::put(), you can use Cache::get() to check for data already saved (on the server); note that Cache::put() only stores the value, it does not return it.
// Check the cache for data already stored (server-side)
$cachedData = Cache::get($key);

// Fetch new data if there is nothing in the cache
if (! $cachedData) {
    $newData = 'New Data';
    $expiresAt = Carbon::now()->addMinutes(10);

    // Save the new data to the cache
    Cache::put($key, $newData, $expiresAt);
    $cachedData = $newData;
}

echo $cachedData;
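For what it's worth, Cache::remember() wraps this check-then-store pattern into a single call (same idea, same 10-minute expiry):
$cachedData = Cache::remember($key, Carbon::now()->addMinutes(10), function () {
    // Only runs when there is nothing in the cache for this key
    return 'New Data';
});

echo $cachedData;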
