When trying to access a protected route or just refreshing the page, the session is lost, so I have to log in again. What I don't understand is that sometimes this problem does not happen, but most of the time it does, and sometimes it takes more than 3 login attempts before I can finally access a protected route. This only happens in production. I have no idea why, but it started only after I moved my hosting to Cloudways, and users started complaining. I have another Laravel app, version 5.4, on the same server without problems.
1- Change your session driver value to database; the user's session will then be stored in the sessions table in your database instead of a file (the table can be created with the commands shown after step 2):
in your .env file set:
SESSION_DRIVER=database
2- Clear the config and application cache:
php artisan config:clear
php artisan cache:clear
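Note that the database driver also needs a sessions table. If it doesn't exist yet, you can generate and run the migration with the standard artisan commands (shown here as a sketch; later answers in this thread describe the same steps):
php artisan session:table
php artisan migrate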
I've moved a Laravel app from one domain to another. All works well, but I noticed that after clicking a submit button it takes 20 seconds to refresh the page. During this time the system is waiting for external components (addthis.com, Google ads, etc.), and when the request finally resolves I see the "Blocked: Storage access requests from trackers" message in the console.
I've set 'same_site' => 'lax' in session.php (it was null), but nothing happens.
Do you have any ideas?
How can I include a safe URL list as a workaround?
Thanks
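For reference, the same_site change mentioned above corresponds to this entry in config/session.php (only the relevant line, shown as a sketch; the rest of the file is unchanged):
// config/session.php
'same_site' => 'lax', // was null before the change described above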
Hi, after this change you should clear Laravel's config cache.
In your console run:
php artisan config:clear
to apply the changes, then clear your application cache with
php artisan cache:clear
and see if this works.
I had tried php artisan optimize:clear with no results, but your suggestion works, thank you a lot!
Do I have to repeat this periodically, or only when I update session.php?
When using sessions in my local environment everything worked fine, but when I published the site to shared hosting I started noticing some strange behaviour in the app. After a while I realized it had to do with the session. Specifically, when I was working locally the storage/framework/sessions folder only had one file that kept updating on every change, but when I monitored the same folder in production I realized that on every change, instead of updating that file (or creating a new one and deleting the other), it created a new file while keeping the old ones, which made the app start acting in a wrong way.
Is this normal, or should there be only one file per session, as it was in the local environment?
Update
After the user logs in, the app asks them to select the business they want to work with, and they can also switch between businesses afterwards. To store the business they choose I use the session, and that's where the problem pops up: after every change to that session property, a new session file is created without deleting the old one. Again, when I do exactly the same thing locally it works, but for some reason on the shared hosting it doesn't.
SOLUTION
After days of trying to figure it out, I finally found the solution.
Instead of using Laravel's global session() helper to store the data, I did it through the request, and apparently that sorted it out.
So basically instead of doing this:
session(['clienteElegido' => $client]);
I changed it to this:
$request->session()->put('clienteElegido', $client);
I still don't understand what the difference is, or why it worked fine in my local environment and not on the shared host, but it's working now, so all good.
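For reference, a minimal sketch of what that controller code could look like after the change (the controller name, method name, input, and redirect route are illustrative assumptions; only the clienteElegido key and the put() call come from the snippet above):

// app/Http/Controllers/BusinessController.php (hypothetical controller)
namespace App\Http\Controllers;

use Illuminate\Http\Request;

class BusinessController extends Controller
{
    public function seleccionarCliente(Request $request)
    {
        // Whatever value the app already resolved for the chosen business;
        // here it is simply read from the request for illustration.
        $client = $request->input('cliente');

        // Store it on the session through the request object instead of the
        // global session() helper, as described above.
        $request->session()->put('clienteElegido', $client);

        // Reading it back later works with either API:
        // $request->session()->get('clienteElegido') or session('clienteElegido')

        return redirect()->route('dashboard'); // hypothetical route name
    }
}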
Thank you for all the quick replies.
Try clearing the application, route, config and view caches:
php artisan cache:clear
php artisan route:clear
php artisan config:clear
php artisan view:clear
and let's see if your session issue will be fixed.
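On recent Laravel versions the same set of clears can also be done in one go (as mentioned elsewhere in this thread), if you prefer a single command:
php artisan optimize:clear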
The fact is that I am connecting to sockets from another domain and have been trying for a long time to find out why I get a 419 error when connecting to a private channel. It turns out that the CSRF token is cached, and I can successfully connect when the routes are not cached.
I don't know of a way to be selective about it but try running:
php artisan route:cache
php artisan cache:clear
Or, for the whole set of cache clears, you can run php artisan optimize:clear
The other option would be to wait out the cache expiry but I doubt you would want to do that!
There is a problem with my Laravel app: whenever a user logs in, the login itself works.
At this point, if I return a view the user stays logged in, but if I redirect to any route at all (protected or not), it automatically logs the user out and redirects to the login page.
Your session and auth guard are not configured properly; please check this. You can also use the Laravel auth scaffolding package for login/register.
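For context, the guard configuration being referred to lives in config/auth.php; as a sketch, the stock Laravel defaults look roughly like this (your actual file may differ):

// config/auth.php
'defaults' => [
    'guard' => 'web',
    'passwords' => 'users',
],

'guards' => [
    'web' => [
        'driver' => 'session',
        'provider' => 'users',
    ],
],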
So I resolved it by following this:
SOLUTION:
Go to your .env file and change SESSION_DRIVER=file to SESSION_DRIVER=database.
Next you will need to create a session migration: php artisan session:table.
Now run composer dump-autoload for good practice.
Finally, migrate (php artisan migrate).
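In case it helps, the migration generated by php artisan session:table roughly contains the following (a sketch based on the stock stub; exact columns can vary slightly between Laravel versions):

use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

// Creates the table the database session driver writes to.
Schema::create('sessions', function (Blueprint $table) {
    $table->string('id')->primary();
    $table->foreignId('user_id')->nullable()->index();
    $table->string('ip_address', 45)->nullable();
    $table->text('user_agent')->nullable();
    $table->longText('payload');
    $table->integer('last_activity')->index();
});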
I have an issue in my Laravel application; I'm using XAMPP to run it locally on a Windows machine.
My problem is that the Laravel application gets logged out suddenly, and this happens very often: if I refresh the same page 4 or 5 times it gets logged off, but when I check the Cookies in Chrome dev tools, the laravel_session cookie is still there.
Any idea why this could happen? Also, I don't get the same issue when it's hosted online.
Maybe the problem is with the session driver. As you said, if you refresh the same page 4 or 5 times it gets logged off; by default session storage is file-based, and your 4-5 rapid requests may be messing it up. I'm not sure, but this may solve your issue:
Change SESSION_DRIVER=file to SESSION_DRIVER=database in .env
Create the session migration with the php artisan session:table command
Run composer dump-autoload to regenerate the list of all classes
Run php artisan migrate to run the migration.
NOTE: You can run a single migration file separately by placing it in a subfolder.
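As a sketch of that note, if you keep a migration in its own subfolder you can run just that folder with the --path option (the folder name here is only an example):
php artisan migrate --path=database/migrations/sessions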