Magento - Google Chrome sets new session - unable to log in as customer

I'm now at the point where my hair is starting to turn gray...
Magento creates two session cookies, which seems to be correct and works in IE/FF:
Name: PHPSESSID
Value: hqndmkildduflb04lpgohu6pk5
Domain: www.domain.com
Path: /
Expires: Tue, 12 Mar 2013 11:31:57 GMT
Size: 35

Name: PHPSESSID
Value: hqndmkildduflb04lpgohu6pk5
Domain: .www.domain.com
Path: /
Expires: Tue, 12 Mar 2013 11:31:56 GMT
The strange thing is that after logging out, closing the browser, reopening it, and going to the login page again, another session cookie is created and I'm now unable to log in:
Name: PHPSESSID
Value: ru9lvno0mt8kpj6lhb2g3vmlq3
Domain: .domain.com
Path: /
Expires: Tue, 12 Mar 2013 11:42:51 GMT
Size: 35
When I delete the three cookies, I can log in again and two new session cookies are created. This only happens in Chrome.

I would guess that it actually has something to do with cookies.
In System -> Configuration -> Web -> Session Cookie Management, try setting the cookie path to '' (empty), the cookie domain to 'domain.com', Use HTTP Only to 'Yes', and Cookie Restriction Mode to 'No'.
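If you prefer to script these values rather than click through the admin, the same settings live under the standard web/cookie/* config paths in Magento 1.x. This is only a rough sketch run from the Magento root; the bootstrap path and the exact domain value are assumptions for the example:

<?php
// Hedged sketch (Magento 1.x): set the cookie options programmatically
// instead of via System -> Configuration -> Web -> Session Cookie Management.
require_once 'app/Mage.php';
Mage::app('admin');

$config = Mage::getConfig();
$config->saveConfig('web/cookie/cookie_path', '');              // Cookie Path: empty
$config->saveConfig('web/cookie/cookie_domain', 'domain.com');  // Cookie Domain, no leading dot
$config->saveConfig('web/cookie/cookie_httponly', 1);           // Use HTTP Only: Yes
$config->saveConfig('web/cookie/cookie_restriction', 0);        // Cookie Restriction Mode: No

// Flush the configuration cache so the new values take effect.
Mage::app()->getCacheInstance()->cleanType('config');

After changing the cookie domain, clear the stale cookies in Chrome once, as described above, so the old .www.domain.com entry stops shadowing the new one.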

Related

Download fails - is something wrong with the keep-alive?

When downloading a file directly (without any scripts) from the server, the download sometimes fails on slow internet connections.
For any given file, users can sometimes only download part of it, and sometimes they can download it completely.
In the cases where the download fails, the browser doesn't show a "Failed" message; it looks as if the file has been downloaded completely.
I thought it might be related to the keep-alive connection. I use these response headers:
Accept-Ranges: bytes
Cache-Control: max-age=172800
Connection: Keep-Alive
Content-Length: 36195412
Content-Type: application/x-rar-compressed
Date: Tue, 28 Jul 2015 07:00:49 GMT
ETag: "a9825a8-2284c54-518bca9ddaaad"
Expires: Thu, 30 Jul 2015 07:00:49 GMT
Keep-Alive: timeout=5
Last-Modified: Wed, 17 Jun 2015 20:37:46 GMT
Server: Apache/2.4.12
Vary: User-Agent
And one more question: you can see in the headers that keep-alive is set like this: Keep-Alive: timeout=5. There is no max here, so does it default to 0 or to something else?
Thanks.
Well, I figured out that it is not related to keep-alive. I and most of our clients were using proxies, and that was causing the problems.
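For what it's worth, the Keep-Alive response header is driven by Apache's keep-alive directives: the timeout value comes from KeepAliveTimeout, and the max parameter, when Apache sends it, is derived from MaxKeepAliveRequests (where 0 means unlimited, which may be why no max shows up). A minimal httpd.conf sketch with example values, not the poster's actual configuration:

KeepAlive On
KeepAliveTimeout 5        # shows up as "Keep-Alive: timeout=5"
MaxKeepAliveRequests 100  # basis for the "max=..." parameter; 0 means unlimited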

Browser behavior when no Cache-Control policy header is defined

I'm trying to improve the caching policy for a web site.
I'm implementing an aggressive caching strategy first, for resources that won't change at all; I would like files like jquery-min, for example, to be downloaded only once and then served from the browser cache.
I used Apache Mod-Expire module to accomplish this, and it's working pretty well.
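For reference, an aggressive mod_expires setup along those lines usually looks something like the following; the lifetimes and MIME types below are only an illustrative sketch, not my exact configuration:

<IfModule mod_expires.c>
    ExpiresActive On
    # Long-lived static resources that never change (e.g. a versioned jquery file)
    ExpiresByType application/javascript "access plus 1 year"
    ExpiresByType text/css "access plus 1 year"
</IfModule>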
However, I'm surprised that when I completely remove my cache-control configuration, Firefox does not re-download the jQuery file: it still serves it from the cache, without any instruction from me. How is Firefox making this decision? Is it related to the ETag hash?
Here are the headers I get for that particular file, without my configuration:
Accept-Ranges: bytes
Connection: Keep-Alive
Content-Encoding: gzip
Content-Length: 27073
Content-Type: application/javascript
Date: Fri, 23 Aug 2013 09:48:06 GMT
ETag: "225f8-13309-4e385823c7b80"
Keep-Alive: timeout=15, max=99
Last-Modified: Fri, 09 Aug 2013 15:34:22 GMT
Server: Apache
Vary: Accept-Encoding
Can anyone explain what the browser behavior is when there's no Cache-Control policy defined? Is it browser dependent?
Thank you,
Mathieu.

Default expiry date for all file types

Apart from doing it in each file-type handler, is there any simple way to set a default expiry date for all file types, e.g. json, jpg, html, css, js, etc.?
As Nagi said, the connection handler called /handlers/main.c can be used to generate custom expiration HTTP headers.
You can also use content-type handlers, whose names in the /handlers directory must match the file type (html.c, json.c, jpg.c, etc.).
By default, G-WAN generates HTTP headers which may help proxy servers to do their job:
Date: Thu, 29 Nov 2012 15:00:55 GMT
Last-Modified: Sun, 25 Nov 2012 13:54:46 GMT
ETag: "810c7fa9--50b22326-7ec3"
But a more fine-grained strategy can be used. That could be done in a future version by defining those content-related expirations in content-type handlers.
Proof that a server is a living (customer-driven) creature.

CakePHP Session updates but cookie expiry doesn't

Short Question:
Why doesn't my session cookie's expiry time get updated in the browser when my session's expiry time is updated on the server?
Long Question:
I posted a similar question about this a few weeks ago but I didn't have all of the facts at the time. I now have more detail and the nature of the question has changed so I'm posting it as a new question.
First of all, in CakePHP 2, I've set up APP/Config/core.php with the following for the session:
Configure::write('Session', array(
    'defaults' => 'database',
    'cookie'   => 'mycookie',
    'timeout'  => 1 // 1 minute - just for testing
));
So, I load a page in my app which creates the session in the database. All good so far.
The session is stamped to expire at 1341288066 which is equal to Tue, 03 Jul 2012 04:01:06 GMT. Again, this is great because that's 1 minute from now. Exactly what I wanted.
If I look in Firefox's cookie screen, I find the cookie just as I would have expected it:
Name: mycookie
Content: aqm0gkmjfsuqje019at8cgsrv3
Host: localhost
Path: /
Send for: Any type of connection
Expires: Tue 03 Jul 2012 11:01:06 AM ICT // (04:01:06 GMT)
Now, within this 1-minute window, I go back to my app and refresh the page. Then, I check the session to see if it's updated. It shows 1341288122 against the session id aqm0gkmjfsuqje019at8cgsrv3 which is equal to Tue, 03 Jul 2012 04:02:02 GMT which, again, is what I expected. The expiry of the session has been updated to be 1 minute from when I last reloaded the page.
Unfortunately, the cookie in the browser is still set to Expires: Tue 03 Jul 2012 11:01:06 AM ICT (ie: 04:01:06 GMT) and that's exactly what it does, meaning that the next time I press refresh, Cake generates a brand new session ID even though the old one is still technically valid.
My question is basically what is going on here? Why doesn't the cookie get updated with the new expiry date in the browser?
The issue you have spotted is indeed unexpected and ends sessions that should stay alive.
This is a result of how CakePHP uses PHP's session functions. There is an entry (#3047) in the CakePHP bug tracker where Mark Story (a CakePHP developer) agrees this should be fixed.
I can agree that the cookies should be updated alongside the session times stored in the session. However, that's not how PHP's internal features for session handling work. There seem to be a few different ways to work around this issue.
As this will change the current behavior (however weird it may be), the fix is postponed to version 2.3, though.
I think managing the cookie state outside of PHP is going to be the most appropriate solution. I don't know how safe a change this is for existing applications, though. Changing how sessions work can be a dramatic change, and allowing users to stay logged in much longer might not be what all developers are expecting.
This appears to be how PHP handles sessions. PHP does not update the cookie on each request (see: http://php.net/manual/en/function.session-set-cookie-params.php#100672). Instead of relying on the expiry time in this cookie, CakePHP compares the current time with the actual session timeout in Session::_validAgentAndTime().
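One common workaround in plain PHP (outside of CakePHP's session layer) is to re-send the session cookie yourself on each request, so that its expiry keeps sliding along with the server-side session. A minimal sketch, with the 60-second lifetime chosen only to match the test setup above:

<?php
// Re-issue the session cookie on every request so the cookie's Expires
// moves forward together with the server-side session timeout.
session_start();

$lifetime = 60; // seconds - match this to your actual session timeout
$params   = session_get_cookie_params();

setcookie(
    session_name(),
    session_id(),
    time() + $lifetime,
    $params['path'],
    $params['domain'],
    $params['secure'],
    $params['httponly']
);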
The problem can be solved by using these parameters in combination:
Configure::write('Session', array(
    'cookie'           => 'CAKEPHP',
    'defaults'         => 'php',
    'timeout'          => 60,   // 60 minutes: actual session timeout
    'cookieTimeout'    => 1440, // 1440 minutes (24 hrs): actual cookie timeout
    'autoRegenerate'   => true,
    'requestCountdown' => 1,
    'checkAgent'       => false,
));
autoRegenerate: regenerates the session cookie after a number of refreshes. The refresh count after which the session cookie should be regenerated is determined by the next parameter.
requestCountdown: keep the value of this parameter as low as possible. This is the number of refreshes/reloads after which the session cookie will be regenerated.

How to set a future Cache-Control Expires on Assets in Play 2.0

I'm trying to set a far-future Expires/Cache-Control header on public assets, as per the YSlow guidelines, to enable loading from cache and improve performance a bit.
As per the documentation (see Cache-Control at the bottom), this should work:
"assets.cache./public/javascripts/bootstrap.min.js"="max-age=315360000"
But it doesn't; when I check the response I get:
Data Size 82002
Device disk
Expires Thu Jan 01 1970 01:00:00 GMT+0100 (IST)
Fetch Count 220
Last Fetched Sat Feb 25 2012 15:04:04 GMT+0000 (GMT)
Last Modified Sat Feb 25 2012 15:04:04 GMT+0000 (GMT)
My file is stored under /public/javascripts/bootstrap.min.js
My routes entry is the default one:
# Map static resources from the /public folder to the /assets URL path
GET /assets/*file controllers.Assets.at(path="/public", file)
Reading the source code for Assets, it seems that the config should work.
Any idea on what I'm missing or how to make it work?
Issue solved: the browser was retrieving the elements from its own cache and wasn't updating the Expires entry.
After not using the project for a couple of hours, and without any changes, it worked.
Oh, well...
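When checking headers like this, it helps to bypass the browser cache entirely, for example with curl against the dev server (the port and asset path here are assumptions based on the default routes entry above):

curl -I http://localhost:9000/assets/javascripts/bootstrap.min.js

A hard reload (Ctrl+Shift+R) in the browser has a similar effect.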
