Cloudflare caching resources for which cookie is set

I'm setting some cookies and appropriate cache headers (to prevent caching) while responding to a request for a .gif file. According to the Cloudflare documentation:
If the Cache-Control header is set to "private", "no-store", "no-cache", or "max-age=0", or if there is a cookie in the response, then Cloudflare will not cache the resource.
I know that .gif files are cached by default, but I can see 'CF-Cache-Status: MISS' in the response header. So the issue is not caching itself, but that the cookie set by my backend server is not being forwarded to the client.
Response Header sent by my backend server:
Cache-Control:no-cache, no-store, must-revalidate
Content-Type:image/gif
Content-Length:12345
Date:Wed, 12 Jul 2017 13:50:29 GMT
Expires:Thu, 01 Jan 1970 00:00:00 GMT
Pragma:no-cache
Set-Cookie:user=randomuser; Max-Age=7776000; Expires=Tue, 10-Oct-2017 13:38:45 GMT
Response Header sent by Cloudflare:
Cache-Control:no-cache, no-store, must-revalidate
CF-Cache-Status:MISS
CF-RAY:xxxxxxxxxxxxx-XXX
Connection:keep-alive
Content-Type:image/gif
Content-Length:12345
Date:Wed, 12 Jul 2017 13:50:29 GMT
Expires:Thu, 01 Jan 1970 00:00:00 GMT
Pragma:no-cache
Server:cloudflare-nginx
Vary:Accept-Encoding
As can be seen, the 'Set-Cookie' header is not forwarded to my clients by Cloudflare.
The issue was resolved when I created a page rule and set Bypass Cache for the resource URL.
Can anyone tell me what I am missing, or which additional headers to send from my backend server, so that 'Set-Cookie' is also forwarded through Cloudflare?
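For anyone wanting to experiment, here is a minimal Rack sketch of an origin serving such a pixel with the header combination the quoted documentation lists as non-cacheable; the config.ru filename, the pixel.gif file, and the idea of adding "private" are my assumptions, not something the question confirms:
# config.ru - start with `rackup`; pixel.gif is a hypothetical 1x1 image file
run lambda { |env|
  gif = File.binread('pixel.gif')
  [200,
   { # lowercase header names keep Rack 3's spec checks happy
     'content-type'  => 'image/gif',
     # "private" is also on the documented list of values that prevent caching
     'cache-control' => 'private, no-cache, no-store, must-revalidate',
     'set-cookie'    => 'user=randomuser; Max-Age=7776000'
   },
   [gif]]
}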

Related

Disable caching in open-uri

I have to, sadly, poll an endpoint and update another system when the data changes. I wrote a loop (with a sleep statement so I don't DoS the server):
require 'nokogiri'
require 'open-uri'

desired_data = 'foo'
data = nil
url = 'https://elections.wi.gov/index.php/elections-voting/statistics'

while data != desired_data
  sleep(2) # be polite to the server between polls
  # Kernel#open on URLs is deprecated since Ruby 2.7; URI.open is the open-uri entry point
  doc = Nokogiri::HTML.parse(URI.open(url))
  puts doc
  # do some nokogiri stuff to extract the information I want.
  # store information to `data` variable.
end
# if control is here it means the data changed
This works fine, except that when the server updates, URI.open(url) still returns the old content (even if I restart the script).
It seems like there is some caching at play. How do I disable it?
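A workaround sketch, assuming the stale copy is held by an intermediary cache rather than by open-uri itself (open-uri keeps no cache across processes): send no-cache request headers, and append a throwaway query parameter (a "cache buster") so the URL never matches a cached entry. The nocache parameter name is made up:
require 'nokogiri'
require 'open-uri'

url = 'https://elections.wi.gov/index.php/elections-voting/statistics'
# any value that changes per request works as a cache buster
busted = "#{url}?nocache=#{Time.now.to_i}"
# open-uri forwards string-keyed options as HTTP request headers
doc = Nokogiri::HTML.parse(
  URI.open(busted, 'Cache-Control' => 'no-cache', 'Pragma' => 'no-cache')
)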
Here are the HTTP headers returned:
HTTP/2 200
date: Fri, 02 Oct 2020 14:00:44 GMT
content-type: text/html; charset=UTF-8
set-cookie: __cfduid=dd8fca84d468814dd199dfc08d45c98831601647244; expires=Sun, 01-Nov-20 14:00:44 GMT; path=/; domain=.elections.wi.gov; HttpOnly; SameSite=Lax; Secure
x-powered-by: PHP/7.2.24
cache-control: max-age=3600, public
x-drupal-dynamic-cache: MISS
link: <https://elections.wi.gov/index.php/elections-voting/statistics>; rel="canonical"
x-ua-compatible: IE=edge
content-language: en
x-content-type-options: nosniff
x-frame-options: SAMEORIGIN
expires: Sun, 19 Nov 1978 05:00:00 GMT
last-modified: Fri, 02 Oct 2020 12:47:38 GMT
vary: Cookie
x-generator: Drupal 8 (https://www.drupal.org)
x-drupal-cache: HIT
x-speed-cache: HIT
x-speed-cache-key: /index.php/elections-voting/statistics
x-nocache: Cache
x-this-proto: https
x-server-name: elections.wi.gov
access-control-allow-origin: *
x-xss-protection: 1; mode=block
cache-control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
cf-cache-status: DYNAMIC
cf-request-id: 058b368b9f00002ff234177200000001
expect-ct: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
server: cloudflare
cf-ray: 5dbef38c3b6a2ff2-ORD
If it matters, I’m using Ruby 2.7 on macOS Big Sur.
It might be a problem on the Drupal 8 website itself, as Drupal has its own cache manager, and there seems to be a per-user cache somewhere, since fresh content does show up when using a web browser.
It is easy to see which cache contexts a certain page varies by and which cache tags it is invalidated by: one must only look at the X-Drupal-Cache-Contexts and X-Drupal-Cache-Tags headers!
But those headers are not in your list. If you're in touch with the website's developers, ask them to do the following:
You can debug cacheable responses (responses that implement this interface, which may be cached by Page Cache or Dynamic Page Cache) by setting the http.response.debug_cacheability_headers container parameter to true in your services.yml, followed by a container rebuild, which is necessary when changing a container parameter.
That will cause Drupal to send X-Drupal-Cache-Tags, X-Drupal-Cache-Contexts headers.
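For reference, that change is a one-line container parameter. A minimal sketch, assuming the usual sites/default/services.yml location:
# sites/default/services.yml - make Drupal emit its cacheability debug headers
parameters:
  http.response.debug_cacheability_headers: true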

Disable caching of content in Firefox offline mode

I am working on a web application which has user management in place. I found a concerning issue in Firefox related to Work Offline mode. The following steps describe the scenario:
User logs in to the application
User performs some action and logs out of the application
If the user now enables Work Offline mode in Firefox, he/she can use the browser's Back button to access the last page, even though that page is supposed to be secure.
In my opinion this is a data security issue, as any other user can apply this technique to retrieve the previous user's sensitive information.
I have used cache-control headers to tell the browser that the HTML content should not be cached. These are the response headers used:
HTTP/1.1 200 OK
Date: Tue, 05 May 2015 10:39:30 GMT
Server: Apache/2.4.9 (Unix) OpenSSL/0.9.8za
Cache-Control: no-cache, no-store
Expires: Wed, 31 Dec 1969 23:59:59 GMT
Content-Type: text/html;charset=UTF-8
Content-Language: en
Vary: Accept-Encoding
Content-Encoding: gzip
X-Frame-Options: SAMEORIGIN
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Transfer-Encoding: chunked
In particular, I have set:
Cache-Control: no-cache, no-store
Expires: Wed, 31 Dec 1969 23:59:59 GMT
I have observed this same vulnerability in applications like Facebook. Is this resolvable? Thank you.
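Since the Server header shows Apache 2.4, one thing worth trying is sending the strictest set of cache directives via mod_headers. A minimal sketch, where the /secure/ path prefix is an assumption:
# Hypothetical mod_headers sketch: strictest no-store directives for protected pages
<LocationMatch "^/secure/">
    Header set Cache-Control "no-cache, no-store, must-revalidate"
    Header set Pragma "no-cache"
    Header set Expires "0"
</LocationMatch>
In my experience no-store over HTTPS also tends to keep pages out of Firefox's back-forward cache, but that part is browser-dependent rather than guaranteed by the headers.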

Images loading from Akamai not caching in the browser

Images loading from Akamai are not caching in the browser. Looking through the developer window, I see this in the header:
Accept-Ranges: bytes
Access-Control-Allow-Credentials: true
Access-Control-Allow-Origin: *
Access-Control-Request-Headers: X-Requested-With,Content-Type,Accept,Origin
Cache-Control: max-age=0, no-cache, no-store
Connection: Keep-Alive
Content-Length: 114069
Content-Type: image/jpeg
Date: Tue, 02 Jul 2013 14:20:52 GMT
Etag: "01bd6c5172ce1:0"
Expires: Tue, 02 Jul 2013 14:20:52 GMT
Last-Modified: Wed, 26 Jun 2013 00:11:58 GMT
Pragma: no-cache
Server: Microsoft-IIS/7.5
Set-Cookie: BNI__BARRACUDA_LB_COOKIE=00000000000000000000000097f7ab4200005000; Path=/; HttpOnly
X-CFLO-Cache-Result: TCP_NC_MISS
X-Powered-By: ASP.NET
What can I do to force the Akamai servers to change the image headers so the images can be cached in the browser?
Generally, Akamai preserves the Cache-Control header from the origin. If you are not setting Cache-Control on your application/web server, it is strongly recommended that you do so. But if you want to create specific policies within Akamai, you will need to enable the "Set Browser Cache Control Headers" behavior.
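As an example of fixing it at the origin: the Server header shows Microsoft-IIS/7.5, and if the images are served as static files, a web.config sketch along these lines would replace the no-cache directives with a max-age (the one-day value is an arbitrary assumption):
<!-- web.config sketch: emit Cache-Control: max-age=86400 (one day) for static content -->
<configuration>
  <system.webServer>
    <staticContent>
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="1.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>
If the images are instead served through ASP.NET code (the X-Powered-By header hints at that), the no-cache and Pragma headers are probably being added by the application and need to be removed there.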

Why is caching an AJAX request not working?

I have a large JSON data object (over 300 KB uncompressed, 40 KB gzipped) which is used on every page of my internal system. I want to fetch it at most every 15 minutes; in that time a user will probably visit tens of pages of the system.
The HTTP response headers in Firebug look like this:
Cache-Control max-age=899, public
Connection Keep-Alive
Content-Encoding gzip
Content-Length 44017
Content-Type text/html; charset=iso-8859-2
Date Tue, 04 Dec 2012 16:21:45 GMT
Expires Tue, 04 Dec 12 17:36:45 +0100
Keep-Alive timeout=15, max=99
Last-Modified Tue, 04 Dec 12 17:21:45 +0100
Pragma no-cache
Server Apache
Set-Cookie user_auth=xxx; expires=Wed, 12-Dec-2012 16:21:45 GMT; path=/; domain=example.com
Vary Accept-Encoding
X-Genaration-Time 0.13282179832458 sec.
X-Genarator vCRM 3.1 (c) Veracomp S.A.
X-Powered-By PHP/5.3.3-7+squeeze14
The cache headers are set to expire 15 minutes in the future, but neither Chrome nor Firefox caches the response.
Firebug says this about the cache entry:
Data Size 44017
Device disk
Expires Thu Jan 01 1970 01:00:00 GMT+0100
Fetch Count 5
Last Fetched Tue Dec 04 2012 17:21:45 GMT+0100
Last Modified Tue Dec 04 2012 17:21:45 GMT+0100
It seems the Expires header is ignored, but why?
This should not matter, but I'd better mention it: the content type is text/html so that the server gzips the response, but the content is actually JSON.
I am using Prototype.js to request this. I set the request header:
Cache-control: max-age=900
Prototype.js does not add any cache-buster parameter to the URL.
I am using PHP with Zend_Framework to serve the response.
What am I doing wrong?
Problem solved.
As @Victor pointed out, I had the "Pragma: no-cache" header set. The header was somehow being set by PHP. I worked with our web server admin and we managed to unset the header in Apache.
Still, that was not enough. Our framework sets cookies on every page refresh and I could not turn that off, and browsers refused to cache responses that set cookies, so we had to unset the Set-Cookie header too.
Finally, these two unsets allowed caching to work:
<LocationMatch "(?i)/url/we/want/to/be/cached/.+">
    # mod_headers: strip the two headers that were preventing browser caching
    Header unset Pragma
    Header unset Set-Cookie
</LocationMatch>

Apache2 response headers: HTTP/1.0 and HTTP/1.1

Good Day.
I'm trying to figure out a caching issue. We are currently using a CMS with built-in caching. We can delete the cached copy via the control panel, and the uncached page is then served until we cache that page again.
Long story short, we know that either a proxy server or a load balancer was put in place, and we think it is caching pages in addition to the CMS. Our specific issue: when we un-cache a page in the CMS, it takes 15 minutes (timed) to show up un-cached, even after using a different browser, clearing the browser cache, etc., whereas before the network appliance was introduced, the un-cached page would show up immediately. Unfortunately we don't have any historical response headers saved anywhere.
When we believe the page is being cached by the proxy/load balancer, the response headers are:
HTTP/1.1 304 Not Modified
Server: Apache/2.0.59 (Unix) JRun/4.0 mod_ssl/2.0.59 OpenSSL/0.9.8k PHP/5.2.6
Last-Modified: Fri, 03 Aug 2012 13:29:12 GMT
Etag: "92fe-18f7-837ada00"
Accept-Ranges: bytes
Keep-Alive: timeout=5, max=100
Content-Type: text/html
Content-Encoding: gzip
Connection: Keep-Alive
Date: Mon, 06 Aug 2012 13:49:40 GMT
X-Cntnt-Length: 6391
When it's not being cached by the CMS, the response headers are:
HTTP/1.0 200 OK
Date: Mon, 06 Aug 2012 14:03:59 GMT
Server: Apache/2.0.59 (Unix) JRun/4.0 mod_ssl/2.0.59 OpenSSL/0.9.8k PHP/5.2.6
X-Powered-By: PHP/5.2.6
Set-Cookie: blah-blah-blah
Expires: Mon, 26 Jul 1997 05:00:00 GMT
Last-Modified: Mon, 06 Aug 2012 14:04:04 GMT
Pragma: no-cache
Connection: close
Content-Type: text/html
I guess my question is: can Apache2 be configured to return both HTTP/1.0 and HTTP/1.1 responses?
I know the Etag suggests a cached page, and I believe ETags are not available in HTTP/1.0.
Thanks for any insights.
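For what it's worth, the HTTP/1.0 status line probably comes from the intermediary rather than from Apache, since proxies and load balancers often speak HTTP/1.0 to the origin. Apache itself can also be told to force HTTP/1.0 behavior for particular clients via its documented special-purpose environment variables; a hedged sketch, where the User-Agent pattern is made up:
# Hypothetical sketch: treat matching clients as HTTP/1.0 and respond in kind
BrowserMatch "SomeOldProxy" downgrade-1.0 force-response-1.0
And yes, ETag is an HTTP/1.1 header; an HTTP/1.0 intermediary would typically rely on Last-Modified instead.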
