Fetch Still Hit After Purged Edge-Cache - caching

I have an API endpoint that Cloudflare caches at the edge, and every time the data is updated the backend calls Cloudflare's API to purge that URL.
When I test with Postman it works perfectly fine: I get a MISS right after new data arrives, and the response is served from the cache on subsequent requests.
But when I build the frontend website using fetch, no matter what I try it always gets a "HIT" from Cloudflare's cache, even though Postman gets a "MISS".
(Image) Postman "MISS"
(Image) Fetch Always "HIT"

Providing more details about your testing method would allow a better guess. But offhand, it sounds like after purging the cache you are testing with Postman first each time? In that case it makes sense that Postman would see a MISS, and by the time the browser test follows, the response has already been loaded into the cache (by the Postman request).
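One way to narrow this down is to make fetch bypass the browser's own HTTP cache and read Cloudflare's cache status directly from the response. A minimal sketch, assuming a placeholder URL; note that for a cross-origin API the cf-cache-status header is only readable from JavaScript if the server exposes it via Access-Control-Expose-Headers:

```typescript
// Placeholder URL; `cache: "no-store"` tells the browser to skip its own
// HTTP cache so the request actually reaches Cloudflare's edge.
async function checkEdgeCache(url: string): Promise<void> {
  const response = await fetch(url, { cache: "no-store" });
  // cf-cache-status is set by Cloudflare (HIT, MISS, EXPIRED, ...).
  console.log("cf-cache-status:", response.headers.get("cf-cache-status"));
  console.log("age:", response.headers.get("age"));
}

checkEdgeCache("https://example.com/api/data");
```

If this still reports HIT immediately after a purge, the purge call is the place to look; if it reports MISS, the earlier HITs were likely coming from the browser cache or from a request that had already repopulated the edge.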

Related

Prevent AJAX POST responses from being cached

I have AJAX POST requests generated from my webpage, and there may be multiple POST requests with the same post data. But the response may vary, and I want to make sure I am not getting cached responses to any of these requests. I need each request to actually hit the server.
Am I right in assuming that responses to POST requests will not be cached?
There are two levels of caching involved in that process:
Browser caching
Server caching
To eliminate the first one you have to trick your browser by adding a fake parameter to your AJAX request so it thinks each request is unique, e.g.
www.example.com/api/ajax?123
www.example.com/api/ajax?1234
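As a rough sketch of that fake-parameter idea (the endpoint and helper name are made up for illustration), a timestamp can be appended so every request URL is unique:

```typescript
// Hypothetical endpoint; the extra query parameter defeats the browser cache
// because each request URL is different.
async function postUncached(data: Record<string, unknown>): Promise<unknown> {
  const url = `https://www.example.com/api/ajax?_=${Date.now()}`;
  const response = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(data),
  });
  return response.json();
}
```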
At the server level you have to make sure no caching has been configured for that URL; for example, some developers cache any file ending with .json, and a service like Cloudflare will automatically cache any static content.

How do you do logging if your app is put in front of Varnish

The logic behind Varnish is that requests never touch your Ruby/PHP code base and are served directly from the cache. But say I have an e-commerce site and for each category page I want to log which page was viewed by user/IP address X and at what time, and I have put this logging code in my PHP code. When I run the app behind Varnish I lose all of this ability. I am pretty new to gateway proxy caches, can anyone enlighten me?
The easiest and most efficient way to solve this is to create an AJAX request that does just the logging part. That way you can still cache your whole page while disabling caching for the AJAX request so it can log all users. The IP you would forward from Varnish to the AJAX request (with X-Forwarded-For); the URL you can easily get with JavaScript and include in the AJAX call (browser referrer headers are not reliable).
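A rough sketch of that separate logging request (the /log.php endpoint is hypothetical and would be configured in Varnish as pass/no-cache; the client IP is taken server-side from X-Forwarded-For, so it is not sent from the page):

```typescript
// Fire-and-forget logging call, excluded from caching on the Varnish side.
function logPageView(): void {
  const params = new URLSearchParams({
    url: window.location.href,    // page URL gathered client-side
    ts: new Date().toISOString(), // timestamp of the view
  });
  fetch(`/log.php?${params.toString()}`, { cache: "no-store" }).catch(() => {
    // Logging failures should never break the page.
  });
}

logPageView();
```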
A rather simple alternative is to write a script that parses the Varnish logs and retrieves the pages of interest, IPs and other useful information. It can be run once a day or more frequently, depending on your needs.
By using an AJAX request as in @Clarence's response you risk missing visitors who have JavaScript disabled (but you do get your stats in real time).
You can add some logic to Varnish to forward the client IP address, so you can have complete webserver logs of client IP and requested URL.
This example is for Apache: Varnish Client IP not logging in Apache Logs
[Edit] The above suggestion only works for cache misses. Client-side JS recommended.
You could also consider using JavaScript to send page-view information to a server, such as Google Analytics. http://www.google.com/analytics/

Azure and CORS Access-Control-Allow-Origin with AJAX and PHP

First, I'm not on the web side of our world, so be nice to the backend guy.
A quick background: for a personal need I've developed a Google Chrome extension. Extensions are basically a webpage loaded in a Chrome window and... yeah, that's it. Everything is on the client side (scripts, styles, images, etc.); only the data comes from a server through AJAX calls. A cron job calls a PHP script every hour to generate two files. One, data.json, contains the "latest" data in JSON format. The other, hash.json, contains the hash of the data. The client Chrome application uses local storage: if the remote hash differs from the local one, it simply retrieves the data file from the remote server.
As I have a BizSpark account with Azure, my first idea was: an Azure Web Site with PHP for the script, a simple homepage and the generated files, and the Azure Scheduler for the jobs.
I've developed everything locally and everything runs fine... but once on the Azure platform I get this error:
XMLHttpRequest cannot load http://tso-mc-ws.azurewebsites.net/Core/hash.json. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:23415' is therefore not allowed access.
But what I really can't understand is that I'm able (and you'll be too) to get the file with my browser... so I just don't get it. Based on some posts I've found on SO and other sites I've also tried manipulating the config and adding extra headers, but nothing seems to be working...
Any idea?
But what I really can't understand is that I'm able (and you'll be too) to get the file with my browser... So I just don't get it
So when you type http://tso-mc-ws.azurewebsites.net/Core/hash.json in your browser's address bar, that is not a cross-domain request. However, when you make an AJAX request from an application running on a different domain (http://localhost:23415 in your case), that is a cross-domain request, and because CORS is not enabled on your website, you get the error.
As far as enabling CORS is concerned, please take a look at this thread: HTTP OPTIONS request on Azure Websites fails due to CORS. I've never worked with PHP/Azure Websites so I may be wrong with this link but hopefully it should point you in the right direction.
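To make the failure concrete, here is a minimal sketch of the cross-origin call as the extension would make it (using the URL from the question); the browser only hands the response to the script when the server includes a matching Access-Control-Allow-Origin header:

```typescript
// Run from a page on http://localhost:23415 this is a cross-origin request,
// so the browser requires the response to carry an Access-Control-Allow-Origin
// header matching the caller's origin (or "*").
async function loadHash(): Promise<string> {
  const response = await fetch("http://tso-mc-ws.azurewebsites.net/Core/hash.json");
  // Without e.g. "Access-Control-Allow-Origin: *" on the response, the browser
  // blocks access here even though the request reached the server.
  return response.text();
}
```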
OK, this will perhaps be a bit of a troll answer, but that's not my point (I'm a .NET consultant, so... nothing against MS).
I picked a Linux Azure virtual machine, installed Apache and PHP, configured Apache, set some permissions, defined the header for CORS, and set up a cron job, all in roughly 30 minutes... As my goal was simply to get it running, the problem is solved: it's running.

GWT caching - a cookies approach

I'm trying to improve the performance of my GWT app.
My app uses a lot of RPC requests, so I am trying to cache them on the client.
Each RPC request returns a list of records (normally 100 records). I'm storing them on the client as a Java List, but I notice that the browser cannot cope with this number of objects; its performance breaks down.
I'm thinking of storing the result of each request in a cookie as some kind of JSON and retrieving it when needed. In other words, caching the requests in cookies rather than in the RAM of the client browser.
Can somebody suggest anything?
Will I succeed with this approach, or is it a stupid thing to do?
Does anybody have a better solution?
Maybe you want to have a look at this question: Client side caching in GWT
Cookies are actually a terrible place to store stuff, because they get sent to the server on every request, even RPC (Ajax) requests.
I think what you want is local storage, which has some kind of implementation in every modern browser. Not sure about older IEs, but you could fall back to Flash for local storage there.
Edit
GWT doesn't have any native support for local storage (that I know of) but you could always use JSNI for that.
Also, storing stuff in JS is a perfectly valid way to do it; you just need to make sure entries expire and fall out of the cache. Is your current cache solution just growing forever? Because that will certainly kill the browser eventually.
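As a sketch of what an expiring cache looks like at the browser level (GWT would reach this through JSNI or, later, its Storage wrapper; the key names and TTL below are arbitrary choices for illustration):

```typescript
// Cache an RPC result in localStorage with an expiry so entries fall out of
// the cache instead of growing forever.
const TTL_MS = 5 * 60 * 1000; // 5 minutes, an arbitrary example value

function cachePut(key: string, records: unknown[]): void {
  const entry = { expires: Date.now() + TTL_MS, records };
  localStorage.setItem(key, JSON.stringify(entry));
}

function cacheGet(key: string): unknown[] | null {
  const raw = localStorage.getItem(key);
  if (raw === null) return null;
  const entry = JSON.parse(raw) as { expires: number; records: unknown[] };
  if (Date.now() > entry.expires) {
    localStorage.removeItem(key); // expired: drop it and force a fresh RPC
    return null;
  }
  return entry.records;
}
```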
A cookie is sent as a field in the header of the HTTP response by the web server to the web browser, and is then sent back unchanged by the browser each time it accesses that server.
Think about the traffic...
@darkporter I completely agree and would like to add that Web Storage support is coming in the next GWT 2.3 release. So if you need it right now, you could pull those classes from GWT trunk and use them in your project.

Saving all of the browser's AJAX activity to disk

I'm developing a web application that uses AJAX a lot. It sends several AJAX requests to the server per second and updates the page's content based on the results.
Is there a way I can save all the AJAX responses sent from the server to the browser to disk, so I can analyze them later?
Showing an alert after each request is not an option, because there are too many requests and I would not be able to tell whether the browser is working correctly that way.
I've tried using Firebug and Greasemonkey for Firefox, but I failed to save the data to disk with them.
Is there any solution for it? Thanks for your help.
Have you tried Fiddler?
It's an application which monitors all HTTP traffic. It's good, and it's free.
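If running a proxy like Fiddler is not an option, a rough in-page alternative is to wrap the browser's request function and dump the collected responses to a file on demand. This is only a sketch, assuming the app uses fetch (XMLHttpRequest-based code would need a similar wrapper); the downloadCaptured helper is made up for illustration:

```typescript
// Collect every fetch response body for later analysis.
const captured: { url: string; time: string; body: string }[] = [];

const originalFetch = window.fetch.bind(window);
window.fetch = async (input: RequestInfo | URL, init?: RequestInit) => {
  const response = await originalFetch(input, init);
  // Clone so reading the body here does not consume it for the page.
  const body = await response.clone().text();
  captured.push({ url: response.url, time: new Date().toISOString(), body });
  return response;
};

// Call this from the console to download everything captured so far.
function downloadCaptured(): void {
  const blob = new Blob([JSON.stringify(captured, null, 2)], { type: "application/json" });
  const link = document.createElement("a");
  link.href = URL.createObjectURL(blob);
  link.download = "ajax-responses.json";
  link.click();
}
```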
