Files are not changing when I update them via FTP - caching

I made some changes to a CSS file, uploaded it and saw no change. I cleared my browser's cache, repeated the process and still nothing. I also tried another browser and then experimented with other files - all with the same result. I then deleted the CSS file altogether - the website still looks the same and I can still see the files in the browser's console.
I can only get results if I actually change the file names altogether (which is really inconvenient). I don't think there is an issue with FTP overwriting the file, as there are no errors in FileZilla's logs.
Is there another way a website can cache itself? Would anyone know why this is occurring?
EDIT:
I also tried this in cPanel's File Manager and viewed it on another PC - same result

Squid and other web accelerators often sit between a hosted server and your browser. Although they are supposed to invalidate their caches when the backing file changes, that invalidation isn't always signalled according to spec or acted on properly.
Indeed, there can be multiple caches between you and the server, each of which has a chance of hanging onto old data.

First, use Firebug or "Inspect Element" in Chrome.
Verify that the CSS file the browser is actually loading is the one you think it should load.
Good luck.

Browsers can cache things. Did you try Shift-F5 on your webpage to force a reload of everything?

Maybe the main server has caching configured via other servers; check with your IT department. If this is the case, you need to ask them to invalidate the cache on all of the caching servers.

I had the same issue with FileZilla. To solve it you need to clear the FileZilla cache or change the names of the files you are uploading.
Open FileZilla and click on the Edit menu.
Choose Clear Private Data.
In the new dialog box, check the categories you'd like to clear: Quickconnect history, Reconnect information, Site Manager entries, Transfer queue.
Finally, click OK to confirm.

Related

Umbraco backoffice cache

I have a problem in my Umbraco backoffice, where a lot of the client-side files are cached heavily, which is causing some problems.
All the loaded files come from /umbraco/Application and contain references like:
"/umbraco/lib/jquery/jquery.min.js?cdv=1",
"/umbraco/lib/angular/1.1.5/angular.min.js?cdv=1",
"/umbraco/lib/underscore/underscore-min.js?cdv=1",
But how can I change the cdv value? I tried changing it under /config/ClientDependency.config, but nothing happened (the value in the config file does not match the one in the output above).
So maybe the backoffice is using a different config file for ClientDependency? Or any other ideas?
On my Umbraco 7.4.1 installation, changing the version in clientDependency.config worked for me:
From:
<clientDependency version="1" fileDependencyExtensions=".js,.css" loggerType="Umbraco.Web.UI.CdfLogger, umbraco">
To:
<clientDependency version="2" fileDependencyExtensions=".js,.css" loggerType="Umbraco.Web.UI.CdfLogger, umbraco">
After the change, the script files were being called as:
/umbraco/lib/jquery/jquery.min.js?cdv=2
/umbraco/lib/angular/1.1.5/angular.min.js?cdv=2
/umbraco/lib/underscore/underscore-min.js?cdv=2
Etc.
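As a sketch of how that version bump could be automated on deploy (assuming the standard Config/ClientDependency.config layout with a version attribute on the root element, as shown above - the file path and function name here are illustrative):

```python
import xml.etree.ElementTree as ET

def bump_client_dependency_version(config_path):
    """Increment the version attribute on the <clientDependency> root element."""
    tree = ET.parse(config_path)
    root = tree.getroot()  # expected to be <clientDependency version="...">
    new_version = int(root.get("version", "0")) + 1
    root.set("version", str(new_version))
    tree.write(config_path, encoding="utf-8", xml_declaration=True)
    return new_version
```

Running this as part of a deployment step would change every cdv query string at once, forcing clients to fetch fresh copies.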
If it's only your browser with this issue (rather than clients' browsers, which you don't have access to), then you can simply force your browser to load new versions of the HTML, CSS and JS rather than using the internal HTTP cache.
How to tell if you're loading cached resources
You will know if your browser has cached the resources by looking at the Network tab in your browser's Dev Tools. If resources are being loaded with a 304 rather than a 200, they have been served from the internal cache.
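To illustrate the mechanics behind that 304: a browser revalidating a cached resource sends the stored ETag back in an If-None-Match header, and the server answers 304 if the copy is still valid. A minimal sketch of that server-side decision (the function name and dict-based headers are illustrative, not any particular framework's API):

```python
def respond(request_headers, current_etag):
    """Decide between 304 (client copy still valid) and 200 (send fresh body).

    The browser sends If-None-Match with the ETag it cached; if it matches
    the resource's current ETag, no body is sent and the browser reuses
    its cached copy.
    """
    if request_headers.get("If-None-Match") == current_etag:
        return 304  # Not Modified: resource is served from the browser cache
    return 200      # OK: a full response body is sent
```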
Forcing reload on Chrome
There is a SO post here which tells us the difference between each reload option.
General browser shortcuts
There is also a bit of information here about how you can use a keyboard shortcut to force a hard refresh on your browser.
Plugins
Clear Site Cache has worked quite well as a browser plugin for Firefox for me.

Umbraco 7.2.0 - grid.editors.config.js is cached and will not update

I have created a new grid editor, and have deployed it to my production server. When on my development machine, a change to the grid.editors.config.js is reflected immediately.
However, on my production server, a change to grid.editors.config.js has no effect.
After some research, I have found that the issue is probably the client dependency cache. I have tried the following:
Removing the files from App_Data/TEMP/ClientDependency
Incrementing the version number in Config/ClientDependency.config
Recycling the application pool
Clearing the browser cache
Restarting the server
What am I missing? When I add a query string, e.g. https://mywebsite/config/grid.editors.config.js?v=1, then the changes are shown, which means the file has definitely been updated on the server.
What do I need to do to update the file?
Are you using any expiration headers for caching js on your website?
You could try to delete the following files:
App_Data/TEMP/DistCache
App_Data/TEMP/PluginCache
I find that it's a simple case of the browser caching your assets locally. You can usually force a refresh by pressing CTRL + F5 or holding CTRL and clicking refresh in your web browser and the changes are then visible.
As it turns out, the issue was caused by a third party that provides DDoS protection to the site - content was cached via the third party and so changes to files were not being reflected.

How to force the browser to show the most up to date files instead of relying on application cache?

It's very important for the website I'm working on to be offline-functional. I'm using a Cache Manifest to store all the files on the application cache, so that takes care of that and all is good and well.
BUT, as I read and noticed myself, the browser first shows the cached version of the site before checking for an update online. Hitting refresh reloads the cache, now with the new files (or whatever it had time to update before the refresh).
I'm aware of this fix : http://www.html5rocks.com/en/tutorials/appcache/beginner/, where the user is told an update is available and is asked to refresh the page. Not a bad method, but still sketchy for user experience.
Is there any other way to force the browser to show the most up to date files if online? Would cache busting all files manually AND using a cache manifest fix this problem, or will it conflict with the cache manifest and cause problem to the offline functionality?
I found something that works well for me:
The URL linking to the web page contains a parameter. If there is ever a change to the page or related files, the URL is changed to something like http://www.mywebsite.com/mypage.html?v=3, where v=3 is changed depending on updates.
This is a longer fix to implement (finding every page affected by a change and updating all their cache-busting links), but the pages at least show what they're supposed to on the first load, and the cache manifest still loads the update for offline viewing.
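A rough sketch of that versioned-parameter scheme in Python, using only the standard library to set or replace a v parameter on a URL (the URL and parameter name are just illustrative):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def bust_cache(url, version):
    """Append (or replace) a v=<version> query parameter on a URL."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    params = dict(parse_qsl(query))
    params["v"] = str(version)
    return urlunsplit((scheme, netloc, path, urlencode(params), fragment))
```

A build or deploy script could run every internal link through such a helper so that bumping a single version number updates all cache-busting links at once.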

How to avoid occasional corrupted downloads

My website hosts a msi file that users need to download. There is nothing special about the file. It lives in a directory on the webserver with a regular HREF pointing to it that users click on. Occasionally a user will complain that they can't open the msi file because Windows Installer claims the file is corrupt. Redownloading the file doesn't help. I end up emailing the file as an attachment which usually works.
My guess is that the file is either corrupted in the user's browser cache or perhaps an intermediary proxy's cache which the user goes through.
Why does this happen? Is there a technique / best practice that will minimize chances of corruption or, perhaps make sure users will get a fresh copy of the file if it does get corrupted during download?
Well if the cause is really just the cache, then I think you could just rename the file before having them download it again. This would work for any proxies too.
Edit: Also, I believe most browsers won't cache pages unless the GET and POST parameters remain the same. The same probably applies to any URL in general. Try adding a unique GET (or POST) parameter to the end of the URL of each download. You could use the current time, or a random number, etc. Rather than a hyperlink, you could have a button that, when clicked, submits a form with a unique parameter to the download URL.
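A minimal sketch of that idea (the nocache parameter name and the URL are hypothetical), using a millisecond timestamp so each generated link looks like a brand-new URL to browser and proxy caches:

```python
import time

def unique_download_url(base_url):
    """Tack a timestamp parameter onto a download link so every request
    appears to be a different URL and bypasses stale cached copies."""
    separator = "&" if "?" in base_url else "?"
    return f"{base_url}{separator}nocache={int(time.time() * 1000)}"
```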
My advice would be:
Recommend users avoid IE (especially the older versions), because of truncated downloads, cache pollution, etc.
Advise users to clear the cache before re-downloading the file.
Host the file on FTP instead of HTTP.
Provide an MD5 checksum so users can verify the download.

Programmatically reset browser cache in Ruby (or remove select item from browser cache)?

I would like to create a rake task or something to clear the browser cache. The issue is, I am running a Flash app, and if I change the data, I more often than not need to reset the browser cache so it removes the old swf and can see the new xml data.
How do you reset the browser cache with ruby? Or even more precisely, how can I only remove a select item from the browser cache?
Thanks for the help!
I see a few possible solutions:
Write a shell script that deletes the browser's temporary cache files from disk (what browser are you using?). I'm not sure deleting the files on disk will necessarily work if the browser has them cached in memory.
Use an HTTP header (Cache-Control: no-cache) to avoid caching in the browser; Adobe has documentation on no-cache. You could set this header only in development mode, so that in production the SWF is cached.
Depending on your browser, force a page and cache refresh (e.g. Ctrl-F5 in Firefox).
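As a sketch of the header approach from the second point above, expressed as a plain dictionary of response headers (the development/production split and the one-hour max-age are illustrative choices, not anything Adobe- or Rails-specific):

```python
def cache_headers(development):
    """Response headers for the SWF/XML assets.

    In development, tell browsers and intermediaries not to reuse a
    stored copy; in production, allow caching for up to an hour.
    """
    if development:
        return {
            "Cache-Control": "no-cache, no-store, must-revalidate",
            "Pragma": "no-cache",  # for older HTTP/1.0 caches
            "Expires": "0",
        }
    return {"Cache-Control": "public, max-age=3600"}
```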
I'm not sure how you're loading the xml data, but in the past, I've gotten around the issue by appending a random number to the path of the xml file:
xml.load("data.xml?"+Math.random());
Basically, Flash will always think the file is a different URL. It won't be able to find a match in your cache.
Again, I'm not sure how you're loading the XML data, so I'm not sure if this applies to your situation.
Hope it helps, though.
You cannot reset the browser cache, and even if you could, it would sometimes not be sufficient, because caching can occur not only on the server and/or client, but also on any number of nodes your response passes through on its way from your server to your client.
The only tool at your disposal is the caching headers.
You can set them to no-cache; just keep in mind that the server will then be hit every time.
Since you're using Safari, here's an article describing how to use AppleScript to clear the cache. But you can probably just skip the AppleScript part and remove the files directly in the rake task. The only catch might be that you have to restart the browser for it to take effect, but that could be done with a kill on the process and an "open /Applications/Safari.app" (I'm assuming you're on a Mac; in Windows it would be something like start "c:\program files\Safari...").