Web page content is not gzipped on every computer - Windows

I have configured my Apache XAMPP server to serve gzipped content as described [here][1]. This works successfully in Firefox and Chrome on several PCs I have tried (Windows and Ubuntu). I verified it in the Network tab of DevTools in Firefox and Chrome, where I can see the reduced transfer size and the response header Content-Encoding: gzip. I also passed the [GIDZipTest][2].
The problem is that on my own PC, and on another laptop I tried (Windows 10), the content is not received gzipped in any browser, even though the browsers send a request header saying they accept gzip. The weird thing is that Firefox in my Ubuntu VM receives gzipped content, but the browsers on the PC that hosts the VM do not.
I attach some pictures.
Firefox on my PC[3], Chrome on my PC[4], Firefox on VM[5]
https://ourcodeworld.com/articles/read/503/how-to-enable-gzip-compression-in-xampp-server
http://www.gidnetwork.com/tools/gzip-test.php
https://i.stack.imgur.com/9KLSO.png
https://i.stack.imgur.com/gcLsW.png
https://i.stack.imgur.com/UO9fA.png

Finally it worked after replacing http with https. I don't know why this is required, since I'm using exactly the same browser version in both cases!
P.S. CBroe was right. Looking at the request headers more carefully, I can see that in the browser that was not receiving gzipped content, the Accept-Encoding header had one more value besides gzip and deflate: br, a.k.a. Brotli, which browsers only advertise over HTTPS. Could this be the explanation? Although I didn't configure anything in XAMPP for Brotli...
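The negotiation behind this can be illustrated with a minimal sketch (function name and the supported-encodings list are illustrative, not from XAMPP): the server picks the first client-offered encoding it supports, and since browsers only offer br over HTTPS, the Accept-Encoding header, and hence the chosen encoding, can differ between http:// and https:// requests.

```python
# Sketch: how a server might pick a response encoding from the
# Accept-Encoding request header. Browsers only advertise "br"
# (Brotli) over HTTPS, which is why the header differs between
# http:// and https:// requests to the same page.

def choose_encoding(accept_encoding, supported=("gzip", "deflate")):
    """Return the first client-listed encoding the server supports."""
    offered = [token.split(";")[0].strip()
               for token in accept_encoding.split(",")]
    for encoding in offered:
        if encoding in supported:
            return encoding
    return "identity"  # no common encoding: send uncompressed

# Over plain HTTP a browser typically sends:
print(choose_encoding("gzip, deflate"))       # gzip
# Over HTTPS the same browser also offers Brotli; a server without
# a Brotli module still falls back to gzip:
print(choose_encoding("br, gzip, deflate"))   # gzip
```

A real server also honors q-values and ordering, but the principle is the same: an extra token like br should not, by itself, prevent gzip from being used.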

Related

Unable to download file from IE11

Unable to download a file from IE11: when I click the download button, a POST request is sent to the server. In the Network section I can see the response come back with the file's content, but in Internet Explorer 11 the download prompt does not appear; instead it shows a "page cannot be displayed" error and the URL is not correct. The program works fine in Mozilla Firefox.
I created a demo program that runs on my localhost with the same configuration, and IE11 works fine there. But when I run the same program on my production Tomcat server, I get the same error.
I had the same problem, but with PDFs: my PDF file was not downloading in IE, although it downloaded fine in Firefox. I checked the URL; there is no formal limit on URL length, but when I reduced the length of the URL it worked. I didn't figure out what IE does differently here. Maybe the server handles requests from IE differently, or IE sends something different. Check whether this works for you: reduce the length of the URL and try again.

GZip compression not working in Internet Explorer 11, but working fine in Chrome & Firefox

I enabled GZip compression in my Spring Boot Embedded Tomcat using CompressingFilter (https://github.com/ziplet/ziplet) and FilterRegistrationBean from Spring.
It is working fine in Chrome & Firefox.
I am getting Content-Encoding = gzip in response headers
Transferred JSON data size is reduced from 6.5MB to 1.2 MB - Great :-)
But, the same code is NOT working in Internet Explorer 11.0.9600.18097.
In Internet Explorer:
Content-Encoding = gzip is missing in the response headers
The transferred JSON data size is still 6.5 MB.
I have pasted my headers ( IE11 ) below.
Could anyone help me to figure out this issue?
Update:
Please find Chrome Headers below.
Internet Explorer's decompression logic happens at a level below the Developer Tools, so you may not see a Content-Encoding header there. Consider using Fiddler to see what's actually on the wire.
Please have a look at the link. It was filed as an issue, but Microsoft decided, for whatever reason, not to fix it in IE11; they fixed it in the new Edge browser. Take a look at the comments in the linked issue.
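The size reduction CompressingFilter delivers is easy to reproduce with a stdlib-only sketch: repetitive JSON, like a large API response, compresses dramatically under gzip, which is how a multi-megabyte payload can shrink to a fraction of its size. The payload below is synthetic; the 6.5 MB / 1.2 MB figures from the question are not reproduced exactly.

```python
# Sketch: what a gzip servlet filter does to a response body on the
# wire. Repetitive JSON compresses very well, which is why the
# transferred size can drop by 5x or more.
import gzip
import json

# Build a repetitive JSON payload resembling a large API response.
payload = json.dumps(
    [{"id": i, "status": "ACTIVE", "region": "eu-west-1"}
     for i in range(20000)]
).encode("utf-8")

compressed = gzip.compress(payload)
print(f"raw: {len(payload)} bytes, gzipped: {len(compressed)} bytes")
```

If a browser (or a debugging proxy like Fiddler) shows the full uncompressed size being transferred, the filter was bypassed for that request, which is exactly the symptom reported for IE11 above.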

Why does Apache treat some files differently with regards to caching?

Base problem: I'm working on a web app that pulls in an XML file via Ajax. Initially my issue was that the file seemed to be cached indefinitely in the browser, never issuing an HTTP request at all. This is easily mitigated by adding the 'Cache-Control: max-age=0' request header. However, now I'm on the other side of the spectrum, with the server always returning 200 along with the file contents and never a 304.
Debugging this with Fiddler has left me even more perplexed. Eventually I selected a request in Fiddler for a PNG file that was returning 304. I duplicated the request and changed only the URL, pointing it at the XML file instead. The XML file still returns 200, despite having request headers identical to those of the 304-returning PNG request. Apache also returns many additional response headers for the XML file. The PNG response only includes Date, ETag, Server, Connection, and Keep-Alive. The XML response has all of these as well as Vary, Content-Length, Last-Modified, Accept-Ranges and Server.
I just want the XML file to be cached like any other file. If it's new on the server, return new content, otherwise respond with 304. So my question is, what settings are causing Apache to treat the XML file differently than the PNG file? Thanks.
EDIT - Per Covener's questions
No, I didn't switch out the ETag; I can try doing that when I get back in the office.
Vary: Accept-Encoding,User-Agent
No idea; I'm running a default install of Uniform Server with Apache for local development, with no modifications. (Content-Length is probably absent since the server is returning 304 with no content.)
EDIT-2
I noticed that Apache doesn't gzip the PNG file (according to Fiddler the compressed version would actually be larger, so that makes sense). If I remove Accept-Encoding from the request, as well as the '-gzip' suffix from the ETag, Apache returns 304 for the XML file. The gzip compression is 10 to 1 for this file, so ideally it would be nice to keep the compression but still send 304s when appropriate.
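The interplay described in that last edit can be sketched as follows. Apache's mod_deflate appends "-gzip" to the ETag of compressed responses, so the validator a client caches for the compressed variant never matches the file's base ETag on revalidation, and the server answers 200 instead of 304. The ETag value and function below are made up for illustration; this is a simplified model, not Apache's actual code path.

```python
# Simplified model of a conditional GET against a server that
# appends "-gzip" to the ETag of compressed responses (as Apache
# mod_deflate does). The ETag value here is hypothetical.

FILE_ETAG = '"1a2b3c-9f"'  # base ETag of the XML file, made up

def conditional_get(if_none_match, serve_gzipped):
    """Return the status a server would send for a revalidation."""
    # The validator the server would emit for this variant:
    current = FILE_ETAG[:-1] + '-gzip"' if serve_gzipped else FILE_ETAG
    return "304 Not Modified" if if_none_match == current else "200 OK"

# Client cached the compressed variant, revalidates with gzip on:
print(conditional_get('"1a2b3c-9f-gzip"', serve_gzipped=True))   # 304
# Same cached validator, but Accept-Encoding removed from request:
print(conditional_get('"1a2b3c-9f-gzip"', serve_gzipped=False))  # 200
# Strip "-gzip" AND drop Accept-Encoding, as the asker did by hand:
print(conditional_get('"1a2b3c-9f"', serve_gzipped=False))       # 304
```

In this simplified model the compressed variant does revalidate cleanly as long as the client echoes back the suffixed ETag unchanged; problems arise when the validator and the served variant disagree, which is the situation the asker manually worked around.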

Chrome doesn't cache images/js/css

When Chrome loads my website, it checks the server for updated versions of files (images/JavaScript/CSS) before it shows them. It gets a 304 from the server because I never edit external JavaScript, CSS or images.
What I want it to do, is display the images without even checking the server.
Here are the headers:
Connection:keep-alive
Date:Tue, 03 Aug 2010 21:39:32 GMT
ETag:"2792c73-b1-48cd0909d96ed"
Expires:Thu, 02 Sep 2010 21:39:32 GMT
Server:Apache/Nginx/Varnish
How do I make it not check the server?
Make sure you have the disable cache checkbox unchecked/disabled in the Developer Tools.
What do your request headers look like?
Chrome will set max-age:0 on the request's Cache-Control header if you press Enter in the location bar. If you visit your page using a hyperlink, it should use the cache, as expected.
Wow! I was facing the same issue for quite some time.
I'll tell you why you were facing this issue. Your headers are just fine. You receive a 304 because of the way you are refreshing the page. There are mainly 3 ways:
Press Enter in the address box. You will observe that Chrome reads the file from the cache first and does not go to the server at all.
Press F5. This revalidates whether the file has become stale (probably this is how you are refreshing).
Press Ctrl+F5. This is an unconditional reload of all static resources.
So basically, you should press the Return key in the address bar. Let me know if this works.
For me, it was self-signed certificate:
https://code.google.com/p/chromium/issues/detail?id=110649
In the above link the Chromium developer marked the bug as WontFix, because the rule is: "Any error with the certificate means the page will not be cached."
Therefore Chrome doesn't cache resources from servers with self-signed certificate.
If you want Chrome to cache your JS/CSS files, the server needs to set a Cache-Control header. It should look like:
Cache-Control: max-age=86400 (if you want to cache resources for a day).
I believe you are looking for
Cache-Control: immutable
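The freshness check behind these suggestions can be sketched with a small, simplified model (the function is illustrative; real caches also honor Expires, no-cache, and heuristic freshness): while a cached copy's age is below max-age, the browser can reuse it without contacting the server at all, so there is no 304 round trip in the first place.

```python
# Sketch: the core freshness decision a browser cache makes when a
# response carried a Cache-Control header. Simplified: ignores
# Expires, no-store, heuristic freshness, etc.

def is_fresh(age_seconds, cache_control):
    """True if the cached response can be reused without revalidation."""
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            return age_seconds < int(directive.split("=", 1)[1])
    return False  # no max-age: fall back to a conditional request (304)

print(is_fresh(3600, "max-age=86400"))   # True: served straight from cache
print(is_fresh(90000, "max-age=86400"))  # False: browser revalidates
```

Adding immutable on top of max-age additionally tells the browser not to revalidate on a normal reload while the copy is still fresh, which is what the answer above is pointing at.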

Debug static file requests from IIS6

How can I debug what is being returned by IIS(6) when the response goes through proxies before getting to the browser?
I have requests for static files which are being sent with the 'Accept-Encoding: gzip' header. These are being gzipped correctly. However, if a 'Via:' header (indicating the response passed through a proxy) is also included, the content is not received gzipped by the browser.
I want to know if the issue is with IIS not applying the compression or related to something the proxy is doing.
How can I investigate this problem?
This is related to IIS6 not doing gzip compression when including Via header in request.
In case anyone else hits this problem, I believe it's due to the "HcNoCompressionForProxies" option, configurable in the metabase. It specifically lets you disable compression for proxied requests.
http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/05f67ae3-fab6-4822-8465-313d4a3a473e.mspx?mfr=true
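The behavior that option produces can be modeled with a short sketch (the function and defaults are illustrative, not IIS's actual logic): with the proxy exclusion enabled, the mere presence of a Via header makes the server skip compression even though the client accepts gzip.

```python
# Sketch of the IIS6 behavior described above: with
# HcNoCompressionForProxies enabled, the server skips compression for
# any request that arrived through a proxy (i.e. carries a Via header).

def should_compress(headers, no_compression_for_proxies=True):
    """Decide whether to gzip the response (simplified model)."""
    accepts_gzip = "gzip" in headers.get("Accept-Encoding", "")
    came_via_proxy = "Via" in headers
    if no_compression_for_proxies and came_via_proxy:
        return False
    return accepts_gzip

# Direct request: compressed.
print(should_compress({"Accept-Encoding": "gzip"}))                      # True
# Same request through a proxy: not compressed.
print(should_compress({"Accept-Encoding": "gzip", "Via": "1.1 proxy"}))  # False
```

Replaying the same request with and without a Via header (e.g. in Fiddler) and comparing the two responses is a quick way to confirm whether this setting, rather than the proxy itself, is stripping the compression.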
If you're still interested, my answer would be to install Fiddler, probably on the client first. For HTTP snooping you can't do much better. That would be my first port of call.
