I recently tried using gzip compression to improve web UI performance. I configured the Tomcat Connector as below:
compression="on"
compressionMinSize="2048"
noCompressionUserAgents="gozilla, traviata"
compressableMimeType="text/html,text/xml,text/css,text/javascript,text/json,application/x-javascript,application/javascript,application/json"
Below are the request headers; Accept-Encoding is gzip, deflate.
Key Value
Request GET /app/jquery-ui.min.js HTTP/1.1
Accept */*
Referer https://cdduat.app.com/Apptech/
Accept-Language en-US
User-Agent Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; Trident/7.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; InfoPath.3)
Accept-Encoding gzip, deflate
Host cdduat.app.com
Connection Keep-Alive
Cache-Control no-cache
Cookie JSESSIONID=CB793FFEE9A34B5B8E7DE34A17C90B5D; mbox=session#1436174197635-942865#1436176058|PC#1436174197635-942865.28_07#1437383802; s_fid=498342221B10B4ED-297E46742B9393BE; s_vi=[CS]v1|2ACD23BB851D5DBF-40001903C00C9391[CE]; oo_event_entry=41eebf1007f6e19f5b0ee4b5841be2441e970f9c
Looking at the response headers, there is no Content-Encoding value, so I'm not sure whether compression is working. Below are the response headers. The web form load time is still the same, and I'm not sure if I am doing anything wrong here (a quick standalone check is sketched after the headers below).
Server Apache-Coyote/1.1
Accept-Ranges bytes
ETag W/"238326-1435860126000"
Last-Modified Thu, 02 Jul 2015 18:02:06 GMT
Content-Type application/javascript
Content-Length 238326
Date Mon, 13 Jul 2015 22:22:23 GMT
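Since the browser makes it hard to tell whether the server ever compressed the response, a small standalone check can help: request the same file with Accept-Encoding: gzip and look at the Content-Encoding header of the reply. This is only a sketch; the URL is assembled from the request headers above, and automatic decompression is switched off so the header stays visible.
using System;
using System.Net;
using System.Net.Http;

class GzipCheck
{
    static void Main()
    {
        // Keep the body exactly as the server sent it so Content-Encoding is not stripped.
        var handler = new HttpClientHandler { AutomaticDecompression = DecompressionMethods.None };
        using (var client = new HttpClient(handler))
        {
            var request = new HttpRequestMessage(HttpMethod.Get,
                "https://cdduat.app.com/app/jquery-ui.min.js");   // assumed from the request headers above
            request.Headers.TryAddWithoutValidation("Accept-Encoding", "gzip, deflate");

            var response = client.SendAsync(request).Result;
            // An empty Content-Encoding (plus the full Content-Length) means no compression happened.
            Console.WriteLine("Content-Encoding: " + string.Join(", ", response.Content.Headers.ContentEncoding));
            Console.WriteLine("Content-Length:   " + response.Content.Headers.ContentLength);
        }
    }
}
If Content-Encoding comes back empty here, the Connector compression settings are not taking effect on the server side.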
Also, I can see a Cache-Control value of no-cache in the request headers. Does that mean the browser won't cache the JS files and will request those resources again on subsequent requests?
I have a question about Windows authentication with IIS and HttpListener.
I have the following setup (all installed on the same Windows 8.1 box, no outside communication). All requests are sent as http://localhost/......
IIS
ASP.Net web application authentication
Anonymous: Disabled
Windows Authentication: Enabled
.NET HttpListener
Running as a service under the Local System account, with Windows authentication enabled:
this.httpListener = new HttpListener();
this.httpListener.AuthenticationSchemes = AuthenticationSchemes.IntegratedWindowsAuthentication;
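For reference, a minimal runnable sketch of that listener setup looks roughly like this; the prefix and the response body are placeholders, not the actual service code.
using System;
using System.Net;
using System.Text;

class ListenerSketch
{
    static void Main()
    {
        var httpListener = new HttpListener();
        httpListener.Prefixes.Add("http://localhost/appman/");   // hypothetical prefix matching the trace below
        httpListener.AuthenticationSchemes = AuthenticationSchemes.IntegratedWindowsAuthentication;
        httpListener.Start();

        // GetContext blocks until a request arrives; with IntegratedWindowsAuthentication
        // the caller's identity is available here without any credentials dialog.
        HttpListenerContext context = httpListener.GetContext();
        Console.WriteLine("Authenticated as: " + context.User.Identity.Name);

        byte[] body = Encoding.UTF8.GetBytes("<html><body>hello</body></html>");
        context.Response.OutputStream.Write(body, 0, body.Length);
        context.Response.Close();
        httpListener.Stop();
    }
}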
UWP Application (Windows 8.1)
The UWP application is just like a web browser; it has a WebView control to display web content.
The following capabilities are enabled:
Enterprise Authentication
Internet (Client)
Location
Private Networks (Client & Server)
Problem
When I navigate from the UWP app to the IIS web app, it asks for credentials by popping up a Windows dialog. This is annoying from a user-experience perspective because the user is already logged in with the same credentials. But when I access the HttpListener, it authenticates correctly and shows no credentials dialog.
I also checked the requests through Fiddler. The initial request is identical, but in the subsequent steps the IIS request keeps asking for NTLM.
HTTP/1.1 401 Unauthorized
Cache-Control: private
Content-Type: text/html; charset=utf-8
Server: Microsoft-IIS/8.5
WWW-Authenticate: Negotiate oYHOMIHLoAMKAQGhDAYKKwYBBAGCNwICC........
WWW-Authenticate: NTLM
X-Powered-By: ASP.NET
Initial Request/Response
IIS
Request
GET http://localhost/webapp_net/ HTTP/1.1
Accept-Encoding: gzip, deflate
Host: localhost
Connection: Keep-Alive
Response
HTTP/1.1 401 Unauthorized
Cache-Control: private
Content-Type: text/html; charset=utf-8
Server: Microsoft-IIS/8.5
WWW-Authenticate: Negotiate
WWW-Authenticate: NTLM
X-Powered-By: ASP.NET
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: GET,POST
Date: Tue, 20 Nov 2018 21:37:24 GMT
Content-Length: 6016
Proxy-Support: Session-Based-Authentication
HttpListener
Request
GET http://localhost/appman HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Accept-Language: en-NZ
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; WebView/2.0; rv:11.0) like Gecko
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
Host: localhost
Response
HTTP/1.1 401 Unauthorized
Content-Length: 0
Server: Microsoft-HTTPAPI/2.0
WWW-Authenticate: Negotiate
WWW-Authenticate: NTLM
Date: Tue, 20 Nov 2018 21:37:18 GMT
Proxy-Support: Session-Based-Authentication
Does anyone have a similar experience or an explanation for this?
Following on closely from this question, SSRS IE8 JavaScript Error Invalid Character ScriptResource.axd, I have done some debugging and narrowed the issue down to a gzip/deflate problem.
We have various machines with IE8 installed on them. The problem is that some installations of IE don't seem to add Accept-Encoding: gzip, deflate to the HTTP request headers when requesting a JavaScript resource via ScriptResource.axd.
Here is the HTTP request off machine 1 (works fine):
GET http://10.x.x.x6/Reports_2/ScriptResource.axd?d=dz2_T_-skCIGFrM350LrrgpIbuyQ3hv0Po2nyTqnjMC_h2orbb8AW34-wlapNOlKQn3w_65Hv8xicNrMgbLAWsuKLkB24a0JnVTM3AD64R_ELK1K6KpCKGgYkO_evQ1uY6IeQkuEpQDrHclftKpS0G8rnJM1&t=4d63fd9d HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Accept-Language: en-GB
User-Agent: Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)
Accept-Encoding: gzip, deflate
Proxy-Connection: Keep-Alive
Authorization: Negotiate TlRMTVNTUAADAAAAGAAYAJIAAAAYABgAqgAAABgAGABYAAAAEAAQAHAAAAASABIAgAAAABAAEADCAAAAFYKI4gYBsR0AAAAP5M9BpXhDtQyLRxQO0MslBkQARQBOAEIASQBHAEgAUwBIAEkAUgBFAGEAbAB5ADgANgA3ADcANwBEAEMAQwAwADEAOQA4ADgAOAAW1o72sWx0hAAAAAAAAAAAAAAAAAAAAAD8+dJyp0KpjG5sP9WUlmrk4FptdhpYQAEETsImSmR+ZzMapF8Z91Wv
Host: 10.x.x.x6
And here is the same request made off machine 2 (doesn't work, as it's returning gzipped data):
GET http://10.x.x.x6/Reports_2/ScriptResource.axd?d=dz2_T_-skCIGFrM350LrrgpIbuyQ3hv0Po2nyTqnjMC_h2orbb8AW34-wlapNOlKQn3w_65Hv8xicNrMgbLAWsuKLkB24a0JnVTM3AD64R_ELK1K6KpCKGgYkO_evQ1uY6IeQkuEpQDrHclftKpS0G8rnJM1&t=4d63fd9d HTTP/1.1
Accept: image/gif, image/jpeg, image/pjpeg, image/pjpeg, application/x-shockwave-flash, application/x-ms-application, application/x-ms-xbap, application/vnd.ms-xpsdocument, application/xaml+xml, application/vnd.ms-excel, application/vnd.ms-powerpoint, application/msword, */*
Accept-Language: en-gb
User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; .NET4.0C; .NET4.0E; BRI/2)
Authorization: Negotiate TlRMTVNTUAADAAAAGAAYAIIAAAAYABgAmgAAABgAGABIAAAAEAAQAGAAAAASABIAcAAAAAAAAACyAAAABYKIogUBKAoAAAAPRABFAE4AQgBJAEcASABTAEgASQBSAEUAagBvAG4AOQA0ADYAMQA0AEQAQwBDADAAMQAzADUANgA2APyGLo3yOcCnAAAAAAAAAAAAAAAAAAAAABccpJT8TohKqbhq3PzWDPApr1NmEypAPg==
Connection: Keep-Alive
Pragma: no-cache
Host: 10.x.x.x6
The problem seems to be that IE is not requesting gzipped data, but it is actually getting gzipped data from the server (and it then fails because it doesn't expect the data to be gzipped).
If I manually decompress the returned data using zcat or similar, I can view the returned JavaScript fine.
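A rough .NET equivalent of that zcat check, for anyone who wants to reproduce it outside the browser, could look like the sketch below. It deliberately sends no Accept-Encoding header (like machine 2) and then tries to gunzip whatever comes back; if the decompression succeeds, the server really did return gzipped data that was never asked for. The credentials and the (truncated) URL are taken from the traces above.
using System;
using System.IO;
using System.IO.Compression;
using System.Net;

class ZcatCheck
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create(
            "http://10.x.x.x6/Reports_2/ScriptResource.axd?d=...&t=4d63fd9d");   // query string truncated, as above
        request.UseDefaultCredentials = true;   // the server expects Negotiate/NTLM
        // Note: no Accept-Encoding header is added, mirroring the machine 2 request.

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var raw = response.GetResponseStream())
        {
            Console.WriteLine("Content-Encoding: " + response.ContentEncoding);

            // zcat equivalent: if this succeeds, the payload was gzipped.
            using (var gunzip = new GZipStream(raw, CompressionMode.Decompress))
            using (var reader = new StreamReader(gunzip))
            {
                string js = reader.ReadToEnd();
                Console.WriteLine("Decompressed length: " + js.Length);
            }
        }
    }
}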
What would cause IE8 not to add this header to the request?
I have an MVC application that returns a PDF file.
public FileStreamResult GetDocument(int id)
{
    // stream and documentsModel come from the surrounding action (omitted here)
    return File(stream, "application/octet-stream", documentsModel.Name);
}
I have two test servers: one is private and the other is public.
From the private one I can download the document, and I get:
GET /Documents/GetDocument/3576 HTTP/1.1
Accept: */*
Accept-Language: en-GB
User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; BRI/2)
Accept-Encoding: gzip, deflate
Host: appserver
Connection: Keep-Alive
Cookie: ASP.NET_SessionId=vgzn4qkelxdmic3nbaqftsxd; .FidesAuthCookie=BF08E0DCAAA54D7D78AB6BD30D5ECA523C045F9B401B10693B6CE57D7D4C677C0908E24D92511DC75A487D6CAE6DD780AA8B4419A5A5D9258A4985AF6870D3AD1A0B3C01B8A620A1E14FEDDE298CCE255AE4B4C2F76D2635B8C5DF332AF19AAB; dynatree-active=3576; dynatree-focus=; dynatree-expand=496%2C603%2Cfolder_622; dynatree-select=
HTTP/1.1 200 OK
**Cache-Control: private, s-maxage=0**
Content-Type: application/octet-stream
Server: Microsoft-IIS/7.5
Set-Cookie: .FidesAuthCookie=BF08E0DCAAA54D7D78AB6BD30D5ECA523C045F9B401B10693B6CE57D7D4C677C0908E24D92511DC75A487D6CAE6DD780AA8B4419A5A5D9258A4985AF6870D3AD1A0B3C01B8A620A1E14FEDDE298CCE255AE4B4C2F76D2635B8C5DF332AF19AAB; expires=Fri, 13-Apr-2012 13:31:05 GMT; path=/
X-AspNetMvc-Version: 3.0
Content-Disposition: attachment; filename=test.pdf
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Fri, 13 Apr 2012 13:01:04 GMT
Content-Length: 49613
From my public server I get
GET /Documents/GetDocument/97 HTTP/1.1
Accept: */*
Accept-Language: en-GB
User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; BRI/2)
Accept-Encoding: gzip, deflate
Host: beta.qi-care.nl
Connection: Keep-Alive
Cookie: ASP.NET_SessionId=h3utp0bfu4zwhqysntame3we; dynatree-active=97; dynatree-focus=; dynatree-expand=4%2Cfolder_10; dynatree-select=; .FidesAuthCookie=F0DED3D98BF4115C910B0A29EC2C809902B49F15518952DFA78DDB4358B5F0C1A9EDAFB50DD0CA761B433ED68034C2539ABCCDA0C50FF5EEEE3573D3C77E550416CDB24B302C9EB831AC597040E6D255E9B582E8A29D5FC03454F2A0742ECC9DEC61070091F9A66D1C3FC7F9CA10C1B8BB9B5109CB613C98AEE32AFE5A0F8A28
HTTP/1.1 200 OK
**Cache-Control: private, no-cache="Set-Cookie", s-maxage=0**
Content-Type: application/octet-stream
Server: Microsoft-IIS/7.5
Set-Cookie: .FidesAuthCookie=F0DED3D98BF4115C910B0A29EC2C809902B49F15518952DFA78DDB4358B5F0C1A9EDAFB50DD0CA761B433ED68034C2539ABCCDA0C50FF5EEEE3573D3C77E550416CDB24B302C9EB831AC597040E6D255E9B582E8A29D5FC03454F2A0742ECC9DEC61070091F9A66D1C3FC7F9CA10C1B8BB9B5109CB613C98AEE32AFE5A0F8A28; expires=Fri, 13-Apr-2012 13:19:35 GMT; path=/
X-AspNetMvc-Version: 3.0
Content-Disposition: attachment; filename=test.pdf
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Fri, 13 Apr 2012 12:49:35 GMT
Content-Length: 49613
and I get the error described here:
http://support.microsoft.com/kb/323308
For some reason, these two servers give two different responses. I found on Microsoft Support that the client should change the registry:
To resolve this issue in Internet Explorer 7 and in Internet Explorer 8, follow these steps:
Start Registry Editor.
For a per-user setting, locate the following registry key:
HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings
For a per-computer setting, locate the following registry key:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings
On the Edit menu, click Add Value.
To override the directive for HTTPS connections, add the following registry value:
"BypassSSLNoCacheCheck"=Dword:00000001
To override the directive for HTTP connections, add the following registry value:
"BypassHTTPNoCacheCheck"=Dword:00000001
Quit Registry Editor.
(quoted from the Microsoft KB article above)
We faced this problem at work; it turned out to be a bug in Internet Explorer (in our case IE8 and earlier) that gives an error when trying to download a file over SSL (you are on https, right?). The problem is that if the server sends the browser an HTTP header that disables caching, Explorer gives an error. In your case, s-maxage=0 is treated the same as Cache-Control: no-cache.
The server-side solution is to overwrite this header and tell IE8 it may cache the response, for example with Cache-Control: private.
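A sketch of what that could look like in the MVC action from the question (the document lookup and file name are placeholders; the point is setting the cache policy before returning the file):
using System.IO;
using System.Web;
using System.Web.Mvc;

public class DocumentsController : Controller
{
    public FileStreamResult GetDocument(int id)
    {
        Stream stream = LoadDocumentStream(id);   // stands in for the real document lookup

        // Tell IE it may cache the download; this replaces the no-cache style
        // directives that make IE8 fail on attachment downloads over SSL.
        Response.Cache.SetCacheability(HttpCacheability.Private);

        return File(stream, "application/octet-stream", "test.pdf");
    }

    private Stream LoadDocumentStream(int id)
    {
        // Placeholder so the sketch compiles; load the PDF by id here.
        return new MemoryStream();
    }
}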
Be careful: some application servers (such as, in our case, WebSphere Application Server) automatically append no-cache="Set-Cookie" when a cookie is set.
Finally, there is another solution, if applicable, that also solves the problem, but it has to be applied client-side in the browser; look at Method 1 here:
http://support.microsoft.com/kb/2549423
Within Firefox 9 & 10, using Firebug and Live Headers, I am seeing the WebSocket request/response pairs being sent across domains, but with the wrong Cookie: contents.
Given two URLs:
Base web page - http://www.mysite.test/mywebapp
Websocket url - http://stompeserver.mysite.test/stomp
The browser seems to be sending the cookies for the base page's hostname rather than any cookies associated with the secondary hostname, i.e. the JSESSIONID cookie loaded with the base web page is being echoed to the external connection.
Is this a bug or expected behavior? Nowhere have I seen how WebSockets are supposed to react to cookies.
IMO, this could be a really serious security violation, exposing a site's cookies to an external WebSocket service.
I updated to Firefox 10 and still see the issue.
Below is a slightly clarified Live Headers trace of two back-to-back connections.
The JSESSIONID and CLIENT_LOCALE cookies are copied from port 9443 (the app server) to port 61623 (the MQ server).
----------------------------------------------------------
https://myapp.com:9443/server/themes/standard/public/gwt/xxstandard/images/logout-icon.png
GET https://myapp.com:9443/server/themes/standard/public/gwt/xxstandard/images/logout-icon.png HTTP/1.1
Host: myapp.com:9443
User-Agent: Mozilla/5.0 (Windows NT 5.1; rv:10.0.1) Gecko/20100101 Firefox/10.0.1
Accept: image/png,image/*;q=0.8,*/*;q=0.5
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
Referer: https://myapp.com:9443/server/example.htm?gwt.codesvr=127.0.0.1:9997&log_level=INFO
Cookie: JSESSIONID=0000wCOpgfIsSNOz2lL22O5LOiI:-1; CLIENT_LOCALE=en_US;
Pragma: no-cache
Cache-Control: no-cache
HTTP/1.1 200 OK
Date: Thu, 16 Feb 2012 19:02:55 GMT
Content-Type: text/plain
Last-Modified: Wed, 29 Jun 2011 20:44:11 GMT
Content-Length: 669
Content-Language: en-US
Server: WebSphere Application Server/7.0
----------------------------------------------------------
http://myapp.com:61623/stomp
GET http://myapp.com:61623/stomp HTTP/1.1
Host: myapp.com:61623
User-Agent: Mozilla/5.0 (Windows NT 5.1; rv:10.0.1) Gecko/20100101 Firefox/10.0.1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip, deflate
Proxy-Connection: keep-alive
Sec-WebSocket-Version: 8
Sec-WebSocket-Origin: https://myapp.com:9443
Sec-WebSocket-Key: FToA/HGiVQN3CbGOgNffMA==
Cookie: JSESSIONID=0000wCOpgfIsSNOz2lL22O5LOiI:-1; CLIENT_LOCALE=en_US;
Pragma: no-cache
Cache-Control: no-cache
Upgrade: websocket
Connection: Upgrade
HTTP/1.1 101 Switching Protocols
Upgrade: WebSocket
Connection: Upgrade
Sec-WebSocket-Accept: 5lqrLU4mbPiEasSn4gqOlqWvGgw=
----------------------------------------------------------
The same-origin policy and CORS don't apply to WebSockets.
With WS, an "Origin" HTTP header is sent in the initial WS opening handshake, and for browsers this Origin header MUST contain the hostname of the server that originally served the HTML/JS that opens the WS.
The WS server is then free to accept or deny the connection.
With non-browser WS clients, the origin header may or may not be present, and may contain anything.
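To illustrate the accept-or-deny part, here is a small sketch of a handshake-time origin check using a .NET HttpListener-based endpoint; the allowed origin and the prefix are hypothetical, and a real STOMP/MQ server would perform the equivalent check in its own handshake code.
using System;
using System.Net;
using System.Threading.Tasks;

class OriginCheckingWsEndpoint
{
    static async Task Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:61623/stomp/");   // placeholder endpoint
        listener.Start();

        while (true)
        {
            HttpListenerContext context = await listener.GetContextAsync();

            // Browsers send Origin (or Sec-WebSocket-Origin in older handshake versions, as in the trace above).
            string origin = context.Request.Headers["Origin"]
                            ?? context.Request.Headers["Sec-WebSocket-Origin"];

            if (!context.Request.IsWebSocketRequest || origin != "https://myapp.com:9443")
            {
                context.Response.StatusCode = 403;   // deny handshakes from unexpected origins
                context.Response.Close();
                continue;
            }

            var wsContext = await context.AcceptWebSocketAsync(subProtocol: null);
            // ... hand wsContext.WebSocket over to the messaging layer ...
        }
    }
}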
Cookies: their handling is not specified by the WS spec. See the response here from Patrick (a Firefox WS developer):
http://www.ietf.org/mail-archive/web/hybi/current/msg08017.html
I used the config given in this article for page-level caching of PHP content. The problem is that the cached page is saved in gzip format, and that same gzipped content is being returned to the browser.
I need output like "12:15:37 12:15:47" (which is what comes back the first time, when the page is not cached). After that, if the request is re-sent, it returns the raw gzipped bytes (unreadable binary along the lines of ‹…34²26±24à23Œ¸¸…) instead; it is a valid gzip response, since running it through zcat decodes it fine.
Response Headers
Server nginx/0.8.34
Date Wed, 17 Mar 2010 07:04:58 GMT
Content-Type text/html
Last-Modified Wed, 17 Mar 2010 07:04:20 GMT
Transfer-Encoding chunked
Connection keep-alive
Vary Accept-Encoding
Expires Wed, 17 Mar 2010 07:04:58 GMT
Cache-Control max-age=0
Content-Encoding gzip
Request Headers
Host localhost
User-Agent Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.18) Gecko/2010021501
Ubuntu/9.04 (jaunty) Firefox/3.0.18 GTB6
Accept text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language en-us,en;q=0.5
Accept-Encoding gzip,deflate
Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive 300
Connection keep-alive
It's possible you missed adding the gzip_static option, which will serve the gzipped content correctly.
However, I have now posted a new article based on Nginx 0.7+, which is better for use as a proxy cache; it doesn't need that option anymore.
http://www.webtatic.com/blog/2010/04/page-level-caching-with-nginx/