I have used this URL for Web API compression, but when I look at the output in Fiddler, the response header does not show the content as zipped. There are multiple compression options available, for example GZIP, BZIP2, and DEFLATE; I am not sure which one to use. Kindly help here.
I have tried the solutions from the link below and neither of them is working:
http://benfoster.io/blog/aspnet-web-api-compression
This list is sent to the server to let it know the client's compression preferences. It means: "I prefer GZIP first. If GZIP is not supported by the server, fall back to DEFLATE compression. If DEFLATE is not supported either, the server will not apply any compression."
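For illustration (the exact value below is an assumption, not taken from your capture), such a preference list travels in the Accept-Encoding request header, optionally with quality values that rank the alternatives:
Accept-Encoding: gzip;q=1.0, deflate;q=0.5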
Someone has already created a NuGet package that uses the implementation you put in your question. The package is named Microsoft.AspNet.WebApi.MessageHandlers.Compression, and it installs the following two packages:
Microsoft.AspNet.WebApi.Extensions.Compression.Server
System.Net.Http.Extensions.Compression.Client
If you don't need the client-side library, then just install the server-side package in your Web API project.
To use it, add the following line at the end of your Application_Start method in Global.asax.cs:
GlobalConfiguration.Configuration.MessageHandlers.Insert(0, new ServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));
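For context, a minimal Global.asax.cs could look like the sketch below. It assumes the standard Web API project template (WebApiConfig.Register), and the using directives follow the package's documentation, so they may differ slightly by version:

using System.Web.Http;
using Microsoft.AspNet.WebApi.Extensions.Compression.Server;
using System.Net.Http.Extensions.Compression.Core.Compressors;

public class WebApiApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        GlobalConfiguration.Configure(WebApiConfig.Register);

        // Compress responses with GZIP when the client accepts it,
        // falling back to DEFLATE otherwise.
        GlobalConfiguration.Configuration.MessageHandlers.Insert(
            0, new ServerCompressionHandler(new GZipCompressor(), new DeflateCompressor()));
    }
}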
To learn more about this package check this link.
How can I find out what part of my Firefox installation modifies my HTTP headers?
Using a tool that displays the headers for the corresponding request, I can see that my headers contain the following string:
Accept-Language: de,ar-SA;q=0.8,en-US;q=0.5,en;q=0.3
I want to find out how ar-SA got in there.
I don't know how you'd go about seeing what changed your HTTP headers specifically, but I do know where you set your preferred languages. In Preferences, go to the Language section:
And then click on "Choose..." to see them:
I added "Dutch [nl]" to mine to see if my headers would change and sure enough they did:
Accept-Language: nl,en-GB;q=0.7,en;q=0.3
I want to set up a new project in the Yii 2.0.6 framework, which I will use for simple REST calls only (making requests and getting responses from the DB).
I have downloaded the framework from the official site (the basic pack from the archive file). Initially I have an empty project that takes up ~24 MB. I'm not sure if this is a good idea, because every time I make a request (from a mobile device), it will probably load all of these 24 MB from the server. Is that how it works?
Is there a way to set up the Yii 2.0.6 project with a minimal size on disk? I really need everything to be optimized and to load as little code as possible.
PHP files are only executed on the server side, where only the required files are loaded and used; if what you are going to build is a REST API, all your mobile device will receive is the JSON output.
To check the size in bytes of the JSON response your mobile device will receive, you can use curl as described in the Yii2 REST docs and pipe the request to grep Content-Length:
curl -i -H "Accept:application/json" "http://YOUR_RESOURCES_URL" | grep Content-Length
Or you can use the network tab of a browser's dev tools instead:
Here the JSON response's size is 392 B and it took 179 ms to receive it from the server.
Also remember that by default Yii is installed with dev environment settings; there are optimizations to apply before publishing the product in order to improve response times. Check this for more details.
It is also good practice to use a tool like gzip to compress data before sending it to mobile clients, as described here: JSON REST Service: Content-Encoding: gzip
Of course not. Yii only loads the required classes (and files) on the fly, thanks to its autoloading mechanism.
A few components are preloaded during the bootstrap phase (and you can add or remove some of them in the configuration file). Other classes won't clutter your memory as long as you don't use them.
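For example, in the basic application template the preloaded components are the ones listed under the bootstrap key of config/web.php, while everything under components is created lazily on first use. A trimmed-down sketch (the component names here are just the template defaults, adjust them to your app):

<?php
// config/web.php (excerpt)
$config = [
    'id' => 'basic',
    'basePath' => dirname(__DIR__),
    // Only what is listed here is preloaded on every request.
    'bootstrap' => ['log'],
    'components' => [
        // Declared here, but only instantiated when first accessed.
        'db' => require __DIR__ . '/db.php',
    ],
];
return $config;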
I have .gz files stored on AWS S3.
Using the S3 REST API, I'm generating authenticated links that point to individual files. I'm also setting the content-header options so that browsers requesting these URLs will decompress the gzipped files and download them as attachments.
The generated s3 url looks like so:
https://MY_BUCKET.s3.amazonaws.com/PATH_TO/file.ext.gz
?AWSAccessKeyId=MY_KEY
&Expires=DATE_TIME
&Signature=MY_SIGNATURE
&response-content-disposition=attachment%3B%20filename%3D%22file.ext%22
&response-content-encoding=gzip
&response-content-type=application%2Foctet-stream
&x-amz-security-token=MY_TOKEN
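(For reference, if you generate these links with Python's boto3 rather than hand-signed REST calls, the same response-header overrides look roughly like this sketch; the bucket and key are placeholders:)

import boto3

s3 = boto3.client("s3")

# Presigned GET URL that overrides the response headers so the browser
# decompresses the object and saves it under its real name.
url = s3.generate_presigned_url(
    "get_object",
    Params={
        "Bucket": "MY_BUCKET",                # placeholder
        "Key": "PATH_TO/file.ext.gz",         # placeholder
        "ResponseContentDisposition": 'attachment; filename="file.ext"',
        "ResponseContentEncoding": "gzip",
        "ResponseContentType": "application/octet-stream",
    },
    ExpiresIn=3600,
)
print(url)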
The links behave as expected (all on OS X) in Chrome (42.0.2311), Safari (8.0.6), and Opera (29.0),
but NOT in Firefox (38.0.1).
Firefox downloads and renames the file correctly but fails to decompress the gzipped file.
The response headers of a GET request to one of the authenticated URLs look like this:
Accept-Ranges:bytes
Content-Disposition:attachment; filename="file.ext"
Content-Encoding:gzip
Content-Length:928
Content-Type:application/octet-stream
Date:SOME_DATE_TIME
ETag:"MY_ETAG"
Last-Modified:SOME_OTHER_DATE_TIME
Server:AmazonS3
x-amz-expiration:expiry-date="ANOTHER_DATE_TIME"
x-amz-id-2:MY_AMZ_ID
x-amz-request-id:MY_AMZ_REQUEST_ID
x-amz-server-side-encryption:AES256
Does Firefox look for different headers and/or header values to indicate decompression?
The solution appears to be removing .gz from the end of the filename.
It's a common misconfiguration to set Content-Encoding: gzip on .gz files when you intend for the end user to download -- and end up with -- a .gz file; e.g. downloading a .tar.gz of a source package.
This isn't what you are doing... It's the opposite, essentially... but I suspect you're seeing a symptom of an attempt to address that issue.
In fact, the configuration I described should only be the case when you gzipped an already-gzipped file (which, of course, you shouldn't do)... but it was entrenched for a long time by (iirc) default Apache web server configurations. Old bug reports seem to suggest that the Firefox developers had a hard time grasping what should be done with Content-Encoding: gzip, particularly with regard to downloads. They were a bit obsessed, it seems, with the thought that the browser should not undo the content encoding when saving to disk, since saving to disk wasn't the same as "rendering" the downloaded content. That, to me, is nonsense, a too-literal interpretation of an RFC.
I suspect what you see is a legacy of that old issue.
Contrary to your conception, it's quite correct to store a file with Content-Encoding: gzip without a .gz extension... arguably, in fact, it's more correct to store such content without a .gz extension, because the .gz implies (at least to Firefox, apparently) that the downloading user should want the compressed content downloaded and saved in the compressed form.
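As an illustration of that approach (a sketch using the AWS CLI; the bucket and paths are placeholders), you can upload the already-gzipped bytes under the uncompressed name and set the metadata at upload time:

# Store the gzipped bytes under the uncompressed file name.
# Content-Encoding tells browsers to transparently gunzip on download.
aws s3 cp file.ext.gz s3://MY_BUCKET/PATH_TO/file.ext \
    --content-encoding gzip \
    --content-type application/octet-stream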
1. Background to compressed content
Michael's changing of the file extension solves the problem because the important step is to change the Content-Type header to reflect the underlying content within the compressed file, rather than that of the compressed file itself.
In many web servers, MIME types are detected based on file extension -- for example, you may have a MIME type of application/gzip corresponding to the .gz file extension (on a default Debian install of nginx, this can be found within /etc/nginx/mime.types). Your server will then set a Content-Type: application/gzip header for files matching this MIME type.
If your browser receives a Content-Type header suggesting binary compressed content is on its way, rather than the text within the compressed file, it will assume the content is not for human consumption and may not display it. Mine (Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:75.0) Gecko/20100101 Firefox/75.0) didn't.
2. Header adjustment
Set a Content-Encoding: 'gzip' header
Set a Content-Type: 'text/plain' header for files you want displayed as plain text
The browser (if gzip compression is supported) should then decompress and display the content for the client.
3. Real world example
/usr/share/doc contains text documentation, much of which has also been gzip compressed.
By adding the following to the nginx server {} block, you can enable transparent decompression on the client:
# local documentation access
location /doc {
    alias /usr/share/doc;
    autoindex on;               # allow dir listings
    allow 127.0.0.1; deny all;  # anyone outside is forbidden

    # display .gz content as text in the browser
    location ~ \.gz {
        default_type text/plain;
        add_header Content-Encoding gzip;
    }
}
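With that in place, you can check the behaviour from the command line (the document path below is just an example; use any .gz file that exists under /usr/share/doc on your machine):

# --compressed makes curl undo the Content-Encoding and print plain text
curl --compressed http://127.0.0.1/doc/nginx/changelog.Debian.gz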
I'm currently working on a proxy server where, in this case, we have to modify the data (using a regexp) that we push through it.
In most cases it works fine, except for websites that use gzip as the content encoding (I think). I've come across a module called compress and tried to push the chunks I receive through a decompress/gunzip stream, but it isn't really turning out as I expected.
I was wondering if I am at all heading in the right direction, and if there are more modules out there to make my life easier (regarding gzip compression).
Greetz,
Benjamin
If your proxy only needs to filter or modify text, handling gzip compression and decompression yourself is needless overhead.
There is a simpler solution.
Modify the HTTP request headers so that the server sends you plain text:
Remove 'Accept-Encoding' from the HTTP request headers.
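A minimal sketch of that idea using Node's built-in http module (the upstream host and the regexp are placeholders):

var http = require('http');

http.createServer(function (clientReq, clientRes) {
  // Copy the incoming headers, but drop Accept-Encoding so the upstream
  // server answers with uncompressed text we can safely run a regexp over.
  var headers = {};
  for (var name in clientReq.headers) headers[name] = clientReq.headers[name];
  delete headers['accept-encoding'];

  var proxyReq = http.request({
    host: 'upstream.example.com', // placeholder upstream host
    port: 80,
    path: clientReq.url,
    method: clientReq.method,
    headers: headers
  }, function (proxyRes) {
    var body = '';
    proxyRes.setEncoding('utf8');
    proxyRes.on('data', function (chunk) { body += chunk; });
    proxyRes.on('end', function () {
      body = body.replace(/foo/g, 'bar'); // placeholder modification
      // The body length changed, so fix up the response headers.
      var resHeaders = proxyRes.headers;
      delete resHeaders['transfer-encoding'];
      resHeaders['content-length'] = Buffer.byteLength(body);
      clientRes.writeHead(proxyRes.statusCode, resHeaders);
      clientRes.end(body);
    });
  });

  clientReq.pipe(proxyReq);
}).listen(8080);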
See here: Node.js proxy, dealing with gzip DEcompression, for an answer that covered most of my problems.
Have a look there: Node.js: Gzip compression?
There is an alternative to using node-compress, but this solution is also mentioned.
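If you do need to keep handling compressed responses rather than stripping Accept-Encoding, newer Node versions ship a built-in zlib module that can gunzip the upstream response before you modify it. A rough sketch of just that part (proxyRes is the upstream response object):

var zlib = require('zlib');

function readBody(proxyRes, callback) {
  // Transparently gunzip when the upstream response is compressed.
  var stream = proxyRes.headers['content-encoding'] === 'gzip'
    ? proxyRes.pipe(zlib.createGunzip())
    : proxyRes;

  var body = '';
  stream.setEncoding('utf8');
  stream.on('data', function (chunk) { body += chunk; });
  stream.on('end', function () { callback(body); });
}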
Cheers,
-stan
How do I configure the content types returned from Mongrel? Specifically, I want it to return some JavaScript files as application/x-javascript, to try to reproduce a bug I am seeing on a remote server.
I don't know if this is exactly the answer you are looking for, but I found this by doing a quick Google search: http://mongrel.rubyforge.org/wiki/HOWTO
It states that you can provide a YAML file with MIME types.
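For example (a sketch only; the exact start-up flag and YAML format may vary by Mongrel version), you could create a small mapping file:

# mime.yaml -- extra extension-to-content-type mappings
---
.js: application/x-javascript

and then point Mongrel at it when starting:

mongrel_rails start -m mime.yaml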