When using Google PageSpeed tester, I get the following:
The following resources have no character set specified in their HTTP headers. Specifying a character set in HTTP headers can speed up browser rendering.
http://www.ntainc.com/
I have searched everywhere and can't seem to figure out why it isn't "reading" my charset. I have it listed in the HTML document's head:
<meta charset="utf-8">
And in my web.config file:
<?xml version="1.0" encoding="UTF-8"?>
and there is an .htaccess file:
AddDefaultCharset utf-8
I believe our server is running IIS. What am I missing?
The warning is referring to the HTTP Content-Type header. It is being sent by your webserver to the client like this:
Content-Type: text/html
It needs to be sent like this instead:
Content-Type: text/html; charset=utf-8
You need to check your webserver configuration. Your .htaccess addition would add the charset attribute to the header on Apache, but apparently it is not being applied: either you put it in the wrong place, or, if the server really is IIS, it is being ignored entirely (IIS does not read .htaccess files, and AddDefaultCharset is an Apache directive).
The fact that your HTML contains a <meta charset> tag is irrelevant to the warning, as the HTML is just arbitrary data inside the HTTP response body (though it does allow an HTML5-capable client to process your UTF-8 encoded HTML correctly).
Your web.config charset is irrelevant, as it is only interpreted by your webserver, not a client.
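That XML declaration only describes the encoding of the config file itself. However, web.config is also where IIS/ASP.NET response settings live, so if the server is IIS, that is where the charset has to be configured. A minimal sketch, assuming an ASP.NET site on IIS (the <globalization> element covers responses generated by ASP.NET, the <staticContent> mapping covers plain .html files served directly by IIS; adjust or drop whichever part does not apply to your setup):
<configuration>
  <system.web>
    <!-- appends "; charset=utf-8" to the Content-Type of responses generated by ASP.NET -->
    <globalization fileEncoding="utf-8" requestEncoding="utf-8" responseEncoding="utf-8" />
  </system.web>
  <system.webServer>
    <staticContent>
      <!-- for static .html files served directly by IIS -->
      <remove fileExtension=".html" />
      <mimeMap fileExtension=".html" mimeType="text/html; charset=utf-8" />
    </staticContent>
  </system.webServer>
</configuration>
After changing it, re-check with something like curl -I http://www.ntainc.com/ to confirm that charset=utf-8 now shows up in the Content-Type response header.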
Related
I am having an issue adding Font Awesome to my ASP.NET Core MVC (ASP.NET Core 2) application. I am simply trying to add the Font Awesome CSS library to my MVC project. I have tried two approaches.
1) Adding the Font Awesome CDN like so:
<link rel="stylesheet" href="//maxcdn.bootstrapcdn.com/font-awesome/4.3.0/css/font-awesome.min.css">
but when I add the CDN I get the following CSP errors in Chrome:
Refused to load the stylesheet
'http://maxcdn.bootstrapcdn.com/font-awesome/4.3.0/css/font-awesome.min.css'
because it violates the following Content Security Policy directive:
"default-src 'self'". Note that 'style-src' was not explicitly set, so
'default-src' is used as a fallback.
So I tried adding the correct meta tags. I tried MANY combinations and nothing seemed to work. For example,
<meta http-equiv="Content-Security-Policy"
content="script-src 'self' http://maxcdn.bootstrapcdn.com
'unsafe-inline' 'unsafe-eval';
style-src 'self' http://maxcdn.bootstrapcdn.com
'unsafe-inline' 'unsafe-eval'; " />
I was still getting errors related to CSP in Chrome.
2) The second approach I took was to add the Font Awesome CSS file to my project. I did this and then added the corresponding reference like so:
<link rel="stylesheet" href="~/css/font-awesome.min.css">
When I did this I got the following errors despite the file being in the correct location and being referenced correctly:
GET http://localhost:5000/fonts/fontawesome-webfont.woff2?v=4.7.0 net::ERR_ABORTED
GET http://localhost:5000/fonts/fontawesome-webfont.woff?v=4.7.0 net::ERR_ABORTED
GET http://localhost:5000/fonts/fontawesome-webfont.ttf?v=4.7.0 404 (Not Found)
I looked into this issue and found that it could be related to the static file handler. I then modified app.UseStaticFiles() to take an options parameter, like this:
StaticFileOptions staticFileOptions = new StaticFileOptions();
FileExtensionContentTypeProvider typeProvider = new FileExtensionContentTypeProvider();
if (!typeProvider.Mappings.ContainsKey(".woff2"))
{
typeProvider.Mappings.Add(".woff2", "application/font-woff2");
}
if (!typeProvider.Mappings.ContainsKey(".woff"))
{
typeProvider.Mappings.Add(".woff", "application/font-woff");
}
if (!typeProvider.Mappings.ContainsKey(".ttf"))
{
typeProvider.Mappings.Add(".ttf", "application/font-ttf");
}
staticFileOptions.ContentTypeProvider = typeProvider;
app.UseStaticFiles(staticFileOptions);
But I still got the error above.
Does anyone know what I am doing wrong? I can add Font Awesome through its CDN or include the Font Awesome CSS file in my application, whichever works.
The policy quoted in the error message in the question has default-src 'self', but the policy shown in your meta element doesn't. That indicates your document is being served with a policy in a Content-Security-Policy HTTP header in addition to the one in the meta element.
And that header policy is relatively strict: it has default-src 'self' and no style-src, so styles fall back to 'self' only. When multiple policies are in play, CSP requires every resource to satisfy all of them, so the effective policy is always the stricter intersection; a meta policy can only tighten a header policy, never loosen it. In practice, then, your browser is ignoring your looser meta policy and enforcing the policy sent in the HTTP header.
The solution: find the place in the server code that is adding that Content-Security-Policy HTTP header, and either change it there to the exact policy you want, or remove that part of the server code altogether and set the policy using the meta element instead.
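If that header is being added by your own ASP.NET Core pipeline (rather than by a proxy or hosting layer), the fix could look roughly like the sketch below. This is only an illustration: the policy string and the assumption that the header is set in Startup.Configure are mine, so adapt it to wherever your header actually comes from.
// In Startup.Configure, before app.UseStaticFiles() / app.UseMvc():
app.Use(async (context, next) =>
{
    // One single policy, sent as a header, that already allows the CDN;
    // with this in place the <meta> policy is no longer needed.
    context.Response.Headers["Content-Security-Policy"] =
        "default-src 'self'; " +
        "style-src 'self' https://maxcdn.bootstrapcdn.com; " +
        "font-src 'self' https://maxcdn.bootstrapcdn.com";
    await next();
});
Note that the Font Awesome stylesheet also pulls font files from the same CDN, which is why a font-src entry is included in this example as well.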
I am building a web application, and I have to handle international characters (with strings like "J'ai surveillé des élèves à la rôtule"). Some of the data is in arbitrary static text files that live in an arbitrary directory on the file system. Those files are all UTF-8 (thanks, standardization!).
To serve this data, I am using embedded Jetty with the ResourceHandler handler. I don't have any web.xml file. In addition to the static content, I have a bunch of RESTful APIs that are handled through servlets.
The problem is, Jetty's ResourceHandler class doesn't seem to send a charset along with the static file's Content-Type. If I request index.html, the Content-Type is text/html. To correctly handle accented characters, I would like it to be Content-Type: text/html; charset=utf-8.
For files that have a default charset of UTF-8, like text/html or text/css, this is fine, but some text files don't have one and get wrongly interpreted as Windows-1252, so the accented characters come out garbled (I just got "QuÃ©bec Liquor Store" back instead of "Québec Liquor Store"). Is there a way to specify a default character set and tell Jetty to always send it? Something like Apache's AddDefaultCharset utf-8.
Hardcoding everything to UTF-8 is wrong.
How about just specifying the extension-to-MIME-type mapping for those files you want to control?
MimeTypes mimeTypes = resourceHandler.getMimeTypes();
mimeTypes.addMimeMapping("txt", "text/plain; charset=UTF-8");
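In the context of an embedded server, that might look roughly like the following. This is a sketch assuming a Jetty 9-style API (ResourceHandler with getMimeTypes(), as used in the two lines above); the port, resource base and extension list are placeholders:
import org.eclipse.jetty.server.Handler;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.handler.DefaultHandler;
import org.eclipse.jetty.server.handler.HandlerList;
import org.eclipse.jetty.server.handler.ResourceHandler;

public class StaticFileServer {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);

        ResourceHandler resourceHandler = new ResourceHandler();
        resourceHandler.setResourceBase("/path/to/static");  // placeholder path

        // Send an explicit charset for the extensions you care about
        resourceHandler.getMimeTypes().addMimeMapping("txt", "text/plain; charset=UTF-8");
        resourceHandler.getMimeTypes().addMimeMapping("html", "text/html; charset=UTF-8");

        HandlerList handlers = new HandlerList();
        handlers.setHandlers(new Handler[] { resourceHandler, new DefaultHandler() });
        server.setHandler(handlers);

        server.start();
        server.join();
    }
}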
I'm writing a web server in C#, just for the fun of it, and I am able to serve basic text files to my browser. However, when serving up an image (say, image.png), all browsers that I test my server on (IE, Firefox, and Chrome) show some kind of placeholder thumbnail for the image, as if the image is corrupted or invalid.
The response that I am sending to the browser looks like
HTTP/1.0 200 Ok
Content-Type: image/png
Content-Length: 14580053
{image data here}
Am I using the correct HTTP headers? Or, if I am, why else would browsers not accept the image?
Ah, figured it out... my code forgot the blank line that separates the headers from the response body (the header block has to be terminated by an empty line, so the headers should end with \r\n\r\n). It wasn't a problem with the header values at all, just incorrect response syntax.
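For anyone else hitting this: the status line and each header line should end with \r\n, an empty line then separates the headers from the body, and the body must be written as raw bytes, not text. A minimal sketch of what that could look like (stream is a placeholder name for the NetworkStream of the accepted connection):
// using System.IO; using System.Text;
byte[] body = File.ReadAllBytes("image.png");
string headers =
    "HTTP/1.0 200 OK\r\n" +
    "Content-Type: image/png\r\n" +
    "Content-Length: " + body.Length + "\r\n" +
    "\r\n";                                            // blank line terminates the header block
byte[] headerBytes = Encoding.ASCII.GetBytes(headers);
stream.Write(headerBytes, 0, headerBytes.Length);      // headers as ASCII text
stream.Write(body, 0, body.Length);                    // image data as raw bytes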
Why is the W3C validator telling me I have no doctype set for my pages (in particular the home page)? My home page is using the 1column.phtml template, which has a valid doctype (see below), and you can see it when you view the source in the browser. So why is the W3C Markup Validation Service telling me there is no doctype set?!
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
Our development site is showing a valid doctype, and I've even tried taking the 1column.phtml (root) template from there and replacing the live one, and still nothing. I'm at a total loss as to what the issue is.
The first 8 lines of the home page, as it is being sent to the validator, precede the doctype and say:
Strict Standards: Aitoc_Aitpagecache_Mobile_Detect::__construct() [aitoc-aitpagecache-mobile-detect.--construct]: It is not safe to rely on the system's timezone settings. Please use the date.timezone setting, the TZ environment variable or the date_default_timezone_set() function. In case you used any of those methods and you are still getting this warning, you most likely misspelled the timezone identifier. We selected 'America/New_York' for 'EDT/-4.0/DST' instead in /home/goorins/public_html/lib/Aitpagecache/Mobile/Detect.php on line 42
Notice: Undefined index: HTTP_ACCEPT in /home/goorins/public_html/lib/Aitpagecache/Mobile/Detect.php on line 42
Strict Standards: setcookie() [function.setcookie]: It is not safe to rely on the system's timezone settings. Please use the date.timezone setting, the TZ environment variable or the date_default_timezone_set() function. In case you used any of those methods and you are still getting this warning, you most likely misspelled the timezone identifier. We selected 'America/New_York' for 'EDT/-4.0/DST' instead in /home/goorins/public_html/lib/Aitpagecache/Mainpage.php on line 172
Warning: Cannot modify header information - headers already sent by (output started at /home/goorins/public_html/lib/Aitpagecache/Mobile/Detect.php:42) in /home/goorins/public_html/lib/Aitpagecache/Mainpage.php on line 172
This appears to be coming from an AITOC Magento plug-in. It means little to me (PHP/Apache/Magento is not my thing), but it looks like Mobile/Detect.php line 42 assumes that there will be an HTTP Accept header to process. The HTML validator does not send an HTTP Accept header, so an error occurs and is reported at the top of the output page, ahead of the doctype. It may be that because it is reporting that error, it also reports the warnings about the misconfigured timezone settings.
Hard to say for certain without an in-depth debugging session, but my guess is it's the lack of any character encoding being declared in your HTTP response headers.
$ curl -I https://www.goorin.com/
HTTP/1.1 200 OK
Date: Tue, 23 Oct 2012 01:04:50 GMT
Server: LiteSpeed
Connection: close
Set-Cookie: frontend=7dcc17b985ecd8983ff6ade10e0f6f2c; expires=Tue, 23-Oct-2012 02:04:50 GMT; path=/; domain=..www.goorin.com; httponly
Set-Cookie: frontend=7dcc17b985ecd8983ff6ade10e0f6f2c; expires=Tue, 23-Oct-2012 02:04:50 GMT; path=/; domain=..www.goorin.com; httponly
Content-Type: text/html
which somehow causes the character encoding to get munged, and the validator no longer recognizes it.
Try downloading your home page with curl (note: without -I, which would only fetch the headers):
curl https://www.goorin.com/ > home.html
and then using the W3C file upload validation service (the "Validate by File Upload" tab). When I did this, the validator stopped complaining about your DOCTYPE.
So, even if it's not the lack of character encoding in your headers, this points to the problem being the delivery of the HTML document from your server to the validator service.
One of the request parameters in an http request made by the client contains Japanese characters. If I make this request in Firefox and look at the parameter as soon as it reaches the server by debugging in Eclipse, the characters look fine. If I do the same request using IE 8, the characters get garbled when I look at them at the same point in the server code (they are fine in both browsers, though). I have examined the POST requests made by both browsers, and they both pass the same sequence of characters, which is:
%2C%E3%81%9D%E3%81%AE%E4%BB%96
I am therefore thinking that this has to do with the encoding. If I look at the HTTP headers of the request, I notice the following differences. In IE:
Content-Type: application/x-www-form-urlencoded
Accept: */*
In Firefox:
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
I'm thinking that the IE 8 header doesn't state the UTF-8 encoding explicitly, even though it's specified in the meta tag of the HTML document. I am not sure if this is the problem. I would appreciate any help, and please do let me know if you need more information.
Make sure the page that contains the form has UTF-8 as its charset. In IE's case, the best way to ensure this is by sending an HTTP header (Content-Type: text/html; charset=utf-8) and adding a meta http-equiv tag with the content type/charset to your HTML (I've seen this actually matter, even when the appropriate header was sent).
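Spelled out, that combination looks like this (the header is set by whatever server-side stack you use; the tag goes in the page's <head>):
Content-Type: text/html; charset=utf-8
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />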
Second, your form can also specify the content type:
<form enctype="application/x-www-form-urlencoded; charset=utf-8">