I'm working on a project in Japanese. Part of the page loads via AJAX. Everything on the page renders nice and clean, but the part that loads via AJAX does not pick up the CodePage and CharSet. I'm working with Classic ASP, and I added these headers to the source page:
Response.ContentType = "text/html"
Response.AddHeader "Content-Type", "text/html;charset=UTF-8"
Response.AddHeader "lang", "ja"
Response.CodePage = 65001
Response.CharSet = "UTF-8"
When I add these headers, all the characters loaded from the server are fine, but the local text area is scrambled (A). When I remove the headers, all the local characters are fine and the server-side ones are scrambled (B).
Any idea how I can solve this issue? Is there any way I can do this through AJAX?
I already tried contentType: "application/x-www-form-urlencoded;charset=UTF-8", but it doesn't seem to work.
Thanks in advance.
Do you set the same encoding and charset on both pages? Is the problem consistent across all browsers?
I also think it's recommended to set the charset inside the <head> tag as well:
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
When I post a link to my article on Facebook, Facebook loads part of the site so I can see some text from the article, but I get "poniedziaÅek" where it should be "poniedziałek"; it just doesn't show Polish characters. I have set up
<meta content="text/html; charset=utf-8" http-equiv="content-type">
but nothing changed; I still get some weird characters.
Try experimenting with the Unicode alias setting in the Joomla global configuration.
Also take a look at these:
Problem with facebook and polish characters in links
Non ASCII-7 characters in URL (article alias)
Unicode urls
I'm having trouble displaying Latin-1 characters such as "ç", "ã" or "À" in the latest versions of Safari and Opera. I receive data (JSON) from a RoR backend using Ajax and jQuery (Latin-1 charset), and the webpage itself relies on Latin-1, thanks to:
<?php header('Content-Type: text/html; charset=ISO-8859-1');?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml"
xmlns:og="http://ogp.me/ns#"
xmlns:fb="http://www.facebook.com/2008/fbml"
lang="pt">
<head>
<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>
The custom JavaScript lib I made also specifically states ISO-8859-1 when I perform the include some ten lines later on:
<script type="text/javascript" src="js/lib.js" charset="ISO-8859-1"></script>
Nevertheless, both browsers fail to display the characters afterwards. Safari shows the infamous black diamond, while Opera simply shows a blank space.
Any ideas? Thanks in advance.
Most likely the wrong charset is being sent in the Content-Type HTTP header for the JSON data. In your post you show the headers and META tags for the page itself and the included SCRIPT, but assuming the JSON data is sent separately, it will be labelled separately. It would help to have a link to a page with this problem, but if you don't want to post one, you can use a tool like the Microsoft Fiddler HTTP debugger to inspect the headers being sent back and forth between the browser and the web site. If the web server sends
Content-type: text/html;charset=UTF-8
for a file whose content is actually Latin-1 (iso-8859-1), or vice versa, that's your problem. Fix the HTTP header and you'll be fine.
I've got some markup that I'm adding to a page component in Day CQ that was UTF-8 encoded by the author. Initially I couldn't save it in CRXDE, because the editor was set to save in ISO-8859-1. I found the setting to change this, but now when the page using this component is rendered in the browser, some of the characters appear to be using a different encoding. Is there a setting for the CQ web server or servlet engine that I need to change? I'm running CQ 5.3 on Windows 7.
Edit: The HTTP Headers have Content-Type: text/html;charset=UTF-8 and there is a meta tag that specifies meta http-equiv="Content-type" content="text/html; charset=utf-8"
I believe the solution was to add pageEncoding="UTF-8" to all JSPs that are part of rendering this page. I also modified the web.xml file per this link: http://www.coderanch.com/t/87264/Tomcat/Character-Encoding-Tomcat, and restarted the server a number of times.
I am trying to display Japanese characters on my page. The page works in all browsers except IE6. I noticed some sites, e.g. http://translation.babylon.com/english/to-japanese/, display Japanese characters as boxes.
The header I am using in the page is
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
and UTF-8 encoding.
Could you please help me find out what the issue is?
Thank you
Usually the content developer has to write the right meta tag for correct character decoding, like this:
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
If there is no meta tag in the content, the browser has to decode the page using its own auto-detection. But auto-detection is not perfect: sometimes it works, sometimes it doesn't.
I am working on a website, and when I try to validate the page I get the following error:
The character encoding specified in the HTTP header (iso-8859-1) is different from the value in the <meta> element (utf-8). I will use the value from the HTTP header (iso-8859-1) for this validation.
Here is the code in my header:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-type" content="text/html;charset=UTF-8"/>
I don't see where the iso-8859-1 is coming from. Any suggestions?
Thanks.
It's the web server that specifies the encoding in the HTTP header, and it sets it to iso-8859-1. But in your page, you wrote:
<meta http-equiv="Content-type" content="text/html;charset=UTF-8"/>
These two values are incompatible. I can only suppose that the web server is right (it sent the data, after all), and the validator makes the same assumption.
If you want to send UTF-8 encoded files, check that the content really is UTF-8 encoded, and check the header information. Ultimately the behavior depends on the web server configuration and how the page is generated.
That's the header of your HTML file, not the HTTP headers the server is sending. The meta element defines equivalents to HTTP headers. If both an HTTP header is sent and a meta element exists with the equivalent, the user agent must decide which to use. It might work in your browser, but it seems the validator you are using gives precedence to the actual HTTP header.
So you have to figure out how to make your server send the correct Content-Type header. If your page is generated by a PHP script, you can call header('Content-Type: text/html; charset=UTF-8'); at the beginning of your script to fix it.
Check the default HTTP headers that are sent (you can see them in Firebug's Net tab, if you use it).
There is probably a Content-Type header set to iso-8859-1.
HTTP headers are different from the HTML header (which is part of the body of the HTTP message), where your META tag specifies UTF-8 as the content type.
Since the two values are incompatible, you are getting an error.
Solution:
Make both content types identical (either UTF-8 or iso-8859-1).