I just loaded a new system and everything looked fine. Then, all of a sudden, Chrome started displaying certain fonts on certain web pages with what looks like no anti-aliasing. I've had this happen before on another system as well: same thing, everything looked fine and then all of a sudden this started. Any ideas or suggestions? Thanks.
Screenshot of what I'm typically seeing.
http://www.denkers.com/test/font.jpg
First, make sure your text encoding is correctly set to Unicode (UTF-8) by going to the wrench icon -> Tools -> Encoding.
If that doesn't work, go to the wrench -> About Google Chrome and update to the latest version.
Finally, if none of these work, try reinstalling Google Chrome.
Related
I ran into a problem with font rendering on Windows.
I'm used to a little difference in rendering between Mac and Windows, but this just made my mouth fall open. I tested the site thoroughly on Mac and I'm positive it looks just fine in Chrome, Firefox and Safari.
It looks like this on Mac browsers:
On Windows, it looks completely messed up in any browser (I tested Chrome, Firefox and IE):
I know Mac has Iowan Old Style installed by default, so I tried forcing the Mac browsers to use the webfont I generated using FontSquirrel, but that doesn't reproduce the problem on Mac.
Both browsers seem to load the same font (namely the woff version) correctly. Does anybody have any idea what this could be?
I can't post the link to the website because I don't have enough reputation, so please look at the screenshots for the URL.
Thanks guys!
After some more research I found out the original (TTF) font worked perfectly fine on Windows, so it had to be FontSquirrel that caused the problems. I tried out eight different combinations of settings in FontSquirrel and kept having the same issues.
After a while I decided to try a different generator and I came across Fontie: https://fontie.flowyapps.com/home
This actually solved the problem for me!
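For anyone debugging something similar, a quick way to confirm whether the generated web font actually registered is the CSS Font Loading API (assuming a browser that supports it). This is just a diagnostic sketch; the family name below is a placeholder for whatever your @font-face declares:

```typescript
// Diagnostic sketch: list every FontFace the page knows about and its status,
// then check whether the generated family is actually usable for rendering.
// "Iowan Old Style Web" is a placeholder family name.
document.fonts.ready.then(() => {
  document.fonts.forEach((face) => {
    console.log(face.family, face.style, face.weight, face.status); // e.g. "loaded" or "error"
  });

  const usable = document.fonts.check('16px "Iowan Old Style Web"');
  console.log("Generated web font usable:", usable);
});
```

If the status shows "error" in one browser but "loaded" in another, that points at the generated file itself rather than your CSS.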
Generally speaking my font works fine, but on odd occasions, such as switching back to the browser from another application and sometimes when switching back from another tab, the font seems to have unloaded and the fallback font is being used instead.
I'm struggling to replicate this consistently, and when I have seen it there are no errors.
I have also seen this happen in both Chrome and Firefox, on Windows and OS X, and I'm at a bit of a loss as to how it can be happening.
I haven't seen this issue in Firefox personally, but it is a known bug in Chrome.
https://code.google.com/p/chromium/issues/detail?id=336170
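Until that's fixed, a possible nudge (just a sketch, assuming a browser that exposes the CSS Font Loading API's document.fonts; no guarantee it helps in every case) is to force a repaint once the fonts report ready whenever the tab regains visibility:

```typescript
// Sketch of a workaround for the tab-switch glitch described above: once the
// tab becomes visible again, wait for the fonts to report ready and force a
// reflow so the browser repaints with the web font instead of the fallback.
document.addEventListener("visibilitychange", () => {
  if (document.visibilityState !== "visible") return;
  document.fonts.ready.then(() => {
    document.body.style.transform = "translateZ(0)"; // harmless temporary style toggle
    void document.body.offsetHeight;                 // reading layout forces a reflow
    document.body.style.transform = "";
  });
});
```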
This is a really weird problem that appeared in the recent version of Chrome.
I have a huge app that loads hundreds of stylesheets (in dev mode). When the page loads, obviously all styles are applied but background images are missing!
If I just do nothing and wait, suddenly the images start loading randomly...
Using the dev tools I checked the Network tab to see whether the images are requested, but only a few of them appear, in contrast to the previous version of Chrome.
Does anyone know if any kind of optimization has been added in Chrome that makes images load lazily? Obviously that implementation is buggy and does not consider a page with a lot of stylesheets!
This problem does not affect the app in production, where all the stylesheets are packed and reduced to just ~10.
Tested on Linux and Windows 7
I had a similar problem with our web site on Chrome 27.0.1453.93 and 27.0.1453.94. It turns out that Chrome seemed to think that all of our .gif images were corrupt. They wouldn't render in Chrome but they would render fine in IE, Firefox, and older Chrome versions.
I'm not sure what the underlying issue was, but I opened the images in Photoshop and re-saved them and now it works fine.
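If your images aren't actually corrupt but are simply being requested late, another thing worth trying (a sketch only; the URLs below are placeholders for your own critical backgrounds) is preloading them from script so they are already cached when the stylesheets reference them:

```typescript
// Sketch: warm the cache for critical CSS background images so they can paint
// immediately, regardless of when the browser decides to fetch them for CSS.
// The URL list is hypothetical.
const criticalBackgrounds = [
  "/img/header-bg.png",
  "/img/sidebar-bg.gif",
];

for (const url of criticalBackgrounds) {
  const img = new Image();
  img.src = url; // kicks off the request immediately, independent of CSS
}
```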
I'm familiar with the differences in rendering web fonts across browsers and operating systems. A couple of questions though:
I use a web font (WOFF) that looks like crap in Chrome but is OK in FF (on Windows 7). The other day I used my office computer from home via remote desktop. I noticed that the font now looked like crap in FF too; it looked much the same as in Chrome at the office (I didn't test Chrome at home). I know that remote desktop reduces "the graphics" somehow, but not exactly how, and I have no idea how it could affect font rendering. When I came to the office the day after, the rendering in FF was still messed up. I guess the remote desktop session's changes to "the graphics" were still in effect. I checked with Chrome, and now rendering in that browser looked fine, like in FF before?! So I restarted the computer to get back my usual "graphics settings", but that didn't help. Then I cleared the font cache and restarted again. Now I'm back to crappy Chrome rendering and OK FF rendering.
My questions:
What is happening with "the graphics" in general, and with font rendering in particular, when I connect with remote desktop (setting = 32-bit color depth)? My guess is that whatever changes, it gets both FF and Chrome to use another rendering method than before.
How can the effect still be there after rebooting the computer? Is the "rendering result" somehow stored in the font cache, as it seems? That seems odd.
Thanks for any advice.
Chrome cannot render TrueType fonts with correct anti-aliasing at the moment. WOFF fonts are containers for either OpenType or TrueType, in your case probably TrueType, so you get the crappy rendering. You can either serve an SVG font to Chrome (bigger file size) or use a WOFF based on OpenType.
Apart from that, many other factors influence font rendering, such as whether the font is installed locally on your system or whether ClearType is enabled (it is not enabled by default over remote desktop).
See here as well.
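As a rough illustration of the format-switching idea (a sketch only; the family name, file paths, and the deliberately naive Chrome check are placeholders, not a definitive feature test), you could inject an @font-face rule that hands Chrome the SVG source and other browsers the OpenType-based WOFF:

```typescript
// Sketch: serve an SVG font to Chrome and a WOFF (built from OpenType) elsewhere
// by injecting the appropriate @font-face src. All names/paths are hypothetical.
const isChrome = /Chrome\//.test(navigator.userAgent);

const src = isChrome
  ? "url('/fonts/myfont.svg#myfont') format('svg')"
  : "url('/fonts/myfont-otf.woff') format('woff')";

const style = document.createElement("style");
style.textContent = `
  @font-face {
    font-family: "MyWebFont";
    src: ${src};
    font-weight: normal;
    font-style: normal;
  }
`;
document.head.appendChild(style);
```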
I had the same issue and searched but found no answers.
Ultimately in my case it was a combination of remote desktop, server 2012 and browser fonts (Roboto in my case).
It was worst in Chrome, OK in Firefox, and perfect in IE.
The cause was the missing 'Desktop Experience' feature in Server 2012.
To add this feature to server 2012:
Click Start, point to Administrative Tools, and then click Server Manager. In Server Manager, click Features, and then in the Server Manager details pane, under Features Summary, click Add features. In the Features list, select Desktop Experience, and then click Install.
That completely fixed the issue for me in all browsers/fonts, hope it helps someone else out there.
I'm having trouble testing a StageVideo file locally. The HTML file that contains the swf loads perfectly in Chrome and Firefox, but when I open it in Internet Explorer 9 nothing happens (it just shows a white screen).
I've added the permissions for the location to the 'Global Security Settings' tab of the Flash settings manager (on the Macromedia website). I've also checked in the IE9 settings to make sure that it allows GPU rendering.
Has anyone encountered anything similar or have any suggestions as to why it might be blocked in IE9?
Thanks in advance.
My gut feeling is that you don't have the latest Flash Player for IE but do have it for Chrome/FF, since those are two different Flash Player installations. The swf probably doesn't even load because of the Flash Player requirement.
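If you want to confirm that theory, here is a rough sketch of how the installed Flash Player version could be checked in each browser, using the classic ActiveX control in IE and navigator.plugins elsewhere (treat it as an illustration, not a polished detection library):

```typescript
// Sketch: report which Flash Player version the current browser actually has.
// IE exposes Flash as an ActiveX control; Chrome/Firefox list it under navigator.plugins.
function getFlashVersion(): string | null {
  const w = window as any;
  if (w.ActiveXObject) {
    try {
      const flash = new w.ActiveXObject("ShockwaveFlash.ShockwaveFlash");
      return flash.GetVariable("$version"); // e.g. "WIN 11,7,700,169"
    } catch (e) {
      return null; // control not installed or blocked
    }
  }
  const plugin = navigator.plugins.namedItem("Shockwave Flash");
  return plugin ? plugin.description : null;
}

console.log("Flash Player:", getFlashVersion() ?? "not detected");
```

If IE reports an older version (or nothing at all) while Chrome/FF report a current one, that would back up the answer above.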