Characteristic Confusion

Published May 22nd, 2008

In the course of building my line-height: normal test page, I settled on defaulting to an unusual but still pervasive font family: Webdings.  The idea was that if you picked a font family in the dropdown and you didn’t have it installed, you’d fall back to Webdings and it would be really obvious that it had happened.
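In CSS terms, the trick boils down to something like this sketch; the .sample selector and the Georgia choice are stand-ins of mine, not the test page’s actual rules:

    /* If the family picked in the dropdown (Georgia here) isn't
       installed, the text falls back to Webdings and turns into
       symbols, making the substitution impossible to miss. */
    .sample {
      font-family: Georgia, Webdings;
      line-height: normal;
    }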

(A screenshot of the symbols expected from Webdings: an ear, a circle with a line through the middle, and a spider.)

Except in Firefox 3b5, there were no dings, web or otherwise.  Instead, some serif-family font (probably my default serif, Times) was being used to display the text “Oy!”.

It’s a beta, I thought with a mental shrug, and moved on.  When I made mention of it in my post on the subject, I did so mainly so I didn’t get sixteen people commenting “No Webdings in Firefox 3 betas!” when I already knew that.

So I didn’t get any of those comments.  Instead, Smokey Ardisson posted that what Firefox 3 was doing with my text was correct.  Even though the declared fallback font was Webdings, I shouldn’t expect to see it being used, because Firefox was doing the proper Unicode thing and finding me a font that had the character glyphs I’d requested.

Wow.  Ignoring a font-family declaration is kosher?  Really?

Well, yes.  It’s been happening ever since the CSS font rules were first implemented.  In fact, it’s the basis of the whole list-of-alternatives syntax for font-family.  You might have thought CSS says browsers should check whether a requested family is available, move on to the next one on the list if it isn’t, and then render the text.  And it does, but it says they should do all of that on a per-character basis.

That is, if you ask for a character and the primary font face doesn’t have it, the browser goes to the next family in your list looking for a substitute.  It keeps doing that until it finds the character you wanted, either in your list of preferred families or else in the browser’s default fonts.  And if the browser just can’t find the needed symbol anywhere at all, you get an empty box or a question mark or some other symbol that means “FAIL” in font-rendering terms.
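To sketch how that per-character walk plays out, with family names invented purely for illustration:

    /* "LatinOnly" is a hypothetical face covering only Latin letters;
       "WideCoverage" is a hypothetical face with much broader coverage. */
    p { font-family: LatinOnly, WideCoverage, serif; }

    <!-- The Latin letters render in LatinOnly, but the Greek letter
         falls through to WideCoverage (or the default serif),
         because LatinOnly has no glyph for it. -->
    <p>half-life: λ</p>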

A commonly-cited case for this is specifying a CJKV character in a page and then trying to display it on a computer that has no CJKV-capable fonts installed.  The same would hold true for any page with any characters that the installed fonts can’t support.  But think about it: if you browse to a page written in, say, Arabic, and your user style sheet says that all elements’ text should be rendered in New Century Schoolbook, what will happen?  If you have fonts that support Arabic text, you’re going to see Arabic, not New Century Schoolbook.  If you don’t, then you’re going to see a whole lot of “I can’t render that” symbols.  (Though I don’t know what font those symbols will be in.  Maybe New Century Schoolbook?  Man, I miss that font.)
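To put that user-style case in code (a sketch; the !important is there because a plain user declaration would lose to the page’s own styles):

    /* user style sheet */
    * { font-family: "New Century Schoolbook" !important; }

Browse to an Arabic-language page with that in effect, and the Arabic text still comes from whatever installed font covers the Arabic script, because New Century Schoolbook has no glyphs to offer for it.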

So: when I built my test, I typed “Oy!” for the example text, and then wrote styles to use Webdings to display that text.  Here’s how I represented that, mentally: the same as if I’d opened up a text editor like, oh, MS Word 5.1a; typed “Oy!”; selected that text; and then dropped down the “Font” menu and picked “Webdings”.

But here’s how Firefox 3 dealt with it: I asked for the character glyphs “O”, “y”, and “!”; I asked for a specific font family to display that text; the requested font family didn’t contain those glyphs or anything like them; so the CSS font substitution rules kicked in and the browser picked those glyphs out of the best alternative.  (In this case, the browser’s default fonts.)
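Reduced to code, the test effectively amounted to this (a sketch, not the test page’s actual markup):

    span.ding { font-family: Webdings; }

    <!-- Webdings is a symbol font with no glyphs mapped to the Latin
         characters "O", "y", and "!", so Firefox 3 pulls them from
         its default fonts instead. -->
    <span class="ding">Oy!</span>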

In other words, Firefox 3 will not show me the ear-Death Star-spider combo unless I put those exact symbols into my document source, or at least Unicode references that call for those symbols.  Because that’s what happens in a Unicode world: you get the glyphs you requested, even if you thought you were requesting something else.
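For what it’s worth, Microsoft’s symbol fonts have traditionally exposed their glyphs in the Unicode Private Use Area, at U+F000 plus the old character code.  Assuming Webdings follows that convention, asking for the symbols themselves would look something like this (the exact codepoints are my assumption, not something I’ve verified):

    <!-- U+F04F, U+F079, U+F021: the presumed PUA slots for the
         keyboard characters "O", "y", and "!". -->
    <span style="font-family: Webdings;">&#xF04F;&#xF079;&#xF021;</span>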

The problem, of course, is that we don’t live in a Unicode world—not yet.  If we did, I wouldn’t keep seeing line noise on every web page where someone wrote their post in Word with smart quotes turned on and then just did a straight copy-and-paste into their CMS.  (Here we have a screenshot of text in which the bullet in ‘Wall•E’ has been mangled into an a-circumflex and a cents sign, visually turning the title into ‘Wallace’.)  Ged knows I would love to be in a Unicode world, or indeed any world where such character-incompatibility idiocy was a thing of the past.  The fact that we still have those problems in 2008 almost smacks of willful malignance on the part of somebody.

Furthermore, in most (but not all) of the text editors I have available for testing, I can type “Oy!” with the font set to Webdings and get the ear, Death Star, and spider symbols.  So mentally, it’s very hard to separate those glyphs from the keyboard characters I type, which makes it very hard to just accept that what Firefox 3 is doing is correct.  Instinctively, it feels flat-out wrong.  I can trace the process intellectually, sure, but that doesn’t mean it has to make sense to me.  I expect a lot of people are going to have similar reactions.

Having gone through all that, it’s worth asking: which is less correct?  Text editors for letting me turn “Oy!” into the ear-Death Star-spider combo, or Firefox for its rigid glyph substitution?  I’m guessing that the answer depends entirely on which side of the Unicode fence you happen to stand.  For those of us who didn’t know there was a fence, there’s a bit of a feeling of a slip-and-fall right smack onto it, and that’s going to hurt no matter who you are.

