Posts in the CSS Category

JavaScript Will Save Us All

Published 16 years, 7 months past

A while back, I woke up one morning thinking, John Resig’s got some great CSS3 support in jQuery but it’s all forced into JS statements.  I should ask him if he could set things up like Dean Edwards’ IE7 script so that the JS scans the author’s CSS, finds the advanced selectors, does any necessary backend juggling, and makes CSS3 selector support Transparently Just Work.  And then he could put that back into jQuery.

And then, after breakfast, I fired up my feed reader and saw Simon Willison’s link to John Resig’s nascent Sizzle project.

I swear to Ged this is how it happened.

Personally, I can’t wait for Sizzle to be finished, because I’m absolutely going to use it and recommend its use far and wide.  As far as I’m concerned, though, it’s a first step into a larger world.

Think about it: most of the browser development work these days seems to be going into JavaScript performance.  Those engines are being overhauled and souped up and tuned and re-tuned to the point that performance is improving by orders of magnitude.  Scanning the DOM tree and doing things to it, which used to be slow and difficult, is becoming lightning-fast and easy.

So why not write JS to implement multiple background-image support in all browsers?  All that’s needed is to scan the CSS, find instances of multiple-image backgrounds, and then dynamically add divs, one per extra background image, to get the intended effect.
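
Here’s a rough sketch of what I have in mind.  It isn’t working library code, just the shape of the technique: hand the job of finding elements to a selector engine (I’m assuming a Sizzle-like function here), give the element itself the bottom image, and nest one wrapper div per extra image inside it so that the first image in the list ends up on top, the way the CSS3 drafts describe.

  function fakeMultipleBackgrounds(selector, imageURLs) {
    var elements = Sizzle(selector);   // hypothetical: any selector engine will do
    for (var i = 0; i < elements.length; i++) {
      var el = elements[i];
      // bottom layer: the element itself gets the last image in the list
      el.style.background = 'url(' + imageURLs[imageURLs.length - 1] + ')';
      // nest one wrapper per remaining image, moving the content inward,
      // so the first image winds up painted closest to the content
      var parent = el;
      for (var j = imageURLs.length - 2; j >= 0; j--) {
        var wrapper = document.createElement('div');
        wrapper.style.background = 'url(' + imageURLs[j] + ')';
        while (parent.firstChild) {
          wrapper.appendChild(parent.firstChild);
        }
        parent.appendChild(wrapper);
        parent = wrapper;
      }
    }
  }

A real version would also have to pull the multiple-image declarations out of the raw stylesheet text, since a browser that doesn’t understand them will have already thrown them away, and it would need to cope with repeat, position, padding, and all the rest.  But that’s the basic move: one div per extra background image.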

Just like that, you’ve used the browser’s JS to extend its CSS support.  This approach advances standards support in browsers from the ground up, instead of waiting for the browser teams to do it for us.

I suspect that not quite everything in CSS3 will be amenable to this approach, but you might be surprised.  Seems to me that you could do background sizing with some div-and-positioning tricks, and text-shadow could be supportable using a sIFR-like technique, though line breaks would be a bear to handle.  RGBa and HSLa colors could be simulated with creative element reworking and opacity, and HSL itself could be (mostly?) supported in IE with HSL-to-RGB calculations.  And so on.
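
To give a sense of how simple some of that could be, here’s a quick sketch of the HSL-to-RGB math, essentially the algorithm from the CSS3 color module; a support script would run this and then write out a plain rgb() value the browser already understands.  The function name is mine, not something from any actual library.

  function hslToRgb(h, s, l) {
    h = (((h % 360) + 360) % 360) / 360;   // normalize the hue to 0..1
    s /= 100;
    l /= 100;
    function channel(m1, m2, hue) {
      if (hue < 0) hue += 1;
      if (hue > 1) hue -= 1;
      if (hue * 6 < 1) return m1 + (m2 - m1) * hue * 6;
      if (hue * 2 < 1) return m2;
      if (hue * 3 < 2) return m1 + (m2 - m1) * (2 / 3 - hue) * 6;
      return m1;
    }
    var m2 = (l <= 0.5) ? l * (s + 1) : l + s - l * s;
    var m1 = l * 2 - m2;
    return [
      Math.round(channel(m1, m2, h + 1 / 3) * 255),
      Math.round(channel(m1, m2, h) * 255),
      Math.round(channel(m1, m2, h - 1 / 3) * 255)
    ];
  }

  // hslToRgb(120, 100, 50) comes back [0, 255, 0]: pure green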

There are two primary benefits here.  The first is obvious: we can stop waiting around for browser makers to give us what we want, thanks to their efforts on JS engines, and start using the advanced CSS we’ve been hearing about for years.  The second is that the process of finding out which parts of the spec work in the real world, and which fall down, will be greatly accelerated.  If it turns out nobody uses (say) background-clip, even given its availability via a CSS/JS library, then that’s worth knowing.

What I wonder is whether the W3C could be convinced that two JavaScript libraries supporting a given CSS module would constitute “interoperable implementations”, and thus allow the specification to move forward on the process track.  Or heck, what about considering a single library getting consistent support in two or more browsers as interoperable?  There’s a chance here to jump-start the entire process, front to back.

It is true that browsers without JavaScript will not get the advanced CSS effects, but older browsers don’t get our current CSS, and we use it anyway.  (Still older browsers don’t understand any CSS at all.)  It’s the same problem we’ve always faced, and everyone will face it differently.

We don’t have to restrict this to CSS, either.  As I showed with my href-anywhere demo, it’s possible to extend markup using JS.  (No, not without breaking validation: you’d need a custom DTD for that.  Hmmm.)  So it would be possible to use JS to, say, add audio and video support to currently-available browsers, and even older browsers.  All you’d have to do is convert the HTML5 element into HTML4 elements, dynamically writing out the needed attributes and so forth.  It might not be a perfect 1:1 translation, but it would likely be serviceable—and would tear down some of the highest barriers to adoption.
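
Here’s a crude sketch of the video half of that idea, just to show its shape.  The object/embed details are illustrative guesses on my part; a real script would sniff the video format and the installed plugins before deciding what to write out.

  function downgradeVideoElements() {
    var videos = document.getElementsByTagName('video');
    // walk backward, since replacing elements shrinks the live collection
    for (var i = videos.length - 1; i >= 0; i--) {
      var video = videos[i];
      var src = video.getAttribute('src');
      if (!src) continue;
      var obj = document.createElement('object');
      obj.setAttribute('data', src);
      obj.setAttribute('type', 'video/quicktime');     // illustrative only
      obj.setAttribute('width', video.getAttribute('width') || '320');
      obj.setAttribute('height', video.getAttribute('height') || '240');
      var embed = document.createElement('embed');     // for still older browsers
      embed.setAttribute('src', src);
      embed.setAttribute('type', 'video/quicktime');   // illustrative only
      obj.appendChild(embed);
      video.parentNode.replaceChild(obj, video);
    }
  }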

There’s more to consider, as well: the ability to create our very own “standards”.  Maybe you’ve always wanted a text-shake property, which jiggles the letters up and down randomly to look like the element got shaken up a bit.  Call it -myCSS-text-shake or something else with a proper “vendor” prefix—we’re all vendors now, baby!—and go to town.  Who knows?  If a property or markup element or attribute suddenly takes off like wildfire, it might well make it into a specification.  After all, the HTML 5 Working Group is now explicitly set up to prefer things which are implemented over things that are not.  Perhaps the CSS Working Group would move in a similar direction, given a world where we were all experimenting with our own ideas and seeing the best ideas gain widespread adoption.
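
Just to make that concrete, here’s a toy version of the imaginary text-shake effect; the property name and everything else about it are made up.  A real library would dig the -myCSS-text-shake declarations out of the raw stylesheet text, since browsers discard unknown declarations before they ever reach the CSSOM, but for this sketch I’m just passing in a selector and, once again, assuming a Sizzle-like function.

  function textShake(selector, maxOffset) {
    var elements = Sizzle(selector);   // hypothetical: any selector engine will do
    for (var i = 0; i < elements.length; i++) {
      var el = elements[i];
      if (!el.firstChild || el.firstChild.nodeType != 3) continue;  // simple text only
      var text = el.firstChild.nodeValue;
      el.removeChild(el.firstChild);
      // wrap each character in a span nudged up or down by a random amount
      for (var j = 0; j < text.length; j++) {
        var jiggled = document.createElement('span');
        jiggled.style.position = 'relative';
        jiggled.style.top = ((Math.random() * 2 - 1) * maxOffset) + 'px';
        jiggled.appendChild(document.createTextNode(text.charAt(j)));
        el.appendChild(jiggled);
      }
    }
  }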

In the end, as I said in Chicago last week, the triumph of standards (specifically, the DOM standard) will permit us to push standards support forward now, and save some standards that are currently dying on the vine.  All we have to do now is start pushing.  Sizzle is a start.  Who will take the next step, and the step after that?


Caught In The Camera Eye

Published 17 years, 5 days past

Just when you thought the whole embedded-video thing couldn’t get any worse, here I come with videos featuring, well, me.

The most recent is a short clip from one of my presentations at An Event Apart back in April, debug / reboot, where I comment at my usual pace on the suppression of quotation marks in my reset styles and why I think relying on browser-generated quotation marks is a bad idea.  You also get to see my hair before it got to be the length it is now, which is even longer.  There’s a complete transcription on that page, by the way, courtesy Mr. Z.

Then there’s the vaguely silly one, in which I attempt to debug my clothing while sitting in my living room.  The main takeaway here, I think, is that my speech patterns on stage are just about the same as those in “regular life”.  Pity my family.

So there’s me in the movies.  It’s nowhere near as epic-ly mëtäl as some other folks’ videos, but I suppose we all do what we can.


Characteristic Confusion

Published 17 years, 3 weeks past

In the course of building my line-height: normal test page, I settled on defaulting to an unusual but still pervasive font family: Webdings.  The idea was that if you picked a font family in the dropdown and you didn’t have it installed, you’d fall back to Webdings and it would be really obvious that it had happened.

(A screenshot of the symbols expected from Webdings: an ear, a circle with a line through the middle, and a spider.)

Except in Firefox 3b5, there were no dings, web or otherwise.  Instead, some serif-family font (probably my default serif, Times) was being used to display the text “Oy!”.

It’s a beta, I thought with a mental shrug, and moved on.  When I made mention of it in my post on the subject, I did so mainly so I didn’t get sixteen people commenting “No Webdings in Firefox 3 betas!” when I already knew that.

So I didn’t get any of those comments.  Instead, Smokey Ardisson posted that what Firefox 3 was doing with my text was correct.  Even though the declared fallback font was Webdings, I shouldn’t expect to see it being used, because Firefox was doing the proper Unicode thing and finding me a font that had the character glyphs I’d requested.

Wow.  Ignoring a font-family declaration is kosher?  Really?

Well, yes.  It’s been happening ever since the CSS font rules were first implemented.  In fact, it’s the basis of the whole list-of-alternatives syntax for font-family.  You might’ve thought that CSS says browsers should look to see if a requested family is available, and if not, look at the next one on the list, and then go render the text.  And it does say that, but it also says they should do it on a per-character basis.

That is, if you ask for a character and the primary font face doesn’t have it, the browser goes to the next family in your list looking for a substitute.  It keeps doing that until it finds the character you wanted, either in your list of preferred families or else in the browser’s default fonts.  And if the browser just can’t find the needed symbol anywhere at all, you get an empty box or a question mark or some other symbol that means “FAIL” in font-rendering terms.
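
If it helps, here’s that decision process written out as a little function.  This isn’t browser internals, obviously; hasGlyph is a stand-in for however the font system actually checks a face for a given character’s glyph.

  function pickFaceFor(character, authorFamilies, browserDefaults, hasGlyph) {
    var candidates = authorFamilies.concat(browserDefaults);
    for (var i = 0; i < candidates.length; i++) {
      if (hasGlyph(candidates[i], character)) {
        return candidates[i];   // the first face that can draw the character wins
      }
    }
    return null;   // nothing anywhere: render the "FAIL" symbol instead
  }

Run that once per character, and you get exactly the behavior described above: a single run of text can end up drawn from several different faces.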

A commonly-cited case for this is specifying a CJKV character in a page and then trying to display it on a computer that doesn’t have non-Romance language fonts installed.  The same would hold true for any page with any characters that the installed fonts can’t support.  But think about it: if you browse to a page written in, say, Arabic, and your user style sheet says that all elements’ text should be rendered in New Century Schoolbook, what will happen?  If you have fonts that support Arabic text, you’re going to see Arabic, not New Century Schoolbook.  If you don’t, then you’re going to see a whole lot of “I can’t render that” symbols.  (Though I don’t know what font those symbols will be in.  Maybe New Century Schoolbook?  Man, I miss that font.)

So: when I built my test, I typed “Oy!” for the example text, and then wrote styles to use Webdings to display that text.  Here’s how I represented that, mentally: the same as if I’d opened up a text editor like, oh, MS Word 5.1a; typed “Oy!”; selected that text; and then dropped down the “Font” menu and picked “Webdings”.

But here’s how Firefox 3 dealt with it: I asked for the character glyphs “O”, “y”, and “!”; I asked for a specific font family to display that text; the requested font family doesn’t contain those glyphs or anything like them; the CSS font substitution rules kicked in and the browser picked those glyphs out of the best alternative.  (In this case, the browser’s default fonts.)

In other words, Firefox 3 will not show me the ear-Death Star-spider combo unless I put those exact symbols into my document source, or at least Unicode references that call for those symbols.  Because that’s what happens in a Unicode world: you get the glyphs you requested, even if you thought you were requesting something else.

The problem, of course, is that we don’t live in a Unicode world—not yet.  If we did, I wouldn’t keep seeing line noise on every web page where someone wrote their post in Word with smart quotes turned on and then just did a straight copy-and-paste into their CMS.  (Here we have a screenshot of text where a bullet symbol has been mangled into an a-circumflex and a cent sign, thus visually turning 'Wall-E' into 'Wallace'.)  Ged knows I would love to be in a Unicode world, or indeed any world where such character-incompatibility idiocy was a thing of the past.  The fact that we still have those problems in 2008 almost smacks of willful malignance on the part of somebody.

Furthermore, in most (but not all) of the text editors I have available and tested, I can type “Oy!” with the font set to Webdings and get the ear, Death Star, and spider symbols.  So mentally, it’s very hard to separate those glyphs from the keyboard characters I type, which makes it very hard to just accept that what Firefox 3 is doing is correct.  Instinctively, it feels flat-out wrong.  I can trace the process intellectually, sure, but that doesn’t mean it has to make sense to me.  I expect a lot of people are going to have similar reactions.

Having gone through all that, it’s worth asking: which is less correct?  Text editors for letting me turn “Oy!” into the ear-Death Star-spider combo, or Firefox for its rigid glyph substitution?  I’m guessing that the answer depends entirely on which side of the Unicode fence you happen to stand.  For those of us who didn’t know there was a fence, there’s a bit of a feeling of a slip-and-fall right smack onto it, and that’s going to hurt no matter who you are.


line-height: abnormal

Published 17 years, 1 month past

When I first wrote Cascading Style Sheets: The Definitive Guide, the part that caused me the most difficulty and headaches was the line layout material.  Several times I was sure I had it all figured out and accurately described, only to find out I was wrong.  For two weeks I corresponded with Ian Hickson and David Baron, arguing for my understanding of things and having them show me, in merciless detail, how I was wrong.  I doubt that I will ever stop owing them for their dedication to getting me through the wilderness of my own misunderstandings.

Later on, I produced a terse description of line layout which went through a protracted vetting process with the CSS Working Group and the members of www-style.  At the time it was published, there was no more detailed and accurate description of line layout available.  Even at that, corrections trickled in over the years, which made me think of it as my own tiny little The Art of Computer Programming.  Only without the small monetary reward for finding errors.

The point here is that line layout is very difficult to truly understand—even given everything I just said, I’m still not convinced that I do—and that there are often surprises lurking for anyone who goes looking into the far corners of how it happens.  As I’ve said before, my knowledge of what goes into the layout of lines of text imparts a sense of astonishment that any page can be successfully displayed in less than the projected age of the universe.

Why bring all this up?  Because I went and poked line-height: normal with a stick, and found it to be both squamous and rugose.  As with all driven to such madness, I now seek, grinning wildly, to infect others.

Here’s the punchline: the effects of declaring line-height: normal not only vary from browser to browser, which I had expected—in fact, quantifying those differences was the whole point—but they also vary from one font face to another, and can also vary within a given face.

I did not expect that.  At least, not consciously.

My work, let me show it to you: a JavaScript-driven test file where you can pick from a list of fonts and see what happens at a variety of sizes.  (Yes, the JS is completely obtrusive; and yes, the JS is the square of amateur hour.  Let’s move on, please.  I’m perfectly happy to replace what’s there with unobtrusive and sharper JS, as long as the basic point of the page, which is testing line-height: normal, is not compromised.  Again, moving on.)

When you first go to the test, you should (I hope) see a bunch of rulered boxes containing text using the very common font face Webdings, set at a bunch of different font sizes.  The table shows you how tall the simple line boxes are at each size, and therefore the numeric equivalent for line-height: normal at those sizes.  So if a line box is using font-size: 50px and the line box is 55 pixels tall, the numeric equivalent for line-height: normal is 1.1 (55 divided by 50).
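
For the curious, the measurement itself is nothing fancy.  This is roughly the sort of thing the test page does, though this particular function is a simplified rewrite for illustration rather than the page’s actual code.

  function derivedNormal(fontFamily, fontSize) {
    var probe = document.createElement('div');
    probe.style.fontFamily = fontFamily;
    probe.style.fontSize = fontSize + 'px';
    probe.style.lineHeight = 'normal';
    probe.style.padding = '0';
    probe.style.border = 'none';
    probe.style.position = 'absolute';    // keep it out of the layout flow
    probe.style.visibility = 'hidden';
    probe.appendChild(document.createTextNode('Oy!'));
    document.body.appendChild(probe);
    var ratio = probe.offsetHeight / fontSize;   // line box height over font-size
    document.body.removeChild(probe);
    return ratio;
  }

  // with a 55-pixel-tall line box at font-size: 50px, this comes back 1.1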

On my PowerBook, Webdings always yields a 1:1 ratio between the font-size and line box height.  The ten-pixel font size yields a ten-pixel-tall line box, and so on.

This is actually a little surprising by itself.  The CSS 2.1 specification says:

normal
Tells user agents to set the used value to a “reasonable” value based on the font of the element. The value has the same meaning as <number>. We recommend a used value for ‘normal’ between 1.0 to 1.2. The computed value is ‘normal’.

This is basically what CSS has said since its first days (see the equivalent text in CSS1 or in CSS2 for confirmation) and there’s always been a widespread assumption that, since 1.0 is probably too crowded, something around 1.2 is much more likely.

So finding a value of 1 was a surprise.  It was an even bigger surprise to me that this held true in Camino 1.5.2, Firefox 2.0.0.14, and Safari 2.0.4, all on OS X.  Firefox 3b5 didn’t render Webdings at all, so I don’t know if it would do the same.  I actually suspect not, for reasons best left for another time (and, possibly, a final release of Firefox 3).

Various browsers doing the same thing in an under-specified area of the spec?  That can’t be right.  It’s pretty much an article of faith that given the chance to do anything differently, browsers will.  The sailing was so unexpectedly smooth that I immediately assumed a storm lurked just over the horizon.

Well, I was right.  All I had to do was start picking other font faces.

To start, I picked the next font on the list, Times New Roman, and the equivalent values for normal immediately changed.  In other words, the numeric equivalents for Times New Roman are different than those for Webdings.  The browsers weren’t maintaining a specific value for normal, but were altering it on a per-face basis.

Now, this is legal, given the way normal is under-specified.  There’s room to allow for this behavior.  It’s actually, once you think about it, a fairly good thing from a visual point of view: the best default line height for Times New Roman is probably not the best default line height for Courier New.  So while I was initially surprised, I got over it quickly.  The seemingly obvious conclusion was that browsers were actually respecting the fonts’ built-in metrics.  This was reinforced when I found that the results were exactly the same from browser to browser.

Then I looked more closely at the numbers, and confusion set back in.  For Times New Roman, I was getting values of 1.1, 1.12, 1.16, 1.15, 1.149, and 1.1499.  If you were to round all of those numbers to two decimal places, you’d get 1.10, 1.12, 1.16, 1.15, 1.15, 1.15.  If you round them all to one decimal place, you’d get 1.1, 1.1, 1.2, 1.2, 1.1, 1.1.  They’re inconsistent.

But wait, I thought, I’m trying to compare numbers I derived by dividing pixels by pixels.  Let’s turn it around.  If I multiply the most precise measurement I’ve gotten by the various font sizes, I get… carry the two… 11.499, 28.7475, 57.495, 114.99, 1149.9, 11499.  As compared to the actual values I got, which were 11, 28, 58, 115, 1149, and 11499.

Which means the results were inappropriately rounded up in some cases and down in others.  28.7475 became 28 and 1149.9 became 1149, whereas 57.495 became 58.  Even though 11.499 became 11 and 114.99 became 115.

This was consistent across all the browsers I was testing.  So again, I was suspecting the fonts themselves.

And then I switched from Times New Roman to just plain old Times, and the storm was full upon me.  I’ll give you the results in a table.

Derived normal equivalents for Times in OS X browsers
font-size   Camino 1.5.2   Firefox 2.0.0.14   Safari 2.0.4
10          1              1.2                1.3
25          1              1                  1.16
50          1              1                  1.18
100         1              1                  1.15
1000        1              1                  1.15
10000       1              1                  1.15

Much the same happened when comparing Courier New with plain old Courier: full consistency on Courier New between browsers, albeit with the same strange (non-)rounding effects as seen with Times New Roman; but inconsistency between browsers on plain Courier—with Camino yielding a flat 1 down the line, Firefox going from 1.2 to 1, and Safari having a range of values above the others’ values.

Squamous!  Not to mention rugose!

Now it’s time for the stunning conclusion that derives from all this information, which is: not here.  Sorry.  So far all I have are observations.  I may turn all this into a summary page which shows the results for all the font faces across multiple browsers and platforms, but first I’ll need to get those numbers.

I do have a few speculations, though:

  1. Firefox’s inconsistency within font faces (see Times and Courier, above) may come from face substitution.  That’s when a browser doesn’t have a given character in a given face, so it looks for a substitute in another face.  If Firefox thinks it doesn’t have 10-pixel Times, it might substitute 10-pixel something else serif-ish, and that face has different line height characteristics than Times.  I don’t know what that other face might be, since it’s not Times New Roman or Georgia, but this is one possibility.  It is not the minimum font size setting in the preferences, as I’ve triple-checked to make sure I have that set to “None”.

  2. Another possibility for Firefox’s line height weirdness is a shift from subpixel font rendering to pixelly font rendering.  10-pixel text in Firefox is distinctly pixelly compared to the other browsers I tested, while sizes above there are nice and smooth.  Why this would drive up the line height by two pixels (20%), though, is not clear to me.

  3. Much of what I’ve observed will likely be laid to rest at the doorsteps of the font faces themselves.  I’d like to know how it is that the rounding behaviors are so (mathematically) messed up within faces, though.  Perhaps ideal line heights are described as an equation rather than a simple ratio?

Again, this was all done in OS X; I’ll be very interested to find out what happens on Windows, Linux, and other operating systems.  Side note for the Mac Opera fans warming up their flamethrowers: I’ve left Opera 9.27 for OS X out of this because it seems to cap font sizes at a size well below 1000, although this limit varied from one face to another.  Webdings and Courier capped at 507 pixels, whereas Courier New capped at 574 pixels and Comic Sans MS stopped at 707 pixels.  I have no explanation, though doubtless someone will, but the upshot is that direct comparisons between Opera and the other browsers are impossible.  For sizes up to 100 pixels, the results were exactly consistent with Camino, if that means anything.

The one tentative conclusion I did reach is this: line-height: normal is a jumbled terrain of inconsistent behaviors, and it’s best avoided in any sort of precision layout work.  I’d already had that feeling, but at least now there’s some evidence to back up the feeling.

In any case, I doubt this is the last I’ll have to say on this particular topic.

Update 7 May 08: I’ve updated the test page with a fix from Ben Lowery so that it works in IE.  Thanks, Ben!  Now all I need is to add a way to type in any arbitrary font-family’s name, and we’ll have something everyone can use.  (Or else a way to use JavaScript to suck up the names of all the fonts installed on a machine and put them into the dropdown.  That would be cool, too.)


Crafting Ourselves

Published 17 years, 1 month past

My referrers lit up recently due to Jonathan Snook’s article about CSS resets and how he doesn’t use them.  To Jonathan and all the doubters and nay-sayers out there, I have only one thing to say:

Good for you.

Seriously; no sarcasm or passive-aggressiveness intended.  If I thought my reset styles, or really anything I’ve ever published or advocated, was a be-all end-all ultimate solution for every designer and design that’s ever been and could ever be, I’d be long past due for six rounds on the receiving end of a clue-by-four.

Reset styles clearly work for a lot of people, whether as-is or in a modified form.  As I say on the reset page, those styles aren’t supposed to be left alone by anyone.  They’re a starting point.  If a thousand people took them and created a thousand different personalized style sheets, that would be right on the money.  But there’s also nothing wrong with taking them and writing your own overrides.  If that works for you, then awesome.

For others, reset styles are more of an impediment.  That’s only to be expected; we all work in different ways.  The key here, and the reason I made the approving comment above, is that you evaluate various tools by thinking about how they relate to the ways you do what you do—and then choose what tools to use, and how, and when.  That’s the mark of someone who thinks seriously about their craft and strives to do it better.

I’m not saying that craftsmen/craftswomen are those people who reject the use of common tools, of course.  I’m saying that they use the tools that fit them best and modify (or create) tools to best fit them, applying their skills and knowledge of their craft to make those decisions.  It’s much the same in the world of programming.  You can’t identify a code craftsman by whether or not they use this framework or that language.  You can identify them by how they decide which framework or language to use, or not use, in a given situation.

Craftsmanship is something I’ve been thinking about quite a bit recently, as has Joshua Porter.  I delivered a keynote address on that very topic just a few days ago in Minneapolis, and my thinking infuses both of the talks I’m giving next week at An Event Apart New Orleans.  I’ve started looking harder for evidence of it, both in myself and in what I see online, and I believe striving toward being a craftsman/craftswoman is an important process for anyone who chooses to work in this field.

Because this isn’t a field of straightforward answers and universal solutions.  We are often faced with problems that have multiple solutions, none of them perfect.  To understand what makes each solution imperfect and to know which of them is the best choice in the situation—that’s knowing your craft.  That’s being a craftsman/craftswoman.  It’s a never-ending process that is all the more critical precisely because it is never-ending.

So it’s no surprise that we, as a community, keep building and sharing solutions to problems we encounter.  Discussions about the merits of those solutions in various situations are also no surprise.  Indeed, they’re exactly the opposite: the surest and, to me, most hopeful sign that web design/development continues to mature as a profession, a discipline, and a craft.  It’s evidence that we continue to challenge ourselves and each other to advance our skills, to keep learning better and better how to do what we love so much.

I wouldn’t have it any other way.


Notacon: Not to be Missed

Published 17 years, 2 months past

In just under a couple of weeks, the fifth annual NOTACON will be held right here in beautiful Cleveland, Ohio.  You’re going, I know; you’re super-über-cool like that, and you don’t need to be reminded of your coolness.  But I’d like to mention the show here for posterity, so that our descendants will know just how completely they missed out.

Notacon straddles, like a Colossus built entirely out of recycled motherboards, backtech chips, and loops of soldering wire, the middle ground between regular conferences and BarCamps (though Notacon predates BarCamp by a couple of years).  It’s not free to attend, but it is very inexpensive.  What it lacks in slick advertising and corporate sponsors, it makes up ten times over in raw, unfiltered geekiness and fascinating material.  This is the kind of event where presenters will hold forth on the depths of digital security, the physics of wireless networking, homebrew chip architecture, the coolness of HyperCard, online society dynamics, and more.  There’s a running contest called Anything but Ethernet, where you get bonus points for having one of the links in your network architecture incorporate barbed wire.

Yeah.  It’s like that.

The speakers will be as wildly diverse as the audience.  The lead engineer for the C64 Direct-to-TV (a C64 in a joystick!); the man behind The Daily WTF; some of the folks putting out 2600 magazine; the woman behind CrochetMe.com; and many more.  I’ll be there as well, talking about the bleeding edge of CSS and web design, ripping apart some recent projects of mine at top speed while discussing where I think we’ll be in three years.  Plus Drew Curtis of FARK fame will be back, as he always is, this year sponsoring a FARK party.  The mind fairly boggles.  Boggles!

As you’re no doubt gathering by now, it’s hard to describe Notacon in a quick, concise summary—and that’s a big part of what makes it so awesome.  For my contemporaries: see you there!  To you future historians: okay, you missed out, but drop everything right now to find out when the next one is and I’ll see you there!


CSS Tools: Reset and Diagnostics

Published 17 years, 4 months past

I’ve hinted and teased and promised, and I’ve yet to make good on any of it.  I’m sorry.  Can I make it up to you?

Okay, then, here you go: a permanent home for my reset styles.  It takes up residence in a new “CSS” subsection of the Toolbox section of the site, along with my efforts to create a generic set of diagnostic styles.  In the case of the resets, I’ll increment the version number and date whenever I make changes, and probably maintain an archive of previous versions.  Not that I expect that to happen with any frequency, but you never know.

The diagnostics are a lot more experimental and thus aren’t so formal as to have things like version numbers.  If I change anything of specific interest there, I’ll try to write a post about it.  I should get around to writing about the shown-in-page stylesheet one of these days, though I figure anyone can view source and work out the particulars without too much effort.
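
In the meantime, to give a flavor of what I mean by diagnostic styles, here are a few rules of the general kind.  These particular declarations are written off the top of my head for illustration, so consult the actual file rather than treating them as its contents; the point is simply to make markup problems impossible to overlook while you work.

  img:not([alt]) { outline: 5px solid red; }           /* images missing alt text */
  img[alt=""] { outline: 3px dotted red; }             /* images with empty alt text */
  a[href=""], a:not([href]) { background: fuchsia; }   /* links that go nowhere */
  th:empty, td:empty { background: yellow; }           /* empty table cells */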


Non-Quotidian Problems

Published 17 years, 4 months past

After I published the latest iteration of the reset styles, Paul Chaplin pointed out that my simplification of the quote-suppressing rules actually broke the intended effect in Safari 2, Gecko variants, and so on.  This happened because I assumed support for quotes: none, and it just isn’t there in most browsers.  Apparently, I was testing IN THE FUTURE! that day.

“Well, no problem,” I thought to myself, “I’ll use content: none instead”.  Nope.  Even in browsers that support generated content, support for the content value none appears to have fallen through the cracks.  Using it completely fails to suppress the generation of content, so far as my testing can determine.  Even more amusingly, content: normal prevents the insertion of quotation marks in Camino (and probably other Geckos), but not Safari.

So we’re back to explicitly forcing the assignment of empty content boxes in order to stop the insertion of quote marks: content: '';.  Oh, joy.  Paul had come to the same conclusion, and worked out a nice little fallback set that reminds me of the cursor trick that gets you a hand-pointing icon in both IE and the rest of the world, only his trick is completely valid.  So I’ll be adding that in, along with some thanks to Paul.
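
For reference, the fallback looks something like this (check the reset itself for the final wording): the empty string does the suppressing in browsers that ignore none, and none takes over in any browser that honors it, since the later declaration wins.

  blockquote, q { quotes: none; }
  blockquote:before, blockquote:after,
  q:before, q:after {
    content: '';
    content: none;
  }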

For my next magic trick, maybe I’ll base a reset rule on :nth-child() and see what people invent to simulate the intended effect.

