Posts in the Standards Category

The Constants Gardener

Published 19 years, 2 months past

This news is a little musty, but Shaun Inman updated CSS-SSC recently.  If you’re using CSS-SSC, you should definitely go grab the update.

“Hey, what’s CSS-SSC?” you exclaim?  Oh, I’m sorry.  It stands for Cascading Style Sheets Server-Side Constants.  Here’s Shaun’s initial example:

@server constants {
    linkColor: #AB6666;
    linkColorHover: #710101;
    }
a { color: linkColor; }
a:hover { color: linkColorHover; }

In other words, you can define your own constants in CSS.  This works because CSS-SSC is a preprocessor—it processes the style sheet before it’s sent to the browser, and turns it into something the browser can handle.  (Put another way, what arrives at the browser is a regular style sheet, with none of the ‘SSC’ information.)  Shaun offers more details in an earlier post.  CSS-SSC requires you to have PHP hanging about, and also to edit some stuff on your server, like .htaccess files.  You’ll also have to be careful about how you name your constants: use the constant name color, for example, and your CSS is going to go to a particularly mangled form of textual hell.
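To make the preprocessing idea concrete, here’s a rough sketch, in JavaScript rather than Shaun’s actual PHP, of what a constants preprocessor has to do: find the @server constants block, strip it out, and substitute each constant’s value wherever its name appears in the rest of the style sheet.  The function name and the naive word-boundary substitution are mine, not CSS-SSC’s:

```javascript
// Sketch of a CSS constants preprocessor (illustrative only; CSS-SSC
// itself is a PHP script and handles more cases than this does).
function expandConstants(css) {
  // Find the @server constants { ... } block.
  const match = css.match(/@server\s+constants\s*\{([^}]*)\}/);
  if (!match) return css;

  // Parse "name: value;" pairs out of the block.
  const constants = {};
  for (const decl of match[1].split(';')) {
    const [name, value] = decl.split(':').map(s => s.trim());
    if (name && value) constants[name] = value;
  }

  // Strip the block, then replace each constant name with its value.
  let output = css.replace(match[0], '');
  for (const [name, value] of Object.entries(constants)) {
    output = output.replace(new RegExp('\\b' + name + '\\b', 'g'), value);
  }
  return output;
}
```

A global substitution like this also explains the naming hazard: define a constant called color, and the replacement step will cheerfully rewrite every color property name in the style sheet.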

Personally, I’m both enthused and annoyed by CSS-SSC.  I think it’s a great solution: definitely one of the best, lightest-weight, easiest approaches to adding preprocessing to CSS.  I’m seriously considering putting it to use on ALA, where I jumped through a few grouping hoops in order to get the fonts and colors just the way Jason wanted them.  Dropping back to constants would make life a bit easier—and would also simplify the whole “per-issue coloration” feature.  (Which I already have working, but via a large number of hoops, several of them on fire.)

I’m annoyed because it bothers me that Shaun had to create CSS-SSC in the first place.  There have been occasional requests for constants in CSS.  They get shot down every time.  “Use a preprocessor!” is the cry, and at first glance, CSS-SSC would seem to give credence to that response.  From my point of view, however, CSS should have had constants long ago, and what Shaun has done is proof.

The refusal to add constants as a feature of CSS has always struck me as highly pointless.  Over the past decade, many people have expressed a need for CSS constants in a number of fora, and it’s a good bet many more have had the need without publicly expressing it.  Adding it to CSS would have done little to increase complexity on the implementor’s side; Shaun’s one-page PHP script (a good deal less when you remove the comments) proves that.  Adding it to CSS would have meant authors could just do it, without having to install anything else first.  Shaun’s made installation about as easy as it gets, but it’s still three or four steps more than should exist—and, for some authors, three or four impossible steps, due to their hosting situation.  And if you aren’t running a local web server, then you can’t test your CSS-SSC-enhanced styles locally; they’ll have to go to a web server first.

Because CSS still lacks, and will apparently continue to lack, a way to define your own constants, I’m really glad Shaun has devised this low-threshold solution.  I just wish that it hadn’t been necessary for him to do so in the first place.


Increasing the Strength of Ajax

Published 19 years, 5 months past

There’s been some comment recently about how Ajax programming requires a different approach to UI and user notification.  What Jeff Veen wrote about Designing for the Subtlety of Ajax, and Alex Bosworth’s post on the top 10 Ajax Mistakes, are just two examples.

I pretty much agree with both pieces.  I’ve missed updates more than once on Ajax pages, just because I’m too used to how pages usually work.  I’ll click on something and then my attention will, out of habit, instantly go elsewhere—another window, another application, another computer, whatever—and keep subconscious track of what was happening in the window where I clicked, monitoring it in my peripheral vision for the flicker of a page reload.  Eventually there will be a little tickle in the back of my brain that says, “Hey, didn’t that site ever do anything?”  When I finally look straight at it, I realize that it did something quite a while ago, probably a split second after my mental focus moved away.  Instead of being efficient, I was wasting time waiting for a refresh that never came.

One might think it’s time for an “Ajax enabled” badge on pages so we know “better pay attention, ’cause this ain’t your father’s Web page”.  I don’t think that’s the way to go, however.  I think what’s needed is a more mature HCI design sense.  Web design has long relied on the page-update refresh to tell the user something has happened; this was such a part of the Web’s fabric that designing around it was almost unconscious.  There hasn’t been a need for sophisticated HCI considerations… until now.

In other words, Web design is going to need to grow up, and become more HCI-oriented than it has been.  The usability of a Web site will become as much about how you let the user know they’ve done something as it is about getting them to the thing they want to do.  In addition to getting the page to look inviting and present the information well, it will be necessary to obsess over the small details, implement highlights and animations and pointers—not to wow the user, but to help them.

In this endeavor, it’s worth remembering that there is a very large and long-standing body of research on HCI.  For years, many HCI experts have complained that the Web design field is making all sorts of errors that could be avoided if we’d just pay attention to what they’re telling us—a criticism which was not totally inaccurate.  Some Web design experts shot back that the Web was a different medium than the sorts of things HCI people studied, and anyway, the Web was not an application—and that rejoinder was also not totally inaccurate.  But with Ajax, the Web-application dichotomy is disappearing.  The retort is becoming less accurate, and the criticism more accurate.

I don’t claim to know what should be done.  The simplest update notification would be to set the visibility of the body element to hidden for half a second, and then back to visible, thus visually simulating a page refresh.  Crude, but it would play directly to users’ expectations.  The fading yellow highlight in Basecamp gets a lot of attention (and imitators), and that’s a good way to go too.  We could envision tossing a red outline onto something that changed, or animating a target-reticle effect on the updated content, or any number of other ideas.  Again I say: the decades of work done in HCI research are a resource we should not ignore.
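As an illustration of the fading-highlight idea, here’s a minimal sketch of a “yellow fade” in plain JavaScript and the DOM.  The function names, step count, and timing are all invented for the example; Basecamp’s actual implementation may well differ:

```javascript
// Compute the sequence of background colors for a yellow-to-white fade.
function yellowFadeColors(steps) {
  const colors = [];
  for (let i = 0; i <= steps; i++) {
    // Interpolate the blue channel from 0 (pure yellow) to 255 (white).
    const blue = Math.round((255 * i) / steps);
    colors.push(`rgb(255, 255, ${blue})`);
  }
  return colors;
}

// Apply the fade to an updated element, one color per timer tick.
function yellowFade(el, steps = 10, interval = 50) {
  const colors = yellowFadeColors(steps);
  let i = 0;
  const timer = setInterval(() => {
    el.style.backgroundColor = colors[i++];
    if (i >= colors.length) clearInterval(timer);
  }, interval);
}
```

After an Ajax update succeeds, a call like yellowFade(document.getElementById('status')) flashes the changed region yellow and fades it back, giving the user’s peripheral vision something to catch.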

From my perspective, there are at least two good things in the Ajax world.  First is that the need for understanding and using CSS, XHTML, and the DOM has never been greater.  Okay, it’s a slightly selfish thing, but it leads directly into the second good thing: that the need for standards support has never been more critical.  If a browser wants to play in the Ajax space—if it wants to be a serious platform for delivering applications—then it’s going to have to get along with the others.  It’s going to have to support the standards.


Localized Content

Published 19 years, 8 months past

There’s the World Tour, which is exciting and all that, but then there’s the fun of hometown gigs.  It just so happens that I have two coming up in the near future.  Here they are, along with some brief information on what I’ll be talking about.

NOTACON 2

Conference dates: 8-10 April 2005.  Both my talks will be the morning of Saturday, 9 April.

  • The Construction of S5; or, How I Learned to Stop Worrying and Love the DOM — a look at where S5 started, why it grew, how its growth was accomplished, what kinds of design decisions had to be made, and where it’s headed in the future.
  • Humanely Wielding a Clue By Four: Reflections on Managing a Massive Mailing List — a look back at three-plus years of managing css-discuss, this will be an unvarnished peek into the life of a list administrator who actually cares about the list.

Northeast Ohio Chapter of the Usability Professionals’ Association (NEOUPA)

Meeting date: Tuesday, 24 May 2005.

  • Rapid UI Prototyping and Easy A/B Usability Testing with CSS — an updated version of the short talk Molly Holzschlag and I gave last year at User Interface 9, this will look at ways to use CSS and standards-oriented design to quickly develop UI designs and to do comparison testing of different designs.

Neither event is free, but then, neither event is terribly expensive, either.  So come on down!  They should be a peck o’ fun.

I’m also speaking at the NASA Glenn Research Center this afternoon, but that’s an internal gathering, not a public event, which is why I didn’t mention it before.  I haven’t been there since the late 1980s, so it’ll be interesting to see if anything’s really different.


That Acid Buzz

Published 19 years, 8 months past

Just a few days after Chris Wilson’s post to IEblog, Håkon Wium Lie, CTO of Opera and one of the originators of CSS, published a column on news.com titled “The Acid2 Challenge to Microsoft”, outlining the intent to create “a test page… that will actively use features Web designers crave, such as fixed positioning of elements”.  As indicated in his article and confirmed via the Buzz, the Web Standards Project is a partner with Håkon in the development of this new “test suite”, as it’s termed on the WaSP Buzz.

I don’t know about you, but as I read the article, several red flags went up and alarm bells rang in my head.

First off, the Acid2 challenge to Microsoft?  Why only Microsoft?  An acid test worthy of its name would expose bugs in every browser on the market today.  The original test did exactly that, and helped change the face of the Web.  In fact, if you’re still using IE/Mac, the first browser to actually get the Acid test correct, you can see it in action.  Type about:tasman into the address bar, and there it is, with modified text.

Second, the original Acid test (which I haven’t linked to because it seems to no longer be available on the Web) was part of a larger and more constructive effort.  At the time, Acid test author Todd Fahrner was (as was I) a member of the WaSP’s CSS Action Committee.  If that name doesn’t sound familiar, you might be more familiar with the CSS Samurai.  One of the things the CSS AC did was to produce reports on the top ten failings of CSS support in various browsers.  We didn’t just say, “Browser X should be better”.  We wrote up what should be better, and why, and pointed to test cases illustrating the problems.  The Acid test was justifiably famous, but it was in the end one test among many.

And those tests were tough for all browsers, not just one browser or one company’s browsers.  We weren’t partisan snipers, despite what many claimed.  We worked to point the way toward better behavior on the part of all browsers by focusing on the problems specific to each browser.

I am no longer a member of the WaSP.  When the first incarnation of the organization went into dormancy, the CSS AC went along with it.  Although the new WaSP has asked me to join a few times, I have resisted for various reasons—personal, professional, and perceptual.  I was also asked if I wanted to contribute to the Acid2 effort as an independent, and declined that as well.  So in many ways, this post is the epitome of something I find distasteful: a person who has had every chance to make contributions, and instead criticizes.  In my defense, I can say only that while I may have refused these invitations, it is not out of antagonism to the basic goal of the WaSP.  I have every reason to want the WaSP to succeed in its goal of advancing the cause of standards on the Web.

But this Microsoft-centric campaign has me concerned, and ever so slightly appalled.  The creation of a tough CSS test suite is a fantastic undertaking, something that is to be applauded and is probably long overdue.  But to cast it as an effort being undertaken as a challenge to Microsoft not only starts it off on the wrong foot, it has the potential to taint not just the Acid2 effort, but the entire organization.


Exploring Better Standards Support

Published 19 years, 8 months past

While I was preparing for SXSW, Chris Wilson—and there’s a name that takes me back a few years—posted an entry on IEblog about standards.  I’m not going to excerpt any of it here because most of you have already read it.  For the rest of you, go read it.  As long as you don’t continue into the comments, it won’t take very long.

First off, let me say that I’ve known Chris for many years, and we get along great together.  I have a lot of respect for him, and I firmly believe the feeling is mutual.  He did incredible work in the very early years of CSS, and while some of that work may seem lacking when viewed in light of later implementations, it was that all-important first step on the journey of a thousand miles.  If I ever make it to Seattle with a couple of days to spare, he’s right near the top of a pretty short list of people I’d do my utmost to find time to see while I was there.  (I just added another person to that list a couple of days ago, actually, but that’s a story for another time.)

With a paragraph like that, you probably think I’m going to tear into him now.  Nope.

I’m posting my thoughts on this for three reasons.

  1. Chris was nice enough to name-check meyerweb as a site that’s helped “[harvest] the collective knowledge of the web development community” with regard to standards.  If nominations were being taken, I’d point to the css-discuss wiki before I would meyerweb, but nevertheless I’d like to think I’ve earned a place on that list—and I’m glad that Chris thinks so too.
  2. Some of my writing from the post “Unbreaking the Web” was quoted in a comment by Thomas Tallyce.
  3. The 800-pound gorilla is stirring.  It’s hard not to share a few observations.

So as Chris points out, the IE team faces an enormous challenge.  This is compounded by the enormous loss of IE developers over the past few years.  Think about it.  Would you work on a project that was the legal and administrative equivalent of a toxic cloud?  Internet Explorer is the focal point of dozens of lawsuits, antitrust litigation, and more.  It’s a project straitjacketed by its own success (however rightly or wrongly that success was achieved).  I don’t have any direct knowledge of this, but the IE team has probably become the Mary Celeste of Microsoft, a doomed wanderer of the bureaucratic seas, staffed by a few trapped souls and the subject of whispered tales of horror among the engineers.

( “And there… dangling from the door handle… was a scripting hook!” )

Despite this recent legacy of pariahship, it would seem that resources are gathering behind Explorer, and not just on the security front.  Chris says, and I have no reason to doubt him, that plans are afoot to add standards support to Explorer.  My concern is over the fate of those plans, because the best-laid plans… you know.  No matter how much the engineering team might want something, if their irresistible geek force encounters an immovable administrative object, well, my money’s on the object.  The only hope is to interpret the object as damage and route around it, which is usually a lot harder to accomplish in a bureaucracy than it is in a network topology.

Chris’ post makes it very clear that backwards compatibility will not be sacrificed, at least in quirks mode.  I wrote some thoughts along those lines in “Unbreaking the Web”, so I won’t repeat them here.  In summary: improving standards support will not break the Web.  It won’t even break the vast majority of sites, and any sites that do break will get sorted out in short order.  With a public beta, those problems could be identified and solved well before the browser went final.  Backwards compatibility is no longer a reasonable excuse for avoiding standards support.

And then there are the resource limitations.  It’s hard to think of anything Microsoft does as lacking in resources, but just as there are hungry people in America, there are starving programs within Microsoft.  I believe that, for some time now, the IE project has been living on a sub-subsistence diet.  It will probably be hard to attract people to help feed it.  The staffing requirements for regression testing alone would be daunting.  I don’t envy the IE managers their task—all the more so because no matter what they do, it won’t be enough for some people.  They’re going to get slammed.  Their only real choice is in trying to pick the things for which they’ll be slammed less.  If improving standards support in IE isn’t a corporate (or divisional) priority, they’re in for a world of hurt.  Which is why I sincerely hope it’s a priority.

But neither is that a complete excuse.  Working for a firm like Microsoft means taking on massive challenges, doing more than you thought possible with less than you should have available, pulling long hours and pounding your head against a wall in order to do the apparently impossible.  That’s part of the job description, and being there is pretty much a matter of choice.  I say this isn’t a complete excuse because, obviously, any given team can only accomplish so much.  It just isn’t a “get out of jail free” card.  If you’re going to tell us that standards are important and that support will be improved, the improvement has to be notable.  There has to be evidence that a lot of work went into adding a lot of useful things, and fixing a lot of old problems.  Again, this is because the promise was freely made, not because it’s what the Web Gods demand.

We all, and by that I mean “us Web designers and developers”, need to stay involved in this conversation.  It’s easy to post a few thoughts, assume that they’ve been ignored, and let things drift.  It’s also easy to assume that the entire IE team read your ideas and immediately agreed to every single last one of them because they’re so blindingly obviously critical, and then get completely enraged when they don’t show up in the final product.  I for one plan to keep an eye on this situation, and to think about ways I could help the IE team.

Because if I truly care about standards—and I do—then I owe the IE team as much as I’ve given to the teams working on Firefox, Safari, Opera, and all the rest.  We all do.  Whatever we would have done for the least of these our browsers in the name of advancing standards support, we owe the Explorer team no less.

Chris did ask for specific requests, so here are my top ten CSS requests in priority order:

  1. Support all selectors—including CSS3 selectors, which I believe are stable enough to be implemented
  2. Clean up positioning and add fixed positioning
  3. Clean up floating/clearing
  4. min-/max-width/height (got that?)
  5. Fix problems with inline layout, especially the handling of top and bottom padding and margins on inline elements
  6. Arbitrary-element hover
  7. Focus styling for form elements
  8. Better printing support, including better page-break control and page orientation
  9. Support CSS table styling, including the table-centered display values
  10. Support generated content

…plus the unranked but still very important “fix bugs! fix bugs! fix bugs!”.
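To make a few of those requests concrete, here are the sorts of rules authors wanted to be able to count on.  The selectors and properties come from the CSS specifications; the element IDs and values are invented for illustration:

```css
/* #2: fixed positioning */
#sidebar { position: fixed; top: 0; left: 0; }

/* #4: min-/max-width (and -height) */
#content { min-width: 20em; max-width: 60em; }

/* #6: hover on arbitrary elements, not just links */
tr:hover { background: #FFC; }

/* #7: focus styling for form elements */
input:focus, textarea:focus { border-color: #710101; }

/* #10: generated content */
blockquote:before { content: open-quote; }
```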

Did I miss anything important, or under- or over-value anything, on that CSS list?  Let us know.


Part of the SXSW Herd

Published 19 years, 8 months past

Okay, everyone else is doing it, so here’s my “headed to SXSW” post.  Baaaa!

I’m getting in Saturday afternoon, just in time to miss Jeffrey’s opening remarks, and will be around through Tuesday.  Early Saturday evening, I’ll be at the WaSP/WD-L/CSS-D/WSM/AIR meetup at Buffalo Billiards; the festivities kick off at 6:00pm, and no RSVP is needed, so drop on by!  It promises to be a madhouse (a MADHOUSE!) of standards and billiards.

On Sunday, I’ll be speaking from 10:00am until 11:00am on the topic of “Emergent Semantics”.  I’ve been scheduled to do a half-hour book signing at 12:45pm that same day; if it’s anything like last year, there will be a few authors sitting at a signing table at the same time.  I’m told there will be copies of Cascading Style Sheets: The Definitive Guide, Second Edition, the CSS Pocket Reference, Second Edition, and More Eric Meyer on CSS in stock, but you don’t have to buy a copy at the show to get one signed—if you already have a book of mine, bring it and I’ll gladly sign it.  Heck, bring me anyone’s book and I’ll sign it.  I’m easy.

After the signing, I’m planning to sit in on Tantek’s presentation, “The Elements of Meaningful XHTML”.  His talk and mine make a good one-two combination, so if you’ve any interest in either, you might consider checking out both.

Come Monday, I’ll be getting a late start with an appearance on the panel “Where Are The Women of Web Design”.  Before I get raked over the coals again, I’d like to point you to Molly’s post about the panel and its genesis, as well as the comments that followed (particularly this one).  Just after that panel, assuming I haven’t been burnt to a tiny crisp, the SXSW folks have me doing another book signing.  That’ll be from 4:45pm until 5:15pm.

As soon as I’m done there, I have to skedaddle over to the Red Eyed Fly for Vox Nox, an early evening of New Riders authors showing their “B” sides.  Vox Nox, which starts around 6:00pm and is scheduled to end around 8:00pm, is in many ways a sort of mini-Fray Cafe, which is appropriate… because Fray Cafe 5 is going to be held at the Red Eyed Fly on Sunday, the night before Vox Nox.  I have to admit to being a touch nervous about my part in Vox Nox, because the piece I’ve created is deeply personal and I’m not totally certain how the audience will react.  But that’s one of the interesting things about public performance, right?

Tuesday I got nothin’.  No scheduled events at all.  I can just kick back, check out sessions, hang out in the halls, and generally act like I don’t have a care in the world.  Trust me, it’s just an act, but I’ve gotten kind of good at it.  That evening I’m doing whatever, and by the next morning I’ll be gone.

So if you’re going to be in or around SXSW, come over and say “hi”.  Even if you don’t have that much interest in me personally, you should still come by, because the concentration of Web design stars, standards gurus, and forward thinkers assembling at this year’s SXSWi is frankly a wonder to behold.


Search Engine Strategies New York

Published 19 years, 8 months past

Talking with attendees and hanging out with the speakers at Search Engine Strategies was quite fascinating. 

In the first place, they’re all pretty fascinating people, from where they’ve been to what they’re like now.  In the second place, they’re all working in a field that doesn’t really interest me, except in indirect ways.  A lot of the “white hat” search engine friendliness has to do with strong text copy, building traffic, and all that good stuff.  But to spend my days picking apart search engine behavior?  Not interested.

Of course, a lot of people would find what I do eye-wateringly boring, so I’m not casting stones here.  Just saying that it’s interesting to spend time with people who are smart, funny, motivated, and gladly doing something very different from what I’m used to doing.

That said, I observed some interesting differences between the search engine crowd and the Web design/standards crowd.

  • There’s a dark side to the search engine business that just doesn’t exist in the standards crowd.  The “black hat” SEOs, the ones who are comment spamming and keyword stuffing and link farming, don’t just lurk in the shadows.  They’re right up front, sitting on panels and buying booths in the exhibit hall (not to mention doing a little in-person spamming).  They don’t pretend to be anything but what they are.  The honesty is refreshing, but it’s something that doesn’t have a direct analogue in Web design.  The closest we get is coding to a specific browser, and that isn’t evil so much as it is amateurish and short-sighted.  I don’t think there’s really any comparison.

    The existence of that dark side creates an entirely different dynamic in the search engine field.  People are always watching to see if someone’s white hat is covering up a black hat, to see who’s shifted from one camp to another.  From what I heard, people have gone both directions; some black hats have gone to white over time.  And vice versa.

    This fact also seems to have created a gossip stream that completely dwarfs anything I’ve ever encountered in the standards design field.

  • In a similar vein, there’s an incentive to keep one’s knowledge to oneself in the search engine business.  Suppose you’ve uncovered something about search engines that nobody else has figured out.  That’s a competitive advantage, and there’s a financial incentive to keep it to yourself.

    In the standards design field, it’s almost the other way around.  If you come up with a new technique, you’re better off publishing it and adding to your reputation.  You could keep it to yourself, of course, and that would stay secret until the first time you used it on a public site.  At that point, the secret will be there for anyone who views source to figure out and use for themselves.  Writing it up instead and sharing it with the world adds to your reputational capital, which might lead to more work—so there’s a financial incentive to share.

    That’s not to say that everything search engine experts uncover is kept secret: they do plenty of publishing and sharing, and consultants in the field are constantly referring clients to each other as needs change.  That’s sort of a flip side to what I’ve observed in the standards design field, where referrals seem to be (comparatively) infrequent.  I’m not complaining, mind you.  Just observing.  But when someone creates a unique approach, it’s more likely to benefit them by being held close to the chest.

  • The field is dominated by the search engines.  Whatever they do, the experts have to adjust to keep up.  If Google alters its algorithm, a top-ranked site can drop to 100th place in an instant, and a ninth-page site can vault to the first results page.  The playing field is always shifting, always in flux.  Slow flux, but flux nonetheless.  It’s actually a lot like Web design was back in the late 1990s, when browsers were updating their rendering engines on a regular basis, instead of in cycles that can be reasonably measured in fractions of decades.

    So there’s the threat that today’s winning strategy is tomorrow’s loser.  In the standards design space, not really the case; or if it is the case, it’s only on much longer time scales.  Sure, CSS will likely be a discarded relic some day, but it’ll probably be quite a while—several years at the very least.  Comment spamming could become obsolete next week, were the engines to figure out a way to programmatically detect and penalize it.  (nofollow doesn’t quite count, but it’s a start.)

  • On a related note, there’s a lot more mobility in the search engine space.  People work as independent SEOs, then go to work for a search engine, then shift to an SEO firm, leave that to work for a large corporation… and so on.  Not everyone, of course, but enough to add lots more grist for the gossip mill.  In the standards design space, most of the leading names are working for themselves, and show few signs of changing.

  • The last observation is perhaps the one that drives everything that I’ve mentioned: the money.  There’s a lot of money on the table in SEO, way more than in standards design.  Sure, a big design job can be worth many thousands of dollars.  An effective SEO can make many more thousands, possibly millions if he or she gets the right job.  They can increase a company’s traffic, and potentially their revenue, by large percentages.

    Certainly, standards design can save companies money, and it can increase revenue by making a site more responsive due to smaller page weights.  That’s useful, and it’s important.  But the money being thrown around on SEO is… well, it’s a lot.

Lest anyone get the wrong idea, none of this is meant to be a condemnation.  Sure, the spammers are loathsome parasites, but there are a lot of SEOs who aren’t spammers.  They get companies better rankings through the basics I mentioned before.  In a lot of ways, they seem to be content, usability, and community-building consultants all rolled up into one.  Those are all useful, needed services, and it’s kind of interesting to me that all those things are hiding behind the term “search engine optimization”.  Well, not hiding, exactly.  You see what I mean, though, right?

The last observation is more personal: it was quite an experience attending a conference where I was largely unrecognized.  There were developers there who knew my name, and who were on the standards bandwagon, but the majority of attendees were not developers and had never heard of me, or Zeldman, or Shea, or Bowman, or any of the other names known in our field.  Which is only to be expected: I had never heard of most of the big names in their field.  So I was largely an outsider, and that was a refreshing change of pace.  It served as a (possibly necessary) ego check, and let me look at the Web from an entirely new angle.

So my thanks to Danny, Chris, Grant, Shari, Amanda, Tim, Matt, and the other folks who helped orient me to this new arena, discussed points of common interest and divergent aims, and made sure I didn’t feel too terribly out of place.


Keep Your Classes Clean

Published 19 years, 8 months past

A picture of three bottles of the general-purpose cleaner 'Simple Green'.  The first contains a dark green liquid, as you might expect given its name.  The second contains yellow (lemon-scented) liquid, yet is still called 'Simple Green'.  The third is a white bottle with a purple label; again, it has the name 'Simple Green' prominently displayed.

See, that’s why presentational class names are such a bad idea.


Browse the Archive

Earlier Entries

Later Entries