Posts in the Browsers Category

Increasing the Strength of Ajax

Published 19 years, 6 months past

There’s been some commentary recently about how Ajax programming requires a different approach to UI and user notification.  Jeff Veen’s Designing for the Subtlety of Ajax and Alex Bosworth’s post on the top 10 Ajax Mistakes are just two examples.

I pretty much agree with both pieces.  I’ve missed updates more than once on Ajax pages, just because I’m too used to how pages usually work.  I’ll click on something and then my attention will, out of habit, instantly go elsewhere—another window, another application, another computer, whatever—and keep subconscious track of what was happening in the window where I clicked, monitoring it in my peripheral vision for the flicker of a page reload.  Eventually there will be a little tickle in the back of my brain that says, “Hey, didn’t that site ever do anything?”  When I finally look straight at it, I realize that it did something quite a while ago, probably a split second after my mental focus moved away.  Instead of being efficient, I was wasting time waiting for a refresh that never came.

One might think it’s time for an “Ajax enabled” badge on pages so we know “better pay attention, ’cause this ain’t your father’s Web page”.  I don’t think that’s the way to go, however.  I think what’s needed is a more mature HCI design sense.  Web design has long relied on the page-update refresh to tell the user something has happened; this was such a part of the Web’s fabric that designing around it was almost unconscious.  There hasn’t been a need for sophisticated HCI considerations… until now.

In other words, Web design is going to need to grow up, and become more HCI-oriented than it has been.  The usability of a Web site will become as much about how you let the user know they’ve done something as it is about getting them to the thing they want to do.  In addition to getting the page to look inviting and present the information well, it will be necessary to obsess over the small details, implement highlights and animations and pointers—not to wow the user, but to help them.

In this endeavor, it’s worth remembering that there is a very large and long-standing body of research on HCI.  For years, many HCI experts have complained that the Web design field is making all sorts of errors that could be avoided if we’d just pay attention to what they’re telling us—a criticism which was not totally inaccurate.  Some Web design experts shot back that the Web was a different medium than the sorts of things HCI people studied, and anyway, the Web was not an application—and that rejoinder was also not totally inaccurate.  But with Ajax, the Web-application dichotomy is disappearing.  The retort is becoming less accurate, and the criticism more accurate.

I don’t claim to know what should be done.  The simplest update notification would be to set the visibility of the body element to hidden for half a second, and then back to visible, thus visually simulating a page refresh.  Crude, but it would play directly to users’ expectations.  The fading yellow highlight in Basecamp gets a lot of attention (and imitators), and that’s a good way to go too.  We could envision tossing a red outline onto something that changed, or animating a target-reticle effect on the updated content, or any number of other ideas.  Again I say: the decades of work done in HCI research are a resource we should not ignore.
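Since the fading highlight is easier to show than describe, here is a rough JavaScript sketch of the idea.  This is a minimal illustration, not Basecamp’s actual code; the function name, colors, and timings are all invented.

    // A minimal sketch of the fading-highlight idea: flash a
    // just-updated element yellow, then step the background back
    // toward white over about a second.
    function highlightUpdate(el) {
      var steps = 10;
      var i = 0;
      var timer = setInterval(function () {
        // Interpolate the blue channel from 0 (yellow) to 255 (white).
        var blue = Math.round(255 * (i / steps));
        el.style.backgroundColor = 'rgb(255, 255, ' + blue + ')';
        if (++i > steps) {
          clearInterval(timer);
          el.style.backgroundColor = '';  // fall back to the stylesheet
        }
      }, 100);  // ten steps at 100ms each
    }

    // Call it from the Ajax success handler on whatever node changed:
    // highlightUpdate(document.getElementById('updated-row'));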

From my perspective, there are at least two good things in the Ajax world.  First is that the need for understanding and using CSS, XHTML, and the DOM has never been greater.  Okay, it’s a slightly selfish thing, but it leads directly into the second good thing: that the need for standards support has never been more critical.  If a browser wants to play in the Ajax space—if it wants to be a serious platform for delivering applications—then it’s going to have to get along with the others.  It’s going to have to support the standards.


Liberal vs. Conservative

Published 19 years, 6 months past

So it turns out that crackers can mess up your Web site with nothing more than a malformed HTTP packet.  You might think something as simple as HTTP would be basically risk-free, but no, I’m afraid not.  All it takes is interaction between programs that handle HTTP data slightly differently, and hey presto, you’ve got a security hole.
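To make that concrete: the attack class, which has been dubbed HTTP request smuggling, plants ambiguities such as two conflicting Content-Length headers in a single request.  Here’s a schematic example (the host, path, and byte count are invented for illustration):

    POST /search HTTP/1.1
    Host: example.com
    Content-Length: 0
    Content-Length: 44

    GET /poisoned HTTP/1.1
    Host: example.com

An intermediary that honors the first Content-Length sees two requests, and may cache the response to the second one under an attacker-chosen URL; a server that honors the second sees one request with a body.  Neither program is wildly wrong on its own.  The hole opens up in the disagreement between them.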

Ben Laurie weighed in on this:

“It is interesting that being liberal in what you accept is the base cause of this misbehaviour,” Laurie says. “Perhaps it is time the idea was revisited.”

That’s a reference to the late Jon Postel‘s dictum (from RFC 793) of “be conservative in what you do, be liberal in what you accept from others”.  This is done in the name of robustness: if you’re liberal in what you accept, you can recover from data corruption caused by unanticipated problems.

Laurie’s right.  The problem is that being liberal in what you accept inevitably leads to a systemic corruption.  Look at the display layer of the Web.  For years, browsers have been liberal in what markup they accept.  What did it get us?  Tag soup.  The minute browsers allowed authors to be lazy, authors were lazy.  The tools written to help authors encoded that laziness.  Browsers had to make sure they could deal with even more laziness, and the tools kept up.  Just to get CSS out of that death spiral, we (as a field) had to invent, implement, and explain DOCTYPE switching.

In XML, it’s defined that a user agent must throw an error on malformed markup and stop.  No error recovery attempts, just a big old “this is broken” message.  Gecko already does this, if you get it into full-on XML mode.  It won’t do it on HTML and XHTML served as text/html, though, because too many Web pages would just break.  If you serve up XHTML as application/xhtml+xml, and it’s malformed, you’ll be treated to an error message.  Period.
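To illustrate, here’s a tiny page invented for the purpose.  Served as text/html, browsers silently repair the unclosed p element and render the page.  Served as application/xhtml+xml, Gecko halts at the well-formedness error and displays its parse-error message instead of any content:

    <?xml version="1.0" encoding="utf-8"?>
    <html xmlns="http://www.w3.org/1999/xhtml">
      <head><title>Draconian handling test</title></head>
      <body>
        <p>This paragraph never gets closed.
      </body>
    </html>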

And would that be so bad, even for HTML?  After all, if IE did it, you can be sure that people would fix their markup.  If browsers had done it from the beginning, markup would not have been malformed in the first place.  (Weird and abnormal, perhaps, but not actually malformed.)  Håkon said five years ago that “be liberal in what you accept” is what broke the Web, markup- and style-wise.  It’s been a longer fight than that to start lifting it out of that morass, and the job isn’t done.

Authors of feed aggregators have similar dilemmas.  If someone subscribes to a feed, thus indicating their interest in it, and the feed is malformed, what do you do?  Do you undertake error recovery in an attempt to give the user what they want, or do you just throw up an error message?  If you go the error route, what happens when a competitor does the error recovery, and thus gets a reputation as being a better program, even though you know it’s actually worse?  That righteous knowledge won’t pay the heating bills, come winter.

“So what?” you may shrug.  “It’s not like RSS feeds can be used to breach security”.

Which is just what anyone would have said about HTTP, until very recently.

In the end, the real problem is that liberal acceptance of data will always be used.  Even if every single HTTP implementor in the world got together and made sure all their implementations did exactly the same strictly correct conservatively defined thing, there would still be people sending out malformed data.  They’d be crackers, script kiddies—the people who have incentive to not be conservative in what they send.  The only way to stop them from sending out that malformed data is to be conservative in what your program accepts.

Even then, it might be possible to exploit loopholes, but at least they’d be flaws in the protocol itself.  Finding and fixing those is important.  Attempting to cope with the twisted landscape of bizarrely interacting error-recovery routines is a fool’s errand at best.  Unfortunately, it’s an errand we’re all running.


That Acid Buzz

Published 19 years, 9 months past

Just a few days after Chris Wilson’s post to IEblog, Håkon Wium Lie, CTO of Opera and one of the originators of CSS, published a column on news.com titled “The Acid2 Challenge to Microsoft”, outlining the intent to create “a test page… that will actively use features Web designers crave, such as fixed positioning of elements”.  As indicated in his article and confirmed via the Buzz, the Web Standards Project is a partner with Håkon in the development of this new “test suite”, as it’s termed on the WaSP Buzz.

I don’t know about you, but as I read the article, several red flags went up and alarm bells rang in my head.

First off, the Acid2 challenge to Microsoft?  Why only Microsoft?  An acid test worthy of its name would expose bugs in every browser on the market today.  The original test did exactly that, and helped change the face of the Web.  In fact, if you’re still using IE/Mac, the first browser to actually get the Acid test correct, you can see it in action.  Type about:tasman into the address bar, and there it is, with modified text.

Second, the original Acid test (which I haven’t linked to because it no longer seems to be available on the Web) was part of a larger and more constructive effort.  At the time, Acid test author Todd Fahrner was (as was I) a member of the WaSP’s CSS Action Committee.  If that name doesn’t sound familiar, you might be more familiar with the CSS Samurai.  One of the things the CSS AC did was to produce reports on the top ten failings of CSS support in various browsers.  We didn’t just say, “Browser X should be better”.  We wrote up what should be better, and why, and pointed to test cases illustrating the problems.  The Acid test was justifiably famous, but it was in the end one test among many.

And those tests were tough for all browsers, not just one browser or one company’s browsers.  We weren’t partisan snipers, despite what many claimed.  We worked to point the way toward better behavior on the part of all browsers by focusing on the problems specific to each browser.

I am no longer a member of the WaSP.  When the first incarnation of the organization went into dormancy, the CSS AC went along with it.  Although the new WaSP has asked me to join a few times, I have resisted for various reasons—personal, professional, and perceptual.  I was also asked if I wanted to contribute to the Acid2 effort as an independent, and declined that as well.  So in many ways, this post is the epitome of something I find distasteful: a person who has had every chance to make contributions, and instead criticizes.  In my defense, I can say only that while I may have refused these invitations, it is not out of antagonism to the basic goal of the WaSP.  I have every reason to want the WaSP to succeed in its goal of advancing the cause of standards on the Web.

But this Microsoft-centric campaign has me concerned, and ever so slightly appalled.  The creation of a tough CSS test suite is a fantastic undertaking, something that is to be applauded and is probably long overdue.  But to cast it as an effort being undertaken as a challenge to Microsoft not only starts it off on the wrong foot, it has the potential to taint not just the Acid2 effort, but the entire organization.


Exploring Better Standards Support

Published 19 years, 9 months past

While I was preparing for SXSW, Chris Wilson—and there’s a name that takes me back a few years—posted an entry on IEblog about standards.  I’m not going to excerpt any of it here because most of you have already read it.  For the rest of you, go read it.  As long as you don’t continue into the comments, it won’t take very long.

First off, let me say that I’ve known Chris for many years, and we get along great together.  I have a lot of respect for him, and I firmly believe the feeling is mutual.  He did incredible work in the very early years of CSS, and while some of that work may seem lacking when viewed in light of later implementations, it was that all-important first step on the journey of a thousand miles.  If I ever make it to Seattle with a couple of days to spare, he’s right near the top of a pretty short list of people I’d do my utmost to find time to see while I was there.  (I just added another person to that list a couple of days ago, actually, but that’s a story for another time.)

With a paragraph like that, you probably think I’m going to tear into him now.  Nope.

I’m posting my thoughts on this for three reasons.

  1. Chris was nice enough to name-check meyerweb as a site that’s helped “[harvest] the collective knowledge of the web development community” with regard to standards.  If nominations were being taken, I’d point to the css-discuss wiki before I would meyerweb, but nevertheless I’d like to think I’ve earned a place on that list—and I’m glad that Chris thinks so too.
  2. Some of my writing from the post “Unbreaking the Web” was quoted in a comment by Thomas Tallyce.
  3. The 800-pound gorilla is stirring.  It’s hard not to share a few observations.

So as Chris points out, the IE team faces an enormous challenge.  This is compounded by the enormous loss of IE developers over the past few years.  Think about it.  Would you work on a project that was the legal and administrative equivalent of a toxic cloud?  Internet Explorer is the focal point of dozens of lawsuits, antitrust litigation, and more.  It’s a project straitjacketed by its own success (however rightly or wrongly that success was achieved).  I don’t have any direct knowledge of this, but the IE team has probably become the Marie Celeste of Microsoft, a doomed wanderer of the bureaucratic seas, staffed by a few trapped souls and the subject of whispered tales of horror among the engineers.

(“And there… dangling from the door handle… was a scripting hook!”)

Despite this recent legacy of pariahship, it would seem that resources are gathering behind Explorer, and not just on the security front.  Chris says, and I have no reason to doubt him, that plans are afoot to add standards support to Explorer.  My concern is over the fate of those plans, because the best-laid plans… you know.  No matter how much the engineering team might want something, if their irresistible geek force encounters an immovable administrative object, well, my money’s on the object.  The only hope is to interpret the object as damage and route around it, which is usually a lot harder to accomplish in a bureaucracy than it is in a network topology.

Chris’ post makes it very clear that backwards compatibility will not be sacrificed, at least in quirks mode.  I wrote some thoughts along those lines in “Unbreaking the Web“, so I won’t repeat them here.  In summary: improving standards support will not break the Web.  It won’t even break the vast majority of sites, and any sites that do break will get sorted out in short order.  With a public beta, those problems could be identified and solved well before the browser went final.  Backwards compatibility is no longer a reasonable excuse for avoiding standards support.

And then there are the resource limitations.  It’s hard to think of anything Microsoft does as lacking in resources, but just as there are hungry people in America, there are starving programs within Microsoft.  I believe that, for some time now, the IE project has been living on a sub-subsistence diet.  It will probably be hard to attract people to help feed it.  The staffing requirements for regression testing alone would be daunting.  I don’t envy the IE managers their task—all the more so because no matter what they do, it won’t be enough for some people.  They’re going to get slammed.  Their only real choice is in trying to pick the things for which they’ll be slammed less.  If improving standards support in IE isn’t a corporate (or divisional) priority, they’re in for a world of hurt.  Which is why I sincerely hope they’re a priority.

But neither is that a complete excuse.  Working for a firm like Microsoft means taking on massive challenges, doing more than you thought possible with less than you should have available, pulling long hours and pounding your head against a wall in order to do the apparently impossible.  That’s part of the job description, and being there is pretty much a matter of choice.  I say this isn’t a complete excuse because, obviously, any given team can only accomplish so much.  It just isn’t a “get out of jail free” card.  If you’re going to tell us that standards are important and that support will be improved, the improvement has to be notable.  There has to be evidence that a lot of work went into adding a lot of useful things, and fixing a lot of old problems.  Again, this is because the promise was freely made, not because it’s what the Web Gods demand.

We all, and by that I mean “us Web designers and developers”, need to stay involved in this conversation.  It’s easy to post a few thoughts, assume that they’ve been ignored, and let things drift.  It’s also easy to assume that the entire IE team read your ideas and immediately agreed to every single last one of them because they’re so blindingly obviously critical, and then get completely enraged when they don’t show up in the final product.  I for one plan to keep an eye on this situation, and to think about ways I could help the IE team.

Because if I truly care about standards—and I do—then I owe the IE team as much as I’ve given to the teams working on Firefox, Safari, Opera, and all the rest.  We all do.  Whatever we would have done for the least of these our browsers in the name of advancing standards support, we owe the Explorer team no less.

Chris did ask for specific requests, so here are my top ten CSS requests in priority order:

  1. Support all selectors—including CSS3 selectors, which I believe are stable enough to be implemented
  2. Clean up positioning and add fixed positioning
  3. Clean up floating/clearing
  4. min-/max-width/height (got that?)
  5. Fix problems with inline layout, especially the handling of top and bottom padding and margins on inline elements
  6. Arbitrary-element hover
  7. Focus styling for form elements
  8. Better printing support, including better page-break control and page orientation
  9. Support CSS table styling, including the table-centered display values
  10. Support generated content

…plus the unranked but still very important “fix bugs! fix bugs! fix bugs!”.
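To make a few of those concrete, here’s roughly what the requested features look like in stylesheet form.  The selectors come straight from CSS2.1 and the CSS3 Selectors draft, though the class and ID names are invented:

    /* 1: attribute and other CSS3 selectors */
    a[rel="external"] { color: maroon; }

    /* 2: fixed positioning */
    #masthead { position: fixed; top: 0; left: 0; width: 100%; }

    /* 4: constrained widths */
    #content { min-width: 20em; max-width: 60em; }

    /* 6: hover on arbitrary elements, not just links */
    tr:hover { background: #FFF9D2; }

    /* 10: generated content */
    a[href^="http://"]:after { content: " (" attr(href) ")"; }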

Did I miss anything important, or under- or over-value anything, on that CSS list?  Let us know.


Tabular Weirdness

Published 19 years, 11 months past

Recently I was doing some table styling for a client and ran into what I can only call tabular weirdness.  There were two different things that I stumbled across, and interestingly, they were the kinds of problems you wouldn’t be likely to encounter in layout tables.  These would come up much more often in data tables.

In the first case, the general idea was to put some space between the tables and the surrounding material, but as these were data tables, they came with captions.  So I of course put the caption text in caption elements.  That’s when things started to get inconsistent.

To be more precise, the problems began after I left Safari to check the page in other browsers. In Safari, you see, the caption’s element box is basically made a part of the table box.  It sits, effectively, between the top table border and the top margin.  That allows the caption’s width to inherently match the width of the table itself, and causes any top margin given to the table to sit above the caption.  Makes sense, right?  It certainly did to me.

However, according to section 17.4 of CSS2.1 and the figure that accompanies it, the caption sits entirely outside the table’s box, and that includes the table’s margin.  The two are still tied together by the generation of an anonymous box, but the upshot is that if you give the table left and right margins, then the caption does not follow suit.  If you give the table a top margin, it pushes the caption away from the table. This is the behavior evinced by Firefox 1.0, and as unintuitive as it might be, it’s what the specification demands.
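In code terms, the disagreement comes down to something this simple (the class name is mine, and the workaround at the end is an untested sketch):

    /* Safari effectively kept the caption inside this margin box;
       per CSS2.1 section 17.4, and in Firefox 1.0, the caption sits
       outside it, so the side margins leave the caption behind and
       a top margin opens a gap between the caption and the table. */
    table.data { margin: 2em; }

    /* Untested workaround sketch: mirror the horizontal margins
       on the caption so the two line up again. */
    table.data caption { margin: 0 2em; }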

The second piece of strangeness was found in IE/Win.  What I’d done was simply said that some cell borders should be solid—nothing more complicated than border-bottom: 1px solid.  The idea was that it would, as borders do, pick up the foreground color of the cell, but IE/Win had other ideas.  As best I could tell, the borders were a light gray.  You can see it happen in the testcase I constructed to create the images in this entry.  Explicitly specifying a border color fixes the problem, of course, but it was a bit of weirdness I thought I’d pass along in case anyone runs into the same thing.
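Boiled down, the before-and-after looked like this (the color at the end is just an example; any explicit value will do):

    /* What I wrote.  IE/Win drew these borders light gray instead
       of using the cell's foreground color, as CSS says it should. */
    td { border-bottom: 1px solid; }

    /* The workaround: state the border color explicitly. */
    td { border-bottom: 1px solid #000; }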


Don’t Care About Market Share

Published 20 years, 5 days past

In a fashion vaguely reminiscent of the process by which weeds keep growing back no matter how you try to rid yourself of them, the question of browser market share has once again been rearing its foul, misshapen head.  Dan kicked off a round of it over at Simplebits, but it’s recently been popping up other places as well.  I heard discussions about market share at SES Chicago, perhaps unsurprisingly, but I’ve also been seeing the question on various mailing lists and other forums.

The only thing more frustrating than the persistent recurrence of this unnecessary question is the inappropriate gravity it seems to acquire in so many minds.

Look, I’ll make this very simple for everyone.  If you’re trying to figure out what browsers to support (or not) in terms of layout consistency on a given site, then the answer is very easy.  Whatever the site’s access logs tell you.  End.  Of.  Story!

For example, the stats for the past few days’ worth of visitors to Complex Spiral Consulting tell me the following:

User Agent      Portion of hits
Firefox         43%
IE6             30.8%
Mozilla         8.8%
Safari          8.6%
Opera           2.4%

(For those who are curious, IE5.5 makes up 0.8% of hits.  Various flavors of IE5.x below IE5.5 total roughly 1.2%, but note that Windows and Mac users are lumped together there.)

Those statistics tell me quite a bit about the people who visit the CSC site, and I can use that information to decide what to do about browser support.  You know what those numbers tell you about which browsers to support (or not) in your designs for sites on which you work?  Absolutely squat.  Anyone who uses those access statistics to make decisions for their own work is a fool, and a misinformed fool at that.

In every design, we have to ask what browsers need to have a consistent experience, which ones can be given a reduced experience, and which ones get no design at all.  The user logs from another site are useless in trying to make this decision.  The “global statistics” from firms like WebSideStory are just as useless in this case.  They may be entirely accurate, but they are also entirely irrelevant when it comes to making design support decisions.  The only stats that matter are the ones that come from the site you’re designing.

In a like manner, I don’t care if you think visitors to your site or some other favorite site of yours are an accurate reflection of the overall Internet population or not: that opinion is similarly irrelevant.  It’s rather like me claiming that the people who come to our annual holiday party are an accurate reflection of partygoers in general.  Maybe they are and maybe they aren’t, but either way I don’t think you should plan your all-night rave to accommodate the kinds of people who drop by our house to have homemade bread and soup and chat about babies, politics, science-fiction movies, and the weather.  And vice versa.

(Do remember that your site’s stats may reflect its current behavior instead of your potential audience.  If your site is already broken past the point of usefulness in Safari, then you’re going to see very low Safari numbers.  Make sure that you’re comparing apples to apples, and only compare the numbers in your access logs for browsers that can already use the site.)

As for the related question of “at what percentage level do I decide a browser isn’t worth bothering about”—well, that’s really up to you, isn’t it?  I certainly can’t tell you when it’s worthwhile to stop worrying about IE5.0, or Netscape 4.7, or Mosaic 1.2.  I know what I think is appropriate for the sites I work on—and the process of finding the answer is different for every site.  It has to be, because every site is different.

Now, if you want to share your user demographics with anyone who wants a peek, hey, have fun with that.  If data exhibitionism is your thing, who am I to judge?  Just don’t pretend that the bits of data you’re exposing to the world are representative of everyone else’s, because I guarantee you that they are not.  As for anyone who happens to glance at your data: I hope they realize the same thing.


Unjustified Caption Text

Published 20 years, 2 weeks past

I just stumbled across a browser bug this evening that I thought you all might like to know about.  So we all know that IE6/Win supports text-align: justify, right?  Wrong.  Sorry, but it’s the truth: IE6 does not fully support text-align: justify.  True, it mostly supports that declaration, but if you apply said declaration to a caption element, guess what?  You get centered, non-justified text instead.  It’s very much as though, in the case of caption elements, IE6 replaces justify with center.
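A reduced testcase is about as simple as CSS gets:

    /* IE6/Win renders this caption centered, as though the value
       justify had been silently replaced with center... */
    caption { text-align: justify; }

    /* ...even though the very same declaration justifies the text
       of table cells as expected. */
    td, th { text-align: justify; }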

I just thought I’d pass this along in case it might help anyone else avoid some furious head-scratching.  And, I’m sorry to say, I don’t know of a workaround.  If anyone finds one, please leave a comment to that effect.

This is a perfect illustration of how difficult it is to do comprehensive CSS testing.  Every CSS support chart I’ve ever seen has marked down IE6 as supporting justified text; I mean, why wouldn’t they?  It clearly did so… for the specific test cases used to create that support chart.  The odds of a test page including a caption element are vanishingly small, unless of course we’re talking about a test page that includes every single XHTML element in existence.  And to test every element known for every property-value combination… well, I’ve talked about that before.  No need to trample the same ground even flatter.


Unbreaking the Web

Published 20 years, 3 weeks past

While I was in Florida with my family visiting both sets of parents, Tristan Nitot published an article titled “How Microsoft can support CSS2 without breaking the Web”.  In it, Tristan points to a comment made by Gary Schare, Director of Windows Product Management at Microsoft, which was:

We could change the CSS support and many other standards elements within the browser rendering platform. But in doing so, we would also potentially break a lot of things.

(from Microsoft Windows Exec Talks IE, Firefox)

Tristan then goes on to refute this line of thinking.  Generally speaking, I’m entirely in agreement with him.  (As a disclaimer, Tristan and I worked together as members of the Netscape Standards Evangelism Team, and Tristan asked me for feedback on his article before it was published.)

Here’s the thing: in the Windows world, Explorer already significantly upgraded its standards support four different times.  The most recent such upgrade was called IE6.  That was the version that first added DOCTYPE switching to IE/Win.  At that time, there were a great many changes made to the standards support, nearly every one for the better.  For example, in standards mode, you could no longer throw around unitless numbers and have them interpreted as pixels, because that violated the CSS specification.  You couldn’t set a height or width for an inline non-replaced element, because that too was incorrect.  The interpretation of font-size keywords was changed to reflect the CSS specification instead of the HTML font-sizing regime.  The box model was altered to follow CSS instead of the old IE way.  In short, there was all kinds of stuff in there that would “break a lot of things”.

The Web rather steadfastly declined to be broken.  Oh, sure, there were pages whose layout was altered—not many, thanks to the way DOCTYPE switching was implemented, but they were out there.  Anyone who was relying on the IE/Win way of doing things but used a DOCTYPE that triggered standards mode (say, for example, an HTML 4.01 Transitional DOCTYPE with URI) ended up with a “broken” page.  These problems were fixed by their authors, and that was that.  I remember a number of forum posts about how “IE6 broke my design”, and the posts that helped those authors address the problem.  In the case of old, unmaintained pages, they stayed broken, but odds are that next to nobody cared.  Regardless, it isn’t exactly a point of major concern on any radars I’ve seen in the last three years.
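For anyone who never wrestled with it, the switch hinged on details as small as the presence of the DTD’s URI:

    <!-- Quirks mode in IE6: the system identifier (URI) is absent. -->
    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

    <!-- Standards mode in IE6: the same DOCTYPE with the URI included. -->
    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
       "http://www.w3.org/TR/html4/loose.dtd">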

Furthermore, IE6 fixed a number of parsing bugs that existed in previous versions.  One of those was the bug on which Tantek Çelik’s “Box Model Hack” depended.  However, the parsing bug was fixed in both quirks and standards modes, so the BMH utterly failed to work in IE6 no matter what DOCTYPE you used.  That actually did break quite a few layouts, if I remember correctly.  I also remember the day I discovered that they’d fixed the parsing bug in both standards and quirks modes.  I swore at my monitor for a moment, and then actually thought about it.  I realized that the inconvenience of removing a few CSS hacks, or at worst changing to different hacks, was a pittance to pay in comparison to the advances IE6 had made in terms of increased standards support.
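For those who never saw the hack, it looked more or less like this.  I’m reproducing it from memory, with illustrative widths; the trick was that IE5.x/Win’s parser choked on the escaped quotes in the voice-family value and skipped the rest of the rule:

    div#content {
      width: 400px;           /* IE5.x/Win: old box model, so border and
                                 padding get carved out of this width */
      voice-family: "\"}\"";  /* IE5.x/Win's parser bails out here */
      voice-family: inherit;
      width: 300px;           /* compliant browsers get the CSS box model
                                 and this width */
    }

    /* The companion "be nice to Opera" rule leaned on a child selector: */
    html>body div#content { width: 300px; }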

So I fixed a few style sheets, tossed a hack out of my mental toolbox, and got on with my life.  I contend that exactly the same thing would happen if a service pack were to add increased standards support to IE.

This is particularly true given that most of what IE should add would be, well, additions.  As in, things that IE doesn’t even try to support now, and so almost nobody uses them.  Think generated content.  Think attribute selectors.  Think fixed positioning.  These are all things that, if they were added to IE, would break almost no pages at all.  In fact, they’d make a small number of pages work better in IE.

For that incredibly small number of pages that would break (for whatever value of “break” you care to name) with improved standards support in IE6, I’m willing to bet that nearly all of them would get fixed right away.  Why?  Because they would be pages maintained by authors who actually want to use standards and care about doing things right.

Now, there is one area where I think the IE team would have to be careful about adding support, and that’s selectors.  A lot of hide-from-IE CSS hacks these days are based on its failure to support the child selector; in fact, I use these in a few places in the S5 style sheets.  It is possible that adding support for child selectors to IE6 would be more harmful than beneficial.  I say it’s possible because I don’t know.  Nobody does—but Microsoft of all organizations has the ability to find out, and to act accordingly.  They have the funding, the personnel, the skills, and the customer base.
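The basic pattern, with invented selectors and values, looks like this:

    /* Every browser, IE/Win included, gets the safe value... */
    #sidebar { margin-left: 10%; }

    /* ...and only browsers that understand child selectors see the
       override, because IE6 skips any rule whose selector contains
       "html>body". */
    html>body #sidebar { margin-left: 12%; }

Teach IE6 the child selector without fixing whatever bug the second rule is papering over, and the override suddenly applies in the one browser it was written to exclude.  That’s the risk.  As Tristan said: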

In its short, 2 1/2 year life, the Netscape Evangelism team helped literally thousands of authors and administrators of web sites around the world to improve their support for the W3C DOM and CSS Standards. If such a small group with limited resources can help change the web, imagine what Microsoft could do with its resources if it only tried.

Indeed.

Granted, the net stands still for no one, not even Microsoft.  There have already been, and continue to be, efforts to graft better standards support onto IE despite itself:  projects like PNG transparency fixes, whatever:hover, and IE7 take Microsoft’s proprietary behaviors and use them to make it easier to use open standards.  (I adore the poetry of that.)  The people behind those projects are already doing what Microsoft is apparently afraid to do, and they demonstrate why improving standards does not mean breaking the Web.

There’s one other point to consider.  If IE/Win improved its standards support in any meaningful way, believe me when I say that the news would be shouted from the Web site of every standards advocate in the known universe.  Nobody responsible for standards-oriented pages could avoid hearing about it.  Any problems would be quickly explained, and adjustments made.  Life would not only go on, but be better for developers and designers.

To sum up: the “more standards will break stuff” argument just doesn’t fly any more.  Microsoft can figure out what to do that won’t break pages, and there’s a ton of things that are new-to-IE, the implementation of which will no more break pages than did the image toolbar.  In cases that might cause breakage, Microsoft can determine—with community help, if they were to ask for it—how to minimize breakage while maximizing benefit.  To claim that possible Web page breakage prevents Microsoft from increasing standards support makes about as much sense as to claim that possible program breakage prevents them from ever changing or improving their operating system.

Despite this, I don’t have much hope that we’ll see any improvements before Longhorn debuts.  I think that’s a shame, because I remember when the IE team was gung-ho about standards.  There were a number of very smart people who understood why standards were important, and were committed to doing their best to support standards in IE—not just on the Macintosh, but for Windows as well.

I do hope for Microsoft’s sake that those days return.  Because the Web continues to move, and if they just stand there promising that everything will be better in Longhorn, they may well find themselves left behind.

