Posts in the Tech Category

Acid Redux

Published 16 years, 8 months past

So the feeds I read have been buzzing the past few days with running commentary on the WebKit and Opera teams’ race to be the first to hit 100/100 on Acid3, and then after that the effort to get a pixel-perfect match with the reference image.  Last I saw, Opera claimed to have gotten to 100 first, but it looked like WebKit had managed both the score and the rendering match in a publicly available build.  I haven’t verified any of this for myself, nor do I have any particular plans to do so.

Because as lovely as it is to see that you can, in fact, get one or more browser implementation teams to jump in a precisely defined sequence through a series of cunningly (one might say sadistically) placed hoops, half of which are on fire and the other half lined with razor wire, it doesn’t strike me as the best possible use of the teams’ time and energy.

No, I don’t hate standards, though I may hate freedom (depends on who’s asking).  What I disagree with is the idea that if you cherry-pick enough obscure and difficult corners of a bunch of different specifications and mix them all together into a spicy meatball of difficulty, it constitutes a useful test of the specifications you cherry-picked.  Because the one does not automatically follow from the other.

For example, suppose I told you that WebKit had implemented just the bits of SMIL-related SVG needed to pass the test, and that in doing so they exposed a woefully incomplete SVG implementation, one that gets something like 2% pass rates on actual SMIL/SVG tests.  Laughable, right?  Yes, well.

Of course, that’s in a nightly build, and they might totally support SMIL by the time the corresponding final version is released, and we’ll all look back on this and laugh the carefree laugh of children in springtime.  Maybe.  The real point here is that the Acid3 test isn’t a broad-spectrum standards-support test.  It’s a showpiece, and something of a Potemkin village at that.  Which is a shame, because what’s really needed right now is exhaustive test suites for specifications: XHTML, CSS, DOM, SVG, you name it.  We’ve been seeing more of these emerge recently, but they’re not enough.  I’d have been much more firmly in the cheering section had the effort that went into Acid3 gone into, say, an obsessively thorough DOM test suite.

I’d had this post in mind for a while now, really ever since Acid3 was released.  Then the horse race started to develop, and I told myself I really needed to get around to writing that post—and I got overtaken.  Well, that’s being busy for you.  It’s just as well I waited, really, because much of what I was going to say got covered by Mike Shaver in his piece explaining why Firefox 3 isn’t going to hit 100% on Acid3.  For example:

Ian’s Acid3, unlike its predecessors, is not about establishing a baseline of useful web capabilities. It’s quite explicitly about making browser developers jump… the Acid tests shouldn’t be fair to browsers, they should be fair to the web; they should be based on how good the web will be as a platform if all browsers conform, not about how far any given browser has to stretch to get there.

That’s no doubt more concisely and clearly stated than I would have managed, so it’s all for the best that he got to say it first.

By the by, I was quite intrigued by this part of Mike’s post:

You might ask why Mozilla’s not racking up daily gains, especially if you’re following the relevant bugs and seeing that people have produced patches for some issues that are covered by Acid3.

The most obvious reason is Firefox 3. We’re in the end-game of building what I really do believe is the best browser the web has ever known, and we expect to be putting it in the hands of more than 170 million users in a pretty short period of time. We’re still taking fixes for important issues, but virtually none of the issues on the Acid3 list are important enough for us to take at this stage. We don’t want to be rushing fixes in, or rushing out a release, only to find that we’ve broken important sites or regressed previous standards support, or worse introduced a security problem. Every API that’s exposed to content needs to be tested for compliance and security and reliability… We think these remaining late-stage patches are worth the test burden, often because they help make the web platform much more powerful, and reflect real-web compatibility and capability issues. Acid3’s contents, sadly, are not as often of that nature.

You know, it’s weird, but that seems really familiar, like I’ve heard or read something like that before.  Now if only I could remember…  Oh yeah!  It’s basically what the IE team said about not passing Acid2 when the IE7 betas came out, for which they were promptly excoriated.

Huh.

Well, never mind that now.  Of course it was a totally different set of circumstances and core motivations, and I’m sure there’s absolutely no parallel to be drawn between the two situations.  At all.

Returning to the main point here:  I’m a little bit sad, to tell the truth.  The original acid test was a perfect example of what I think makes for a good stress test.  Recall that the test’s original name, before it got shorthanded, was the “Box Model Acid Test”.  It was a test of CSS box model handling, including floats.  That’s all it was designed to do.  It did that fairly well for its time, considering it was part of a CSS1 test suite.  It didn’t try to combine box model testing with tests for PNG support, HTML parse error recovery, and DOM scripting.

To me, the ideal CSS test suite is one that has a bunch of basic property/value tests, like the ones I’ve been responsible for creating (1, 2), along with a bunch of acid tests for specific areas or concepts in that specification.  So an acidified CSS test suite would have individual acid tests for the box model, positioning, fonts, selectors, table layout, and so on.  It would not involve scripting or markup parsing (beyond what’s needed to handle selectors).  It would not use animated SVG icons.  Hell, it probably wouldn’t even use PNGs, except possibly alphaed PNGs when testing opacity and RGBA colors.  And maybe not even then.
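
To make that concrete, here’s a sketch of the kind of single-purpose property/value test I have in mind.  It’s a hypothetical page, not lifted from any actual suite; the point is just that each page exercises one property with one value and nothing else.

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<title>color: green (keyword value)</title>
<style type="text/css">
p.test {color: green;}
</style>
</head>
<body>
<!-- One property, one value; pass or fail is judged by eye. -->
<p class="test">This sentence should be green.</p>
<p>This sentence should be the browser's default text color, usually black.</p>
</body>
</html>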

So in a DOM test suite, you’d have one test page for each method or attribute, and then build some acid tests out of related bits (say, on an entire interface or set of closely related interfaces).  And maybe, at the end, you’d build an overarching acid test that rolled everything in the DOM spec into one fiendishly difficult test.  But it would be just about the DOM and whatever absolute minimum of other stuff you needed, like text rendering and maybe GIF support.  (Similarly, the CSS tests had to assume some basic HTML and CSS selector support, or else everything else fell down.)
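
Purely as illustration again, a per-method page in a DOM suite could be as simple as the following sketch, which is hypothetical and assumes nothing beyond core DOM support and basic text rendering.

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<title>DOM test: document.getElementById()</title>
</head>
<body>
<p id="target">Target paragraph.</p>
<p id="result">FAIL (script did not run)</p>
<script type="text/javascript">
// One method per page: does getElementById() hand back the element it should?
var el = document.getElementById("target");
var pass = el && el.id === "target" && el.nodeName.toLowerCase() === "p";
document.getElementById("result").firstChild.nodeValue = pass ? "PASS" : "FAIL";
</script>
</body>
</html>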

And then, after all those test suites have been built up and a series of acid tests woven into them, with each one culminating in its own spec-spanning acid test, you might think about taking those end-point acid tests and slamming them all together into one super-ultra-hyper-mega acid test, something that even the xenomorphs from the Alien series would look at and say, “That’s gonna sting”.  That would be awesome.  But that’s not what we have.

I fully acknowledge that a whole lot of very clever thinking went into the construction of Acid3 (as was true of Acid2), and that a lot of very smart people have worked very hard to pass it.  Congratulations all around, really.  I just can’t help feeling like some broader and more important point has been missed.  To me, it’s kind of like meeting the general challenge of finding an economical way to loft broadband transceivers to an altitude of 25,000 feet (in order to get full coverage of large metropolitan areas while avoiding the jetstream) by daring a bunch of teams to plant a transceiver near the summit of Mount Everest—and then getting them to do it.  Progress toward the summit can be demonstrated and kudos bestowed afterward, but there’s a wider picture that seems to have been overlooked in the process.


Notacon: Not to be Missed

Published 16 years, 8 months past

In just under a couple of weeks, the fifth annual NOTACON will be held right here in beautiful Cleveland, Ohio.  You’re going, I know; you’re super-über-cool like that, and you don’t need to be reminded of your coolness.  But I’d like to mention the show here for posterity, so that our descendants will know just how completely they missed out.

Notacon straddles, like a Colossus built entirely out of recycled motherboards, backtech chips, and loops of soldering wire, the middle ground between regular conferences and BarCamps (though Notacon predates BarCamp by a couple of years).  It’s not free to attend, but it is very inexpensive.  What it lacks in slick advertising and corporate sponsors, it makes up ten times over in raw, unfiltered geekiness and fascinating material.  This is the kind of event where presenters will hold forth on the depths of digital security, the physics of wireless networking, homebrew chip architecture, the coolness of HyperCard, online society dynamics, and more.  There’s a running contest called Anything but Ethernet, where you get bonus points for having one of the links in your network architecture incorporate barbed wire.

Yeah.  It’s like that.

The speakers will be as wildly diverse as the audience.  The lead engineer for the C64 Direct-to-TV (a C64 in a joystick!); the man behind The Daily WTF; some of the folks putting out 2600 magazine; the woman behind CrochetMe.com; and many more.  I’ll be there as well, talking about the bleeding edge of CSS and web design, ripping apart some recent projects of mine at top speed while discussing where I think we’ll be in three years.  Plus Drew Curtis of FARK fame will be back, as he always is, this year sponsoring a FARK party.  The mind fairly boggles.  Boggles!

As you’re no doubt gathering by now, it’s hard to describe Notacon in a quick, concise summary—and that’s a big part of what makes it so awesome.  For my contemporaries: see you there!  To you future historians: okay, you missed out, but drop everything right now to find out when the next one is and I’ll see you there!


Drugs, Bugs, and IE8

Published 16 years, 8 months past

If there’s a downside to becoming a cyborg, it’s the aftermath.  I’m not talking about the dystopian corporate-state shenanigans: those are fully expected.  No, it’s the painkillers that really suck.  They basically do their job, but at the cost of mental acuity.  That is not a trade I’m happy to make.  Granted, there were some interesting physical hallucinations that came along for the ride, but that’s nowhere near enough to balance the scales.

Here’s what I mean on that last part.  At one point yesterday, lying in bed as I had been all day, I decided it was about time to straighten out my legs, which were crossed at the ankles and starting to feel a little funny.  When I sent the relevant signals to my legs, nothing really happened.  Slowly I came to realize that nothing was happening because my legs weren’t actually crossed at all.  Furthermore, it gradually dawned on me that if the sensoria I’d been getting had been correct, it would have to mean that my legs were not only crossed at the ankles, but also attached to my body backwards.

So anyway, I thought I’d write up some of my observations (thus far) regarding IE8 beta 1.  What?

I’m going to say basically the same thing I said about the first betas of IE7: test and report, but don’t fix.  That is to say, you should absolutely grab it and run it across all your own sites, and all your common destinations.  Find out what’s different, broken, or just plain strange.

But don’t start searching for workarounds.  Not yet.  Submit bug reports, yes.  Boil down the problems you hit to basic test cases and submit those, if you like.  (I do like, but I’ve got kind of a history with that sort of thing.)  Just don’t think that beta 1 represents what we’ll face in the final release.

No, I don’t have some sort of inside track; never have.  That conclusion simply seems obvious to me just by looking at how this beta acts.  For example, there’s no support at all for :first-line and :first-letter.  That’s not just a glitch.  That’s a lack of support for a CSS feature that’s been present for three major releases.  I just can’t see that omission persisting to final release.
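
For reference, here’s the sort of thing that gets ignored.  Nothing exotic, just the two pseudo-elements that have been handled by the last three major IE releases; this is a quick example of my own, not a reduced test case from any particular site.

/* Classic drop cap plus a small-caps opening line; IE8 beta 1 skips both rules entirely. */
p.opener:first-letter {float: left; font-size: 300%; line-height: 1; padding-right: 0.1em;}
p.opener:first-line {font-variant: small-caps;}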

Another problem I noticed is evident here on the home page of meyerweb.  In the sidebar, each list item has a left margin and negative text indentation, creating a classic “outdent”.  Like so:

#extra .panel li {margin-left: 1em; text-indent: -1em;}

In each of those list items is a link of some kind, usually text.  The fun part is this: the hanging outdent part of that text isn’t clickable.  So the first couple of letters of each sidebar link are inactive.  They’re colored properly, but do nothing if you try to click them.  If you click on the active part of a link, the focus outline only draws around the active part.  And, for bonus yay, scrolling the page will wipe away any outdents that are offscreen.  So as you scroll down the page, you end up with all the sidebar links having their first few letters chopped off.  Whoops.
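
Boiled down to the kind of basic test case I mentioned earlier, the problem looks something like this hypothetical reduction (not an actual filed report):

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<title>Hanging outdent link test</title>
<style type="text/css">
/* Same pattern as the sidebar: indent the whole item, then pull
   the first line back out with a negative text-indent. */
ul {list-style: none; margin: 0; padding: 0; width: 12em;}
li {margin-left: 1em; text-indent: -1em;}
</style>
</head>
<body>
<ul>
<li><a href="#">Archives</a></li>
<li><a href="#">A longer link that wraps onto a second line to show the hang</a></li>
</ul>
<!-- Expected: every character of each link is clickable, including the part
     hanging in the outdent, and the focus outline covers the whole link. -->
</body>
</html>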

Again, that’s something I just can’t see going unaddressed in the final release.

In both these cases, flipping IE8 back to IE7 mode makes the weirdness go away.

I’ve seen more serious problems on the wider web.  Google Maps is currently busted beyond any hope of usefulness in IE8, as many have reported.  Also, I came across a site where loading the home page just locked up IE8 completely.  I had to force-quit and relaunch.  Every time I hit that page, lockup.

Flipping to IE7 mode allowed me to browse the site without any trouble at all.

These things, taken together, have really driven something home for me: there really is a new rendering engine in there.  I don’t just mean in the sense of fixing and adding enough things that the behavior is different.  I mean that I believe there’s truly a whole new engine under the hood of IE8.  And if the Acid2 results and public statements of the IE team are to be believed, there’s a whole new standards-based rendering engine under that hood.

That’s kind of a big deal in any event.  The last time I remember a browser with an extended release history replacing its old, creaky, grown-over-time, crap-piled-on-crap engine with (what the browser team felt was) a new, improved one was the transition from Netscape 4.5 to Netscape 6.0.  And remember how well that went?  Yee haw.

I really shouldn’t be surprised about this.  Chris Wilson, for example, used the exact words “our new layout engine” during the WaSP roundtable (transcript).  I guess I’d been assuming that was verbal shorthand for “our much-improved version of our old layout engine”.  I guess I was wrong.

So I would personally argue that this release was mislabelled.  This is not a beta release.  As far as I’m concerned, it’s an alpha, even under the kinds of old-school naming conventions I prefer.  I’m not going to go around calling it that, because that would just be unnecessarily confusing, but it’s how I’m going to think of it.

Now I’m wondering just how long it will be until final release, given the kinds of distances one usually sees between alpha and final.

Unfortunately, I just took the 6pm set of painkillers, so I’ll be wondering at about one-third speed.


Expressive Sculptor

Published 16 years, 8 months past

For those of you using Microsoft Expression Web, a free pre-release trial version of CSS Sculptor for Expression Web was announced by the WebAssist folks on Wednesday at MIX08.  So now you don’t have to put up with those snooty Dreamweaver users throwing you the mëtäl hörns every chance they get—throw ’em right back!  Røck!

If you’re curious about CSS Sculptor, I posted in some detail about it when it was first released in August 2007, and there’s of course plenty of enthusiastic copy about it on the WebAssist site.

One thing that’s different about the Expression version as compared to the Dreamweaver version is that it doesn’t have an “Apply” button to apply the input CSS to the preview window.  Instead, changes are instantly reflected in the little preview.  It’ll be interesting to see how users react to that, since it could mean that the previewed design shatters as the CSS is updated, and then snaps back together upon further changes.  Is that good or bad tool usability?  Hard to say; it could scare people into undoing the shatter-change and never pushing forward, but it could also help users more quickly gain a deeper understanding of CSS by seeing how things come apart and then go back together.  I guess we’ll find out!


Principles and Legality

Published 16 years, 8 months past

I woke up this morning (duh DAAAH dah DUH) and yesterday’s announcement was the first thing on my mind.  No doubt it’ll be a recurrent topic, at least for a little while.

One of the takeaways is what this change demonstrates about the IE team:  standards support is and was their preferred default.  If it weren’t, they just would have found a way to square the IE7-default behavior with the Interoperability Principles announced late last month (slightly tricky but entirely possible).  That they initially chose otherwise speaks volumes about the pressures they face internally, and their willingness to publicly change direction speaks volumes about their commitment to supporting standards.  While I’m sure community feedback informed their decision, they pretty much knew what the reaction would be from the get-go.  If that was going to be the deciding factor, they would’ve chosen differently up front.

So what drove that change?  I keep coming back to two things, both of which were explicitly mentioned in yesterday’s announcement.

The first is, perhaps obviously, the previously mentioned Interoperability Principles.  Head on over there and read Principle II, “Support for Standards”.  If that isn’t a solid foundation on which to build an internal case for change, I don’t know what is.  I’m wryly amused by the idea that the IE team used the Interoperability Principles as a way to batter their way out of the grip of those internal pressures I mentioned.  The former aikido student in me finds that very satisfying.  True, the Principles came under fire for being just another set of empty words, but it would seem that they can be used for at least some concrete good.

As for the second, there’s a phrase repeated between the two announcements that I didn’t quote yesterday because I was still pondering its meaning.  I’m still not certain about it, but having had a chance to sleep on it, my initial reading hasn’t changed, so I’m going to quote and comment on it now.  First, from the press release:

“While we do not believe there are currently any legal requirements that would dictate which rendering mode must be chosen as the default for a given browser, this step clearly removes this question as a potential legal and regulatory issue,” said Brad Smith, Microsoft senior vice president and general counsel.

And then in Dean’s IEblog post:

While we do not believe any current legal requirements would dictate which rendering mode a browser must use, this step clearly removes this question as a potential legal and regulatory issue.

Okay, so they’re on message.  And the message seems to be this: that Opera’s move to link IE development to the larger EU anti-trust investigation bore fruit.  I was highly critical of that move, and unless I’m seriously misreading what I see here, I was wrong.  I’m still no fan of the tone that was used in announcing the move, but that’s window dressing.  Results matter most.

Speaking of Opera, there’s another side to all this that I find quite interesting.  So far, the reaction to Microsoft’s announcement has been overwhelmingly positive.  The sense I’ve picked up is, “Hooray! IE will act like browsers always have, and the problem is solved!”.

But is it?  The primary objection raised by Opera and several members of the community was that version targeting is an anti-competitive move, one which will force browser makers like Opera and authors of JavaScript libraries to support an ever-increasing and complex web (sorry) of rendering-engine behaviors in the market leader.  So far as I can tell, the change in default behavior does next to nothing to address that objection.  The various versions will still be there and still invoke-able by any page author who so chooses.  Yes, the default will be better for authors, but I don’t see how things get any better for Opera, Firefox, Safari, jQuery, Prototype, et al.

Perhaps I’ve missed something basic (“Again!” shouts the chorus).  If so, what?  If not, then why all the hosannas?


Meta-change

Published 16 years, 8 months past

Now here’s something I didn’t expect to see when I woke up this morning:

“Microsoft Expands Support for Web Standards: Company outlines new approach to make standards-based rendering the default mode in Internet Explorer 8, will work with Web designers and content developers to help with standards behavior transition.”

Seriously, that’s the title and subhead of Microsoft’s latest press release.

About halfway through, there’s this from Ray Ozzie:

…we have decided to give top priority to support for these new Web standards. In keeping with the commitment we made in our Interoperability Principles of being even more transparent in how we support standards in our products, we will work with content publishers to ensure they fully understand the steps we are taking and will encourage them to use this beta period to update their sites to transition to the more current Web standards supported by IE8.

See also the IEblog entry Microsoft’s Interoperability Principles and IE8, where Dean Hachamovitch says:

Microsoft recently published a set of Interoperability Principles. Thinking about IE8’s behavior with these principles in mind, interpreting web content in the most standards compliant way possible is a better thing to do.

We think that acting in accordance with principles is important, and IE8’s default is a demonstration of the interoperability principles in action.

In other words, the IE team seems to have used recent Microsoft PR efforts to their, and our, advantage.

I’m relieved and glad on the one hand, and a little worried on the other.  It’s not like the issues I discussed, or Jeffrey wrote about, have gone away.  It’s just that the way in which they’re handled by IE has shifted—which in some ways is a huge difference.

I think what worries me most is the possibility that when the public beta hits, there will be enough incompatibility problems that pushback from other constituencies forces a change back to the original behavior.  I hope not.  I hope that what will happen is that any problems that come up will be addressed by spreading the news far and wide that there’s a simple one-line fix for those sites.
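
For the record, the one-line fix in question is just the version-targeting switch pointed backward: a single meta element in the head of an affected site telling IE8 to keep rendering as IE7 does.  (There’s an equivalent X-UA-Compatible HTTP header for anyone who’d rather handle it server-side, and the exact token could still shift before the final release.)

<meta http-equiv="X-UA-Compatible" content="IE=7">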

I’m glad that IE will act as browsers have always done, and default to the latest and greatest in the absence of any explicit direction to the contrary.  I’m doubly glad that the IE team is willing to do that, even knowing what they have to handle.  And I’m triply glad that the proposal was made in public ahead of time, with plenty of opportunity for debate, so that we could have a chance to weigh in and affect the browser’s behavior.


Common Bonds

Published 16 years, 9 months past

A List Apart #253 brings the issue of version targeting back into the limelight with opposing-view pieces by Jeremy Keith and Jeffrey Zeldman.  (And I love the “Editor’s Choice” on this issue, J. David Eisenberg’s “‘Forgiving’ Browsers Considered Harmful”.)

I’m not going to comment on the views presented; both gentlemen do a fine job.  What I do wish to add, or perhaps to restate, is an observation about everyone interested in, and thinking or arguing about, this topic:

We all care about the same thing.

We all want to advance web standards.  We all want browsers to improve their support.  We all want better and more advanced specifications.  We all want to reduce inconsistencies.  We all want a better web.

The disagreement is over how best to get there given the situation we face now, as well as how we perceive that current situation.  A recurrent metaphor for me is that we’re a large group of pioneers trying to chart the best course through an unknown country, and there is disagreement on which route entails the least risk to the whole group.  Cross the desert or the mountains?  Traverse a swampy delta or a hilly forest?  Move through this valley or that one?

Sometimes what binds us is strong enough that the few differences seem sharper by comparison.  That shouldn’t keep us from remembering what we have in common, and the importance of that commonality.


Manhattan Problem

Published 16 years, 9 months past

It’s not every day I uncover a case involving the botched theft of information about nuclear weapons.

Here’s how it went down: in the infosthetics feed was an entry about a video regarding nuclear stockpiles around the world and the effects of a nuclear explosion in New York City.  The video was produced by Chimp on a Chain for Good Magazine.

That’s a long-standing area of interest for me, so I watched it.  When I got to the New York City portion, something started to bother me beyond the obvious horror of the scenario.  The point of detonation, the explosive yield, the elapsed-time intervals, the radius distances—all seemed very familiar, like I’d seen them somewhere before.  And I had.

They were nearly all taken verbatim from the New York City scenario found at the Atomic Archive.  I could find only two differences.  The first is that the total death toll given in the video is slightly higher than that in the Atomic Archive’s scenario.  Otherwise, all the numbers matched up.

The second difference is really a major error on the part of the video’s makers: they dramatically under-represent the areas of damage.  For example, the radius of the ten-second ring (found at 2:33 in the movie) is labeled with the correct distance (2.5 miles), but the circle placed on the map is much, much too small to be 2.5 miles in radius.  The circle doesn’t even cover the breadth of Manhattan Island, whereas an accurate plot would have it stretch across the Hudson River into New Jersey on one side and across the East River into Long Island on the other.  You can see this in part 5 of the Atomic Archive’s scenario, or on a HYDEsim plot of the same scenario.

The video seriously misrepresents the area of damage that would result from such an incident, making it appear much smaller than it would be, and I just can’t fathom how or why they would get that so wrong.  Even assuming they mixed up the meanings of “radius” and “diameter” doesn’t appear to explain it.  The ring distances shown correspond to a three-kiloton explosion at most, not to 150KT.
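
That “three kilotons” figure isn’t a wild guess, either.  Blast-effect radii scale roughly with the cube root of yield, so you can back an implied yield out of the circle as drawn.  Here’s the arithmetic as a quick script; note that the 0.7-mile figure is my own eyeballed estimate of the drawn radius, not a measured value.

// Overpressure radii scale roughly with the cube root of yield, so a ring
// that should be 2.5 miles in radius at 150 kilotons, but is drawn at
// roughly 0.7 miles, implies a far smaller yield.
var statedYieldKT = 150;
var statedRadiusMiles = 2.5;
var drawnRadiusMiles = 0.7;   // eyeballed from the video; an assumption

var impliedYieldKT = statedYieldKT * Math.pow(drawnRadiusMiles / statedRadiusMiles, 3);

alert(impliedYieldKT.toFixed(1) + " kT");   // about 3.3 kT -- a few kilotons, not 150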

That’s the botched part.  So where’s the theft?  There is no credit whatsoever given in the video for the material’s source.  There is a reference to the Archive on the video’s page at Good in the “Resources” box, but the material in the video was used without the permission required by the site’s policy (I checked this with the custodian of the Archive).  Even if one could argue this is a case of not needing permission on non-profit grounds, attribution is still required.

It would almost be worth subscribing to Good so that 100% of my payment could go to the non-profit of my choice, as the site promises, except I’m limited to their choices of non-profits and none of them appear to be charged with educating magazine publishers or video artists about the niceties of copyright law, intellectual property rights, or even just plain common courtesy.

