Archive: 'Web' Category

Ramping Up

We were driving back home from our impromptu surprise family vacation in Tennessee, winding our way through the Appalachian Mountains, when I pointed out a long, steep ramp to nowhere branching off the side of the highway.  “What do you think it’s for?” I asked the kids.

They made some guesses, some quite clever, but none correct.  So I told them about runaway truck ramps and how they work.  I think they were vaguely interested for a few seconds; I got a well-isn’t-that-interesting grunt, which I’ll take as a win.  We swept on past, the kids went back to whatever they were doing before I’d interrupted them, and I kept my eyes on the road.

But I was still thinking about the runaway truck ramp, and how it’s a perfect physical example of designing for crisis.

I also wondered about the history of runaway ramps—when they were first implemented, and how many runaway vehicles crashed before the need was recognized and a solution found.  After I got home, I looked it up and discovered that ramps didn’t really exist until the 1970s or so.  Even if we assume that no vehicles lost control in the U.S. until the Eisenhower Interstate System was established in the 1950s (just go with it), that’s still two decades of what were probably some pretty horrible crashes, before a solution was implemented.

This is not to say that the ramps are a perfect solution.  A runaway vehicle can certainly crash before reaching the next ramp, and using a ramp is likely to damage the vehicle even under the best of circumstances.  A badly-designed ramp can be almost as dangerous as no ramp at all.  Still, a solution exists.

I feel like web design is at the pre-ramp phase.  We’ve created a huge, sprawling system that amplifies commerce and communication, but we haven’t yet figured out how to build in some worst-case-scenario features that don’t interfere with the main functioning of the system.  We’ve laid down the paths and made some of them look pretty or even breathtaking, but we’re still not dealing with the crashes that happen when an edge case comes onto our stretch of the road.

I’m trying really hard to avoid “information superhighway” clichés here, by the way.

I’ve been pondering whether to incorporate this particular example into my 2015 talk, “Designing for Crisis”—much will depend on how the talk stands after I go back through it one more time to tighten it up, and start rehearsing again.  If there’s room and a good hook, I’ll add it in as a brief illustration.  If not, that’s okay too.  It’s still given me another way to look at designing for crisis, and how that topic fits into the broader theme that the Facebook imbroglio brought to light.

I’m still trying to get a good handle on what the broader theme is, exactly.  “Designing for Crisis” is a part of it, but just a part.  Several people have told me I should turn that talk into a book, but it never quite felt like a book.  Sure, I could have stretched it to fill a book, but something was missing, and I knew it.  I thought there was a hole in the idea that I needed to identify and fill; instead, the idea was filling a hole in a context I hadn’t seen.

Now I have.  It will take some time to see all of it, or even just more of it, but at least now I know it’s there and waiting to be explored and shared.

Well, That Escalated Quickly

This post is probably going to be a little bit scattered, because I’m still reeling from the overwhelming, unexpected response to the last post.  I honestly expected “Inadvertent Algorithmic Cruelty” to be read by maybe two or three hundred people over the next couple of weeks, all of them friends, colleagues, and friends who are colleagues.  I hoped that I’d maybe give a few of them something new and interesting to think about, but it was really mostly just me thinking out loud about a shortcoming in our field.  I never expected widespread linking, let alone mainstream media coverage.

So the first thing I want to say: I owe the Year in Review team in specific, and Facebook in general, an apology.  No, not the other way around.  I did get email from Jonathan Gheller, product manager of the Year in Review team at Facebook, before the story started hitting the papers, and he was sincerely apologetic.  Also determined to do better in the future.  But I am very sorry that I dropped the Internet on his head for Christmas.  He and his team didn’t deserve it.

(And yes, I’ve reflected quite a bit on the irony that I inadvertently made their lives more difficult by posting, after they inadvertently made mine more difficult by coding.)

Yes, their design failed to handle situations like mine, but in that, they’re hardly alone.  This happens all the time, all over the web, in every imaginable context.  Taking worst-case scenarios into account is something that web design does poorly, and usually not at all.  I was using Facebook’s Year in Review as one example, a timely and relevant foundation to talk about a much wider issue.

The people who I envisioned myself writing for—they got what I was saying and where I was focused.  The very early responses to the post were about what I expected.  But then it took off, and a lot of people came into it without the context I assumed the audience would have.

What surprised and dismayed me were the…let’s call them uncharitable assumptions made about the people who worked on Year in Review.  “What do you expect from a bunch of privileged early-20s hipster Silicon Valley brogrammers who’ve never known pain or even want?” seemed to be the general tenor of those responses.

No.  Just no.  This is not something you can blame on Those Meddling Kids and Their Mangy Stock Options.

First off, by what right do we assume that young programmers have never known hurt, fear, or pain?  How many of them grew up abused, at home or school or church or all three?  How many of them suffered through death, divorce, heartbreak, betrayal?  Do you know what they’ve been through?  No, you do not.  So maybe dial back your condescension toward their lived experiences.

Second, failure to consider worst-case scenarios is not a special disease of young, inexperienced programmers.  It is everywhere.

As an example, I recently re-joined ThinkUp, a service I first used when it was install-yourself-and-good-luck alpha ware, and I liked it then.  I’d let it fall by the wayside, but the Good Web Bundle encouraged me to sign up for it again, so I did.  It’s a fun service, and it is specifically designed to “show how well you’re using your social networks at a more human level,” to quote their site.

So I started getting reports from ThinkUp, and one of the first was to tell me about my “most popular shared link” on Twitter.  It was when I posted a link to Rebecca’s obituary.

“Popular” is maybe not the best word choice there.

Admittedly, this is a small wrinkle, a little moment of content clashing with context, and maybe there isn’t a better single word than “popular” to describe “the thing you posted that had the most easily-tracked response metrics”.  But the accompanying copy was upbeat, cheery, and totally didn’t work.  Something like, “You must be doing something right—people loved what you had to say!”

This was exactly what Facebook did with Year in Review: found the bit of data that had the most easily-tracked response metrics.  Facebook put what its code found into a Year in Review “ad”.  ThinkUp put what its code found into a “most popular” box.  Smaller in scale, but very similar in structure.

I’m not bringing this up to shame ThinkUp, and I hope I haven’t mischaracterized them here.  If they haven’t found solutions yet, I know they’re trying.  They really, really care about getting this right.  In fact, whenever I’ve sent them feedback, the responses have been fantastic—really thoughtful and detailed.

My point is that ThinkUp is a product of two of the smartest and most caring people I know, Gina Trapani and Anil Dash.  Neither of them comes anywhere close to fitting the Young Brogrammer stereotype; they are, if anything, its antithesis, in both form and deed.  And yet, they have fallen prey to exactly the same thing that affected the Year in Review team: a failure to anticipate how a design decision that really worked in one way completely failed in another, and work to handle both cases.  This is not because they are bad designers: they aren’t.  This is not because they lack empathy: they don’t.  This is not because they ignored their users: they didn’t.  This is such a common failure that it’s almost not a failure any more.  It just… is.

We need to challenge that “is”.  I’ve fallen victim to it myself.  We all have.  We all will.  It will take time, practice, and a whole lot of stumbling to figure out how to do better, but it is, I submit, vitally important that we do.

Inadvertent Algorithmic Cruelty

I didn’t go looking for grief this afternoon, but it found me anyway, and I have designers and programmers to thank for it.  In this case, the designers and programmers are somewhere at Facebook.

I know they’re probably pretty proud of the work that went into the “Year in Review” app they designed and developed, and deservedly so—a lot of people have used it to share the highlights of their years.  Knowing what kind of year I’d had, though, I avoided making one of my own.  I kept seeing them pop up in my feed, created by others, almost all of them with the default caption, “It’s been a great year! Thanks for being a part of it.”  Which was, by itself, jarring enough, the idea that any year I was part of could be described as great.

Still, they were easy enough to pass over, and I did.  Until today, when I got this in my feed, exhorting me to create one of my own.  “Eric, here’s what your year looked like!”


A picture of my daughter, who is dead.  Who died this year.

Yes, my year looked like that.  True enough.  My year looked like the now-absent face of my little girl.  It was still unkind to remind me so forcefully.

And I know, of course, that this is not a deliberate assault.  This inadvertent algorithmic cruelty is the result of code that works in the overwhelming majority of cases, reminding people of the awesomeness of their years, showing them selfies at a party or whale spouts from sailing boats or the marina outside their vacation house.

But for those of us who lived through the death of loved ones, or spent extended time in the hospital, or were hit by divorce or losing a job or any one of a hundred crises, we might not want another look at this past year.

To show me Rebecca’s face and say “Here’s what your year looked like!” is jarring.  It feels wrong, and coming from an actual person, it would be wrong.  Coming from code, it’s just unfortunate.  These are hard, hard problems.  It isn’t easy to programmatically figure out if a picture has a ton of Likes because it’s hilarious, astounding, or heartbreaking.

Algorithms are essentially thoughtless.  They model certain decision flows, but once you run them, no more thought occurs.  To call a person “thoughtless” is usually considered a slight, or an outright insult; and yet, we unleash so many literally thoughtless processes on our users, on our lives, on ourselves.

Where the human aspect fell short, at least with Facebook, was in not providing a way to opt out.  The Year in Review ad keeps coming up in my feed, rotating through different fun-and-fabulous backgrounds, as if celebrating a death, and there is no obvious way to stop it.  Yes, there’s the drop-down that lets me hide it, but knowing that is practically insider knowledge.  How many people don’t know about it?  Way more than you think.

This is another aspect of designing for crisis, or maybe a better term is empathetic design.  In creating this Year in Review app, there wasn’t enough thought given to cases like mine, or friends of Chloe, or anyone who had a bad year.  The design is for the ideal user, the happy, upbeat, good-life user.  It doesn’t take other use cases into account.

Just to pick two obvious fixes: first, don’t pre-fill a picture until you’re sure the user actually wants to see pictures from their year.  And second, instead of pushing the app at people, maybe ask them if they’d like to try a preview—just a simple yes or no.  If they say no, ask if they want to be asked again later, or never again.  And then, of course, honor their choices.

It may not be possible to reliably pre-detect whether a person wants to see their year in review, but it’s not at all hard to ask politely—empathetically—if it’s something they want.  That’s an easily-solvable problem.  Had the app been designed with worst-case scenarios in mind, it probably would have been.
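To make the shape of that fix concrete, here’s a minimal sketch of the ask-first flow described above.  All the names here are illustrative, not anything from Facebook’s actual code; the point is simply that the user’s answer, including “never ask again,” becomes stored state that gates whether the feature ever appears.

```python
from enum import Enum


class Preference(Enum):
    """A user's recorded answer about the year-in-review feature."""
    UNDECIDED = "undecided"   # never asked yet
    ASK_LATER = "ask_later"   # declined, but open to being asked again
    NEVER = "never"           # declined, and asked not to be asked again
    OPTED_IN = "opted_in"     # explicitly said yes


def should_show_review(pref: Preference) -> bool:
    """Only surface review content to users who explicitly opted in."""
    return pref is Preference.OPTED_IN


def should_ask(pref: Preference) -> bool:
    """Only prompt users who haven't decided, or who said 'maybe later'."""
    return pref in (Preference.UNDECIDED, Preference.ASK_LATER)


def record_choice(wants_preview, ask_again_later=None):
    """Translate the user's yes/no answers into a stored preference.

    If they decline, a follow-up answer decides whether we may ask again.
    """
    if wants_preview:
        return Preference.OPTED_IN
    return Preference.ASK_LATER if ask_again_later else Preference.NEVER
```

The essential property is the last one in the post above: once the user answers, the code honors that answer on every later visit, rather than re-pushing the feature into their feed.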

If I could fix one thing about our industry, just one thing, it would be that: to increase awareness of and consideration for the failure modes, the edge cases, the worst-case scenarios.  And so I will try.

Note: There is a followup to this post that clarifies my original intent, among other things.

A slightly revised and updated version of this post was published at Slate.

Finding My Way

With presentations of “Designing for Crisis” at AEA Orlando and World Usability Day Cleveland now behind me, I’m getting into the process of reviewing and refining the talk for 2015.  This will be my talk at An Event Apart all throughout this year, making me one of the rare AEA speakers who won’t have a brand-new talk in 2015.  (We’ll have a mix of new and familiar faces, as we always try to do, and they’ll all be bringing new material to the stage.)

Even “Designing for Crisis” will have some new aspects to it, as I discover ways to strengthen it and loop in some new thoughts and discoveries.  As an example, I just recently had a great chat with Amy Cueva, who gave me some really sharp insights into how I can share the message even more effectively.  I expect that kind of iterative improvement to continue throughout the year, given how new the topic is to me, and possibly to everyone.  It’s been something of a surprise to have many people tell me it’s caused them to see their own work in a whole new light—even people working in fields where you might think they would already be on top of this.  I’m really excited to bring this talk to people at AEA, and elsewhere as opportunities arise.  I hope it will do some good in the world.

In parallel with that ongoing effort, I’m getting back to writing more than just the occasional blog post.  I’ve restarted work on the fourth edition of CSS: The Definitive Guide—details on that will be forthcoming just after the holidays.  I’m also starting to write down some of the thoughts and approaches in “Designing for Crisis”, as well as some nascent thoughts on network effects, responsibility, community, and guidance.  I’m also trying to teach myself git so I can push out public repositories of my CSS tests and some bits of code I’d like to release into the wild, but honestly that’s pretty slow going, because it’s always a fifth or sixth priority behind my family, working on AEA, refining and rehearsing the new talk, and writing.

(“Bits of code”.  SEE WHAT I DID THERE?)

Given everything that’s coming together, I really am looking forward to 2015 and a return to speaking and writing.  For painfully obvious reasons, I was pretty out of the loop for nearly all of 2014, not to mention the last half of 2013.  I tried to stay up to date, but it’s one thing to be in the middle of things, and quite another to observe things from a distance.  (The mosh pit never looks like it feels, you know?)  So in addition to all the other stuff, I’m working overtime to catch up, and that’s where I could really use some help from the community.

So, tell me: what did I miss?  What’s emerging that I should be (or should already have been) paying attention to, and what am I already behind the curve on?  What has you excited, and what sounds so awesome that you’re hungering to know more about it?  And maybe most important of all, where should I be going to get caught up?

All input welcome, whether here in the comments, or out there on les médias sociaux.  And thank you!

Blue Beanie Day 2014

This past Sunday was Blue Beanie Day, the annual celebration of web standards that was established by Douglas Vos, taking as his inspiration the cover of Jeffrey Zeldman’s field-defining book, Designing With Web Standards.  This year’s was the eighth annual celebration, and to mark the occasion, I replaced my purple-infused Twitter and Facebook avatars to sport blue beanies.

That’s how much web standards mean to me.

If you missed Blue Beanie Day—which, it being the Sunday of a major U.S. holiday weekend, many of you may well have—don’t let that stop you!  Drop a cerulean toque on your social-media avatars, make a quick status update about why, and wear your pride in your craft and your love of the web on your sleeve.  Head.  Whatever.

If you don’t have a beanie ready to hand, then here, feel free to use one of these.

[three blue beanie avatar images]

Every day is web standards day, of course, but Blue Beanie Day comes but once a year.  It’s not too late to mark the occasion.  As Ethan says, toque ‘em if you got ‘em!

The Light of Other Days

Every day or three, I upload another batch of photos to Flickr, trying to work my way through the backlog and get caught up with the present.  This is a habit I enforce inconsistently, because I’m bad at maintaining regular habits even at the best of times.  That halfway explains the backlog.  When I do enforce it, my habit is to upload no more than 10 or 15 photos at a time, so that I can properly tag and geolocate them without having to invest hours in the process.  That explains the other half of the backlog.  Right now, as I write this, I’m about six weeks behind.

Which means that yesterday, I uploaded the first half of the pictures from Rebecca’s sixth birthday party.  It’s been over five weeks now since she died, but in the Flickrverse, she still has six days to live.  She’s still tired but essentially herself, riding the Rocket Car and eating mini-donuts and chasing bubbles and hula hooping and blowing out the candles on her half of the enormous Frozen-themed cake shared with Ruth, her best friend in the whole world, the girl who shares her initials and whose birthday is only a few days apart from hers.

She still doesn’t know, none of us know, that the experimental medicine has failed and the tumor has been growing unchecked for weeks, compressing normal brain matter and now only days away from killing her.  Just two days after her birthday party, an MRI will reveal the horrible truth, but in the Flickrverse, that day has not yet come.

Flickr and my laptop combine to become a digital slow glass, bathing me in the light of days past.  I look at those pictures, tag them, adorn them with metadata, sort some into albums, and all the while I remember how we felt that day.  We were worried, Kat and I, but we still had hope.  Everyone there still hoped that she’d find a way to survive, and that hope was not unreasonable.

And so the party was not a wake for a still-living child, but a joyful celebration of her life and the simple fact that she’d lived long enough and well enough to enjoy the party.  There had been times in the previous few weeks that we’d thought she wouldn’t make it that far.  Had we held the party six days later, on her actual birthday, as originally planned, she wouldn’t have.

We didn’t know that then, but I know that now.  As I witness those days past, trying to taste some trace of what life was like then, I also have the horrible foreknowledge of what will happen in the days to come.  I know without question that the MRI will happen, that the news will be dire.  That she will sink into herself and lose so much of what we fought so hard to preserve, and that it will be lost quickly, in the span of a few days.  That we will believe she is leaving us the day before she actually does, and be surprised when she wakes and has a semi-normal evening, believing when that happens that she has a week or two left.  That the next day, the week will end with her actual birthday, the day that shatters us, the day she dies.

Today or tomorrow, I’ll upload the second half of the party photos, and her birthday party will once more be over and that final week will once more begin.  I could stop there, just walk away from uploading forever, and a large part of me cries out to do exactly that—but doing so would arrest more than just the glacially slow expansion of my Flickr account.  If I allow myself to stop there, arrested in the days when we could still feel hope, it will be that much harder to reconcile the past and present.  Without that reconciliation, it is very likely I will never feel hope again.

For myself and my future, the future we were unable to bring her into but must inhabit anyway, I have to keep going.  I have to upload the photos of that last week, relive the horror and anguish, the moments I captured as well as the moments I didn’t but will never be able to forget.  I have to let her go again.

And so the light keeps coming through the slow glass we’ve built, emerging from distributed panes aglow with the light of other days, pushing closer and closer to the unwelcome present.

The Web At 25

The Web is celebrating its 25th anniversary today, taking as its starting point the March 1989 publication of “Information Management: A Proposal”.  I was honored to contribute a small greeting to the Greetings page over at The Web At 25.  Following on that, I wanted to add a few more words here, mostly about my own Web history, because the Web is nothing if not a vast collection of all of us sharing ourselves.

I was first exposed to the Web in mid- to late 1993 by my friend and (then) co-worker, Jim Nauer, and it instantly caught my imagination.  I’d worked on some hypertext systems before, including a summer spent on a DOS-based hypertext system whose name now escapes me that was used to mark up the Ohio Legal Code on CD-ROM for a publisher named Banks-Baldwin, now a division of Thomson Reuters.  This Web thing, though, this was something altogether different and more powerful.  By late fall I’d gotten my hands on a paper copy of the HTML 2.0 specification and on December 3rd, 1993, I finished marking up my first document: the Incomplete Mystery Science Theater 3000 Episode Guide.

At the time, I was a hardware jockey for the Library Information Technologies department at Case Western Reserve University, swapping out bad SIMM chips in online catalog terminals and maintaining a database of equipment serial numbers.  So in my downtime between service calls and database updates, I had the freedom to install Mosaic betas and start surfing around to see what there was to be seen.  My increasing obsession with the Web eventually led me to become Webmaster of CWRU’s first “pure” Web site.  (Before that, there was an HTTP interface to our Gopher server.)  And as part of that, I published tutorials and compatibility charts and spent a lot of time on Usenet and mailing lists dedicated to this new Web thing.

I do remember the moment that the Web blew me away a second time, and it’s a moment of total coincidence, which is of course why I remember it.  On April 3rd, 1996, I discovered (I forget exactly how) that CNN had a Web site, and I was astonished—a news network taking the Web seriously?  Really?  So I loaded it up, and the top headline was “RON BROWN KILLED IN PLANE CRASH” or words to that effect.  We turned on a radio, and there was nothing about the crash for at least an hour, maybe more, and of course newspapers wouldn’t have anything to say until morning, and I remember thinking: What is wrong with these other channels, that they’re so slow and unresponsive?  That was my first direct glimpse of the future of information velocity, something that permanently altered my instincts.

Over the years, the Web has obviously been good to me, and I’ve tried to be good to it in return.  The original Internet aesthetic of sharing what you know and making use of what others share, one that carried onto the early Web, has always resonated with me, as did the obvious simplicity (and thus robustness) of the Web itself.  As simple as possible, and no simpler; small pieces loosely joined; openness to all—these are principles I held dear and which the Web has always embodied.  Which means that the Web helped me maintain those principles, over these past two decades, by showing that they can and do work.

As I said in my greeting for The Web at 25:

The web is the most human information system we have ever seen and that may ever be, open to anyone with the interest to build something, gargantuan and riotous and everything we are and hope to be. It’s been a privilege just to witness its emergence, let alone play a part in it.

I suppose I could have just posted that here, and skipped the lengthy reminiscing, but what fun would that be?

Resurrected Landmarks

It was just last week, at the end of April, that CERN announced the rebirth of The Very First URL, in all its responsive and completely presentable glory.  If you hit the root level of the server, you get some wonderful information about the Web’s infancy and the extraordinary thing CERN did in releasing it, unencumbered by patent or licensing restrictions, into the world, twenty years ago.

That’s not at all a minor point.  I don’t believe it overstates the case to say that if CERN hadn’t made the web free and open to all, it wouldn’t have taken over the net.  Like previous attempts at hypertext and similar information systems, it would have languished in a niche and eventually withered away.  There were other things that had to happen for the web to really take off, but none of them would have mattered without this one simple, foundational decision.

I would go even further and argue that this act infused the web, defining the culture that was built on top of it.  Because the medium was free and open, as was often the case in academic and hacker circles before it, the aesthetic of sharing freely became central to the web community.  The dynamic of using ideas and resources freely shared by others, and then freely sharing your own resources and ideas in return, was strongly encouraged by the open nature of the web.  It was an implicit encouragement, but no less strong for that.  As always, the environment shapes those who live within it.

It was in that very spirit that Dave Shea launched the CSS Zen Garden ten years ago this week.  After letting it lie fallow for the last few years, Dave has re-opened the site to submissions that make use of all the modern capabilities we have now.

It might be hard to understand this now, but the Zen Garden is one of the defining moments in the history of web design, and is truly critical to understanding the state of CSS before and after it debuted.  When histories of web design are written—and there will be—there will be chapters titled things like “Wired, ESPN, and the Zen Garden: Why CSS Ended Up In Everything”.

Before the Zen Garden, CSS was a thing you used to color text and set fonts, and maybe for a simple design, not for “serious” layout.  CSS design is boxy and boring, and impossible to use for anything interesting, went the conventional wisdom.  (The Wired and ESPN designs were held to be special cases.)  Then Dave opened the gates on the Zen Garden, with its five utterly different designs based on the very same document…and the world turned.

I’m known to be a history buff, and these days a web history buff, so of course I’m super-excited to see both these sites online and actively looked after, but you should be too.  You can see where it all started, and where a major shift in design occurred, right from the comfort of your cutting-edge nightly build of the latest and greatest browsers known to man.  That’s a rare privilege, and a testimony to what CERN set free, two decades back.
