
You’re welcome: cutting the mustard then and now.

EVERY TIME I hear a young web developer cite the BBC’s forward-thinking practice of “cutting the mustard,” by which they mean testing a receiving web device for certain capabilities before serving content, I remember when my team and I at The Web Standards Project invented that very idea. It’s a million web years ago, by which I mean fourteenish human years ago, so nobody remembers but me and some other long-toothed grayhairs, plus a few readers of the first edition of Designing With Web Standards. But I like you, so I will tell you the story.

Back then in those dark times, it was common practice for web developers to create four or more versions of the same website—one for each browser then in wide use. It was also a typical (and complementary) practice to send server-side queries to figure out which browser was about to access a site’s content, and then send the person using that browser to the site version that was configured for her browser’s particular quirks, proprietary tags, and standards compliance failings.

The practice was called “browser detection.” Nobody but some accessibility advocates had ever questioned it—and the go-go dot-com era had no time or care for those folks.
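
Whether done on the server or in the browser, the sniffing looked something like this client-side sketch; the user-agent checks and the per-browser URLs are illustrative, not taken from any real site:

// Illustrative only: route visitors by guessing the browser from its
// user-agent string, one destination per browser-specific site version.
var ua = navigator.userAgent;
if (ua.indexOf("MSIE 4") != -1) {
  window.location = "/msie4/index.html";      // hypothetical IE-only version
} else if (ua.indexOf("Mozilla/4") != -1) {
  window.location = "/netscape4/index.html";  // hypothetical Netscape version
}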

But we at The Web Standards Project turned everything on its head. We said browsers should support the same standards instead of competing to invent new tags and scripting languages. We said designers, developers, and content folks should create one site that was accessible to everyone. In a world like that, you wouldn’t need browser detection, because every browser and device that could read HTML would be able to feast on the meat of your site. (And you’d have more meat to share, because you’d spend your time creating content instead of crafting multiple versions of the same site.)

To hasten that world’s arrival, in 2001 we launched a browser upgrade campaign. Those who participated (example participant here) employed our code and content to send their users the message that relatively standards-compliant browsers were available for every platform, and inviting them to try one. Because if more people used relatively standards-compliant browsers, then we could urge more designers and developers to create their sites with standards (instead of quirks). And as more designers and developers did that, they’d bump against still-unsolved standards compliance conundrums, enabling us to persuade browser makers to improve their standards compliance in those specific areas. Bit by bit, stone by stone, this edifice we could, and would, erect.

The code core of the 2001 browser upgrade campaign was the first instance of capability detection in place of browser detection. Here’s how it worked. After creating a valid web page, you’d insert this script in the head of your document or somewhere in your global JavaScript file:

if (!document.getElementById) {
  window.location =
    "http://www.webstandards.org/upgrade/"
}

We even provided details for various flavors of markup. In HTML 4 or XHTML 1 Transitional documents, it looked like this:

<script type="text/javascript" language="javascript">
<!-- //
if (!document.getElementById) {
window.location =
"http://www.webstandards.org/upgrade/"
}
// -->
</script>

In STRICT documents, you’d either use a global .js file, or insert this:

<script type="text/javascript">
<!-- //
if (!document.getElementById) {
window.location =
"http://www.webstandards.org/upgrade/"
}
// -->

You could also just as easily send visitors to an upgrade page on your own site:

if (!document.getElementById) {
  window.location =
    "http://www.yourdomain.com/yourpage.html"
}

Non-WaSP members (at the time) J. David Eisenberg, Tantek Çelik, and Jim Heid contributed technical advice and moral support to the effort. WaSP sysadmin Steven Champeon, the inventor of progressive enhancement, made it all work—under protest, bless him. (Steve correctly believed that all web content should always be available to all people and devices; therefore, in principle, he disliked the upgrade campaign, even though its double purpose was to hasten the arrival of truly standards-compliant browsers and to change front-end design and development from a disrespected world of hacks to a sustainable and professional craft. ((See what I did there? I’m still respectfully arguing with Steve in my head.)))
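
To make Steve’s side of that argument concrete, here is a minimal sketch (mine, not the campaign’s) of the progressive-enhancement alternative: run the same capability test, but use it to layer extras onto a page every visitor already has, rather than redirecting anyone away from the content. The "siteNav" id is just an example:

// Progressive enhancement: everyone gets the content; capable browsers get more.
if (document.getElementById) {
  var nav = document.getElementById("siteNav"); // example id, not from the campaign
  if (nav) {
    nav.className += " enhanced"; // styling hook for script-capable browsers
  }
}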

Discovering rudimentary DOM awareness or its absence in this fashion was the first time web developers had tested for capabilities instead of chasing the dragon in a perpetual and futile attempt to test for every possible browser flavor and version number. It was the grandparent, if you will, of today’s “cutting the mustard.” And it is analogous as well to the sensible responsive design practice of setting breakpoints for the content, instead of trying to set appropriate breakpoints for every possible device out there (including all the ones that haven’t been invented yet).
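
For comparison, a present-day mustard cut looks something like this. The sketch is illustrative rather than the BBC’s code verbatim, the three feature tests are simply common choices for separating older browsers from newer ones, and the script path is an assumption:

// Cutting the mustard, present day: test capabilities, not browser names.
if ('querySelector' in document &&
    'localStorage' in window &&
    'addEventListener' in window) {
  // This browser cuts the mustard: lazy-load the enhanced experience.
  // Browsers that fail the test still get the core HTML content.
  var enhanced = document.createElement("script");
  enhanced.src = "/js/enhanced.js"; // hypothetical path
  document.getElementsByTagName("head")[0].appendChild(enhanced);
}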

Which reminds us that the whole point of web standards was and is forward compatibility—to create content that will work not only in yesterday’s and today’s browsers and devices, but in all the wonderful devices that have yet to be invented, and for all the people of the world. You’re welcome.

—CHICAGO, Westin Chicago River Hotel, 1 September 2015


Hat tip: John Morrison


Blue Beanie Day Haiku Contest – Win Prizes from Peachpit and A Book Apart

ATTENTION, web design geeks, contest fans, standards freaks, HTML5ophiles, CSSistas, grammarians, bookworms, UXers, designers, developers, and budding Haikuists. Can you do this?

Do not tell me I
Am source of your browser woes.
Template validates.

Write a web standards haiku (like that one), and post it on Twitter with the hashtag #bbd4 between now and November 30th—which happens to be the fourth international Blue Beanie Day in support of Web Standards.

Winning haikus will receive free books from Peachpit/New Riders (“Voices That Matter”) and A Book Apart.

Ethan Marcotte, co-author of Designing With Web Standards 3rd Edition, and I will determine the winners.

Enter as many haikus as you like. Sorry, only one winning entry per person. Now get out there and haiku your heart out!

See you on Blue Beanie Day.

P.S. An ePub version of Designing With Web Standards 3rd Edition is coming soon to a virtual bookstore near you. Watch this space.


Am I Blue


Our classic orange avatar has turned blue to celebrate the release of Designing With Web Standards 3rd Edition by Jeffrey Zeldman with Ethan Marcotte. This substantial revision to the foundational web standards text will be in bookstores across the U.S. on October 19, 2009, with international stores to follow. Save 37% off the list price when you buy it from Amazon.com.

Short URL: zeldman.com/?p=2730


Chicago Deep Dish

Dan Cederholm and Eric Meyer at An Event Apart Chicago 2009. Photo by John Morrison.

For those who couldn’t be there, and for those who were there and seek to savor the memories, here is An Event Apart Chicago, all wrapped up in a pretty bow:

AEA Chicago – official photo set
By John Morrison, subism studios llc. See also (and contribute to) An Event Apart Chicago 2009 Pool, a user group on Flickr.
A Feed Apart Chicago
Live tweeting from the show, captured forever and still being updated. Includes complete blow-by-blow from Whitney Hess.
Luke W’s Notes on the Show
Smart note-taking by Luke Wroblewski, design lead for Yahoo!, frequent AEA speaker, and author of Web Form Design: Filling in the Blanks (Rosenfeld Media, 2008):

  1. Jeffrey Zeldman: A Site Redesign
  2. Jason Santa Maria: Thinking Small
  3. Kristina Halvorson: Content First
  4. Dan Brown: Concept Models - A Tool for Planning Websites
  5. Whitney Hess: DIY UX - Give Your Users an Upgrade
  6. Andy Clarke: Walls Come Tumbling Down
  7. Eric Meyer: JavaScript Will Save Us All (not captured)
  8. Aaron Gustafson: Using CSS3 Today with eCSStender (not captured)
  9. Simon Willison: Building Things Fast
  10. Luke Wroblewski: Web Form Design in Action (download slides)
  11. Dan Rubin: Designing Virtual Realism
  12. Dan Cederholm: Progressive Enrichment With CSS3 (not captured)
  13. Three years of An Event Apart Presentations

Note: Comment posting here was a bit wonky for a while as we tracked down the cause; normal commenting has been restored. Thank you, Noel Jackson.

Short URL: zeldman.com/?p=2695


Why Standards Fail

Back in 2000, CSS co-creator Bert Bos set out to explain the W3C’s design principles—“to make explicit what the developers in the various W3C working groups mean when they invoke words like efficiency, maintainability, accessibility, extensibility, learnability, simplicity, [and] longevity….”

Eventually published in 2003, the essay, although ostensibly concerned with explaining W3C working group principles to the uninitiated, actually articulates the key principle that separates great design from the muck we normally wade through. It also serves as a warning to Bert’s fellow W3C wizards not to seek the dark magic of abstract purity at the expense of the common good. Tragically for these wizards, and for us who use their technologies, it is a warning some developers of W3C specifications continue to overlook.

Design is for people

In his introduction, Bert summarizes the humanistic value that is supposed to be at the core of every web standard:

Contrary to appearances, the W3C specifications are for the most part not designed for computers, but for people. … Most of the formats are in fact compromises between human-readability and computer efficiency….

But why do we want people to read them at all? Because all our specs are incomplete. Because people, usually other people than the original developers, have to add to them….

For the same reason we try to keep the specifications of reasonable size. They must describe a useful chunk of technology, but not one that is too large for an individual to understand.

Over the succeeding 25 web pages (the article is chunked out in pamphlet-sized pages, each devoted to a single principle such as “maintainability” and “robustness”), Bert clearly, plainly, and humbly articulates a series of rather profound ideas that are key to the web’s growth and that might apply equally well to realms of human endeavor beyond the web.

For instance, in the page entitled “Use What Is There,” Bert says:

The Web now runs on HTML, HTTP and URLs, none of which existed before the ’90s. But it isn’t just because of the quality of these new formats and protocols that the Web took off. In fact, the original HTTP was a worse protocol than, e.g., Gopher or FTP in its capabilities….

And that fact shows nicely what made the Web possible at all: it didn’t try to replace things that already worked, it only added new modules, that fit in the existing infrastructure. …

And nowadays (the year 2000), it may look like everything is XML and HTTP, but that impression is only because the “old” stuff is so well integrated that you forget about it: there is no replacement for e-mail or Usenet, for JPEG or MPEG, and many other essential parts of the Web.

He then warns:

There is, unfortunately, a tendency in every standards organization, W3C not excluded, to replace everything that was created by others with things developed in-house. It is the not-invented-here syndrome, a feeling that things that were not developed “for the Web” are somehow inferior. And that “we” can do better than “them.” But even if that is true, maybe the improvement still isn’t worth spending a working group’s resources on.

Shrinkage and seduction

In his gentle way, Bert seems to be speaking directly to his W3C peers, who may not always share his and Håkon’s humanism. For, despite what designers new to CSS (struggling for the first time with concepts like “float” and the box model) may think, Bert and Håkon designed the web’s layout language to be easy to learn, teach, implement, maintain, and (eventually) extend. They also designed CSS not to overwhelm the newcomer with advanced power at the cost of profound complexity. (“CSS stops short of even more powerful features that programmers use in their programming languages: macros, variables, symbolic constants, conditionals, expressions over variables, etc. That is because these things give power-users a lot of rope, but less experienced users will unwittingly hang themselves; or, more likely, be so scared that they won’t even touch CSS. It’s a balance.”)

This striving to be understood and used by the inexperienced is the underlying principle of all good design, from the iPhone to the Eames chair. It’s what Jared Spool would call usability and you and I may consider the heart of design. When anything new is created, be it a website, a service, or a web markup language, there is a gap between what the creator knows (which is everything about how it’s supposed to work), and what you and I know (which is nothing). The goal of design is to shrink this ignorance gap while seducing us into leaping across it.

What were once vices are now habits

You can see this principle at work in CSS, whose simplicity allowed us to learn it. Although we now rail against the limitations of CSS 1 and even CSS 2.1, what we are really complaining about is the slow pace of CSS 3 and the greater slowness with which browser makers (some more than others) adopt bits of it.

Note that at one time we would have railed against browser makers who implemented parts of a specification that was still under development; now we admire them. Note, too, that it has taken well over a decade for developers to understand and browsers to support basic CSS, and it is only from the perspective of the experienced customer who craves more that advanced web designers now cry out for immediate CSS 3 adoption and chafe against the “restrictions” of current CSS as universally supported in all browsers, including IE8.

If CSS had initially offered the power, depth, and complexity that CSS 3 promises, we would still be designing with tables or Flash. Even assuming a browser had existed that could demonstrate the power of CSS 3, the complexity of the specification would have daunted everyone but Eric Meyer, had CSS 1 not come out of the gate first.

The future of the future of standards

It was the practical simplicity of CSS that enabled browser engineers to implement it and tempted designers to use (and then evangelize) it. In contrast, it was the seeming complexity and detachment from practical workaday concerns that doomed XHTML 2, while XHTML 1.0 remains a valid spec that will likely still be working when you and I have retired (assuming retirement will be possible in our lifetime—but that’s another story).

And yet, compared to some W3C specs in progress, XHTML 2 was a model of accessible, practical, down-to-earth usability.

To the extent that W3C specifications remain modular, practical, and accessible to the non-PhD in computer science, they will be adopted by browser makers and the marketplace. The farther they depart from the principles Bert articulated, the sooner they will peter out into nothingness, and the likelier we are to face a crisis in which web standards once again detach from the direction in which the web is actually moving, and the medium is given over to incompatible, proprietary technologies.

I urge everyone to read “What is a Good Standard?”, and I thank my friend Tantek for pointing it out to me.



Browser compatibility updates

DOM whiz and loyal-opposition web standards advocate Peter-Paul Koch has been working overtime preparing detailed findings on CSS and DOM compatibility in modern browsers, including a Compatibility Master Table that provides a snapshot of the status and results of all testing. Mobile compatibility tests are also in development.

It’s a great resource from an expert who really cares, and who has the time and expertise to find things out for the rest of us. Thanks, PPK!
