Why Standards Fail

Back in 2000, CSS co-creator Bert Bos set out to explain the W3C’s design principles—“to make explicit what the developers in the various W3C working groups mean when they invoke words like efficiency, maintainability, accessibility, extensibility, learnability, simplicity, [and] longevity….”

Eventually published in 2003, the essay, although ostensibly concerned with explaining W3C working group principles to the uninitiated, actually articulates the key principle that separates great design from the muck we normally wade through. It also serves as a warning to Bert’s fellow W3C wizards not to seek the dark magic of abstract purity at the expense of the common good. Tragically for these wizards, and for those of us who use their technologies, it is a warning some developers of W3C specifications continue to overlook.

Design is for people

In his introduction, Bert summarizes the humanistic value that is supposed to be at the core of every web standard:

Contrary to appearances, the W3C specifications are for the most part not designed for computers, but for people. … Most of the formats are in fact compromises between human-readability and computer efficiency….

But why do we want people to read them at all? Because all our specs are incomplete. Because people, usually other people than the original developers, have to add to them….

For the same reason we try to keep the specifications of reasonable size. They must describe a useful chunk of technology, but not one that is too large for an individual to understand.

Over the succeeding 25 web pages (the article is chunked out in pamphlet-sized pages, each devoted to a single principle such as “maintainability” and “robustness”), Bert clearly, plainly, and humbly articulates a series of rather profound ideas that are key to the web’s growth and that might apply equally well to realms of human endeavor beyond the web.

For instance, in the page entitled “Use What Is There,” Bert says:

The Web now runs on HTML, HTTP and URLs, none of which existed before the ’90s. But it isn’t just because of the quality of these new formats and protocols that the Web took off. In fact, the original HTTP was a worse protocol than, e.g., Gopher or FTP in its capabilities….

And that fact shows nicely what made the Web possible at all: it didn’t try to replace things that already worked, it only added new modules, that fit in the existing infrastructure. …

And nowadays (the year 2000), it may look like everything is XML and HTTP, but that impression is only because the “old” stuff is so well integrated that you forget about it: there is no replacement for e-mail or Usenet, for JPEG or MPEG, and many other essential parts of the Web.

He then warns:

There is, unfortunately, a tendency in every standards organization, W3C not excluded, to replace everything that was created by others with things developed in-house. It is the not-invented-here syndrome, a feeling that things that were not developed “for the Web” are somehow inferior. And that “we” can do better than “them.” But even if that is true, maybe the improvement still isn’t worth spending a working group’s resources on.

Shrinkage and seduction

In his gentle way, Bert seems to be speaking directly to his W3C peers, who may not always share his and Håkon’s humanism. For, despite what designers new to CSS, struggling for the first time with concepts like “float” and the box model, may think, Bert and Håkon designed the web’s layout language to be easy to learn, teach, implement, maintain, and (eventually) extend. They also designed CSS not to overwhelm the newcomer with advanced power at the cost of profound complexity. (“CSS stops short of even more powerful features that programmers use in their programming languages: macros, variables, symbolic constants, conditionals, expressions over variables, etc. That is because these things give power-users a lot of rope, but less experienced users will unwittingly hang themselves; or, more likely, be so scared that they won’t even touch CSS. It’s a balance.”)

This striving to be understood and used by the inexperienced is the underlying principle of all good design, from the iPhone to the Eames chair. It’s what Jared Spool would call usability and you and I may consider the heart of design. When anything new is created, be it a website, a service, or a web markup language, there is a gap between what the creator knows (which is everything about how it’s supposed to work), and what you and I know (which is nothing). The goal of design is to shrink this ignorance gap while seducing us into leaping across it.

What were once vices are now habits

You can see this principle at work in CSS, whose simplicity allowed us to learn it. Although we now rail against the limitations of CSS 1 and even CSS 2.1, what we are really complaining about is the slow pace of CSS 3 and the greater slowness with which browser makers (some more than others) adopt bits of it.

Note that at one time we would have railed against browser makers who implemented parts of a specification that was still under development; now we admire them. Note, too, that it has taken well over a decade for developers to understand and browsers to support basic CSS, and it is only from the perspective of the experienced customer who craves more that advanced web designers now cry out for immediate CSS 3 adoption and chafe against the “restrictions” of current CSS as universally supported in all browsers, including IE8.

If CSS had initially offered the power, depth, and complexity that CSS 3 promises, we would still be designing with tables or Flash. Even assuming a browser had existed that could demonstrate the power of CSS 3, the complexity of the specification would have daunted everyone but Eric Meyer, had CSS 1 not come out of the gate first.

The future of the future of standards

It was the practical simplicity of CSS that enabled browser engineers to implement it and tempted designers to use (and then evangelize) it. In contrast, it was the seeming complexity and detachment from practical workaday concerns that doomed XHTML 2, while XHTML 1.0 remains a valid spec that will likely still be working when you and I have retired (assuming retirement will be possible in our lifetime—but that’s another story).

And yet, compared to some W3C specs in progress, XHTML 2 was a model of accessible, practical, down-to-earth usability.

To the extent that W3C specifications remain modular, practical, and accessible to the non-PhD in computer science, they will be adopted by browser makers and the marketplace. The farther they depart from the principles Bert articulated, the sooner they will peter out into nothingness, and the likelier we are to face a crisis in which web standards once again detach from the direction in which the web is actually moving, and the medium is given over to incompatible, proprietary technologies.

I urge everyone to read “What is a Good Standard?”, and I thank my friend Tantek for pointing it out to me.

[tags]W3C, design, principles, bertbos, maintainability, accessibility, extensibility, learnability, simplicity, specs, standards, css, markup, code, languages, web, webdesign, webstandards, webdevelopment, essays[/tags]

Web fonts, HTML 5 roundup

Over the weekend, as thoughtful designers gathered at Typecon 2009 (“a letterfest of talks, workshops, tours, exhibitions, and special events created for type lovers at every level”), the subject of web fonts was in the air and on the digital airwaves. Worthwhile reading on web fonts and our other recent obsessions includes:

Jeffrey Zeldman Questions The “EOT Lite” Web Font Format

Responding to a question I raised here in comments on Web Fonts Now, for Real, Richard Fink explains the thinking behind Ascender Corp.’s EOT Lite proposal. The name “EOT Lite” suggests that DRM is still very much part of the equation. But, as Fink explains it, it’s actually not.

EOT Lite removes the two features that drew the chief objections to EOT:

  • it bound the EOT file, through rootstrings, to the domain name;
  • it contained MTX compression under patent by Monotype Imaging, licensed by Microsoft for this use.

Essentially, then, an “EOT Lite file is nothing more than a TTF file with a different file extension” (and an unfortunate but understandable name).

A brief, compelling read, and a case for a published spec that might be the key to real fonts on the web.

Web Fonts—Where Are We?

@ilovetypography tackles the question we’ve been pondering. After setting out what web designers want versus what type designers and foundries want, the author summarizes various new and old proposals (“I once heard EOT described as ‘DRM icing on an OpenType cake.’”) including Tal Leming and Erik van Blokland’s .webfont, which is gathering massive support among type foundries, and David Berlow’s permissions table, announced here last week.

Where does all of this net out? For @ilovetypography, “While we’re waiting on .webfont et al., there’s Typekit.”

(We announced Typekit here on the day it debuted. Our friend Jeff Veen’s company Small Batch, Inc. is behind Typekit, and Jason Santa Maria consults on the service. Jeff and Jason are among the smartest and most forward thinking designers on the web—the history of Jeff’s achievements would fill more than one book. We’ve tested Typekit, love its simple interface, and agree that it provides a legal and technical solution while we wait for foundries to standardize on one of the proposals that’s now out there. Typekit will be better when more foundries sign on; if foundries don’t agree to a standard soon, Typekit may even be the ultimate solution, assuming the big foundries come on board. If the big foundries demur, it’s unclear whether that will spell the doom of Typekit or of the big foundries.)

The Power of HTML 5 and CSS 3

Applauding HTML 5’s introduction of semantic page layout elements (“Goodbye div soup, hello semantic markup”), author Jeff Starr shows how HTML 5 facilitates cleaner, simpler markup, and explains how CSS can target HTML 5 elements that lack classes and IDs. The piece ends with a free, downloadable goodie for WordPress users. (The writer is the author of the forthcoming Digging into WordPress.)
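To make that concrete, here is a minimal sketch of the kind of page Starr describes, with CSS targeting the new elements directly rather than through classes or IDs. (The structure and styles are my own illustration, not taken from Starr’s article.)

<!DOCTYPE html>
<html>
<head>
  <title>HTML 5 layout sketch</title>
  <style>
    /* Style the new semantic elements directly; no classes or IDs required. */
    header, nav, article, footer { display: block; } /* older browsers treat unknown elements as inline */
    nav ul { list-style: none; margin: 0; padding: 0; }
    article { width: 40em; margin: 0 auto; }
  </style>
</head>
<body>
  <header><h1>Site title</h1></header>
  <nav>
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/about/">About</a></li>
    </ul>
  </nav>
  <article>
    <h2>Post title</h2>
    <p>The content that formerly swam in div soup.</p>
  </article>
  <footer><p>Copyright notice.</p></footer>
  <!-- Note: IE needs a small JavaScript shim (document.createElement for each new element) before it will style them. -->
</body>
</html>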

Surfin’ Safari turns up new 3-D HTML5 tricks that give Flash a run for its money

Just like it says.

Read more

  • Web Fonts Now, for Real: David Berlow of The Font Bureau publishes a proposal for a permissions table enabling real fonts to be used on the web without binding or other DRM. — 16 July 2009
  • Web Fonts Now (How We’re Doing With That): Everything you ever wanted to know about real fonts on the web, including commercial foundries that allow @font-face embedding; which browsers already support @font-face; what IE supports instead; Håkon Wium Lie, father of CSS, on @font-face at A List Apart; the Berlow interview at A List Apart; @font-face vs. EOT; Cufón; sIFR; Cufón combined with @font-face; Adobe, web fonts, and EOT; and Typekit, a new web service offering a web-only font linking license on a hosted platform. — 23 May 2009
  • HTML 5 is a mess. Now what? A few days ago on this site, John Allsopp argued passionately that HTML 5 is a mess. In response to HTML 5 activity leader Ian Hickson’s comment here that, “We don’t need to predict the future. When the future comes, we can just fix HTML again,” Allsopp said “This is the only shot for a generation” to get the next version of markup right. Now Bruce Lawson explains just why HTML 5 is “several different kind of messes.” Given all that, what should web designers and developers do about it? — 16 July 2009
  • Web Standards Secret Sauce: Even though Firefox and Opera offered powerfully compelling visions of what could be accomplished with web standards back when IE6 offered a poor experience, Firefox and Opera, not unlike Linux and Mac OS, were platforms for the converted. Thanks largely to the success of the iPhone, Webkit, in the form of Safari, has been a surprising force for good on the web, raising people’s expectations about what a web browser can and should do, and what a web page should look like. — 12 July 2009
  • In Defense of Web Developers: Pushing back against the “XHTML is bullshit, man!” crowd’s using the cessation of XHTML 2.0 activity to condescend to—or even childishly glory in the “folly” of—web developers who build with XHTML 1.0, a stable W3C recommendation for nearly ten years, and one that will continue to work indefinitely. — 7 July 2009
  • XHTML DOA WTF: The web’s future isn’t what the web’s past cracked it up to be. — 2 July 2009

[tags]@font-face, berlow, davidberlow, CSS, permissionstable, fontbureau, webfonts, webtypography, realtypeontheweb, HTML5, HTML4, HTML, W3C, WHATWG, markup, webstandards, typography[/tags]

HTML 5 is a mess. Now what?

A few days ago on this site, John Allsopp argued passionately that HTML 5 is a mess. In response to HTML 5 activity leader Ian Hickson’s comment here that, “We don’t need to predict the future. When the future comes, we can just fix HTML again,” Allsopp said “This is the only shot for a generation” to get the next version of markup right. Now Bruce Lawson explains just why HTML 5 is “several different kind of messes:”

  1. It’s a mess, Lawson says, because the process is a mess. The process is a mess, he claims, because “[s]pecifying HTML 5 is probably the most open process the W3C has ever had,” and when you throw open the windows and doors to let in the fresh air of community opinion, you also invite sub-groups with different agendas to create competing variant specs. Lawson lists and links to the various groups and their concerns.
  2. It’s a “spec mess,” Lawson continues, citing complaints by Allsopp and Matt Wilcox that many elements suffer from imprecise or ambiguous specification or from seemingly needless restrictions. (Methinks ambiguities can be resolved, and needless restrictions lifted, if the Working Group is open to honest, accurate community feedback. Lawson tells how to contact the Working Group to express your concerns.)
  3. Most importantly, Lawson explains, HTML 5 is a backward compatibility mess because it builds on HTML 4:

[I]f you were building a mark-up language from scratch you would include elements like footer, header and nav (actually, HTML 2 had a menu element for navigation that was deprecated in 4.01).

You probably wouldn’t have loads of computer science oriented elements like kbd, var, samp in preference to the structural elements that people “fake” with classes. Things like tabindex wouldn’t be there, as we all know that if you use properly structured code you don’t need to change the tab order, and accesskey wouldn’t make it because it’s undiscoverable to a user and may conflict with assistive technology. Accessibility would have been part of the design rather than bolted on.

But we know that now; we didn’t know that then. And HTML 5 aims to be compatible with legacy browsers and legacy pages. …

There was a cartoon in the ancient satirical magazine Punch showing a city slicker asking an old rural gentleman for directions to his destination. The rustic says “To get there, I wouldn’t start from here”. That’s where we are with HTML. If we were designing a spec from scratch, it would look much like XHTML 2, which I described elsewhere as “a beautiful specification of philosophical purity that had absolutely no resemblance to the real world”, and which was aborted by the W3C last week.

Damned if you do

The third point is Lawson’s key insight, for it illuminates the dilemma faced by HTML 5 or any other honest effort to move markup forward. Neither semantic purity nor fault-tolerance will do, and neither approach can hope to satisfy all of today’s developers.

A markup language based on what we now know, and can now do thanks to CSS’s power to disconnect source order from viewing experience, will be semantic and accessible, but it will not be backward compatible. That was precisely the problem with XHTML 2, and it’s why most people who build websites for a living, if they knew enough to pay attention to XHTML 2, soon changed the channel.

XHTML 2 was conceived as an effort to start over and get it right. And this doomed it, because right-wing Nativists will speak Esperanto before developers adopt a markup language that breaks all existing websites. It didn’t take a Mark Pilgrim to see that XHTML 2 was a dead-end that would eventually terminate XHTML activity (although Mr Pilgrim was the first developer I know to raise this point, and he certainly looks prescient in hindsight).

It was in reaction to XHTML 2’s otherworldliness that the HTML 5 activity began, and if XHTML suffered from detachment from reality, HTML 5 is too real. It accepts sloppiness many of us have learned to do without (thereby indirectly and inadvertently encouraging those who don’t develop with standards and accessibility in mind not to learn about these things). It is a hodgepodge of semantics and tag soup, of good and bad markup practices. It embraces ideas that logically cancel each other out. It does this in the name of realism, and it is as admirable and logical for so doing as XHTML 2 was admirable and logical in its purity.

Neither ethereal purity nor benign tolerance seems right, so what’s a spec developer to do? They’re damned either way—which almost suggests that the web will be built with XHTML 1.0 and HTML 4.01 forever. Most importantly for our purposes, what are we to do?

Forward, compatibly

As the conversation about HTML 5 and XHTML has played out this week, I’ve felt like Regan in The Exorcist, my head snapping around in 360-degree arcs as one great comment cancels out another.

In a private Basecamp discussion a friend said,

Maybe I’m just confused by all the competing viewpoints, but the twisted knots of claim and counterclaim are getting borderline Lovecraftian in shape.

Another said,

[I] didn’t realize that WHATWG and the W3C’s HTML WG were in fact two separate bodies, working in parallel on what effectively amounts to two different specs [1, 2—the entire thread is actually worth reading]. So as far as I can tell, if Ian Hickson removes something from the WHATWG spec, the HTML WG can apparently reinsert it, and vice versa. [T]his… seems impossibly broken. (I originally used a different word here, but, well, propriety and all that.)

Such conversations are taking place in rooms and chatrooms everywhere. The man in charge of HTML 5 appears confident in its rightness. His adherents proclaim a new era of loaves and fishes before the oven has even finished preheating. His articulate critics convey a palpable feeling of crisis. All our hopes now hang on one little Hobbit. What do we do?

As confused as I have continually felt while surfing this whirlwind, I have never stopped being certain of two things:

  1. XHTML 1.0—and for that matter, HTML 4.01—will continue to work long after I and my websites are gone. For the web’s present and for any future you or I are likely to see, there is no reason to stop using these languages to craft lean, semantic markup. The combination of CSS, JavaScript, and XHTML 1.0/HTML 4.01 is here to stay, and while the web 10 years from now may offer features not supported by this combination of technologies, we need not fear that these technologies or sites built on them will go away in the decades to come.
  2. That said, the creation of a new markup language concerns us all, and an informed community will only help the framers of HTML 5 navigate the sharp rocks of tricky shoals. Whether we influence HTML 5 greatly or not at all, it behooves us to learn as much as we can, and to practice using it on real websites.

Read more

  • Web Fonts, HTML 5 Roundup: Worthwhile reading on the hot new web font proposals, and on HTML 5/CSS 3 basics, plus a demo of advanced HTML 5 trickery. — 20 July 2009
  • Web Standards Secret Sauce: Even though Firefox and Opera offered powerfully compelling visions of what could be accomplished with web standards back when IE6 offered a poor experience, Firefox and Opera, not unlike Linux and Mac OS, were platforms for the converted. Thanks largely to the success of the iPhone, Webkit, in the form of Safari, has been a surprising force for good on the web, raising people’s expectations about what a web browser can and should do, and what a web page should look like. — 12 July 2009
  • In Defense of Web Developers: Pushing back against the “XHTML is bullshit, man!” crowd’s using the cessation of XHTML 2.0 activity to condescend to—or even childishly glory in the “folly” of—web developers who build with XHTML 1.0, a stable W3C recommendation for nearly ten years, and one that will continue to work indefinitely. — 7 July 2009
  • XHTML DOA WTF: The web’s future isn’t what the web’s past cracked it up to be. — 2 July 2009

[tags]HTML5, HTML4, HTML, W3C, WHATWG, markup, webstandards[/tags]

HTML 5: nav ambiguity resolved

AN EMAIL from Chairman Hickson resolves an ambiguity in the nav element of HTML 5.

One of the new things HTML 5 sets out to do is to provide web developers with a standardized set of semantic page layout structures. For example, it gives us a nav element to replace structures like div class="navigation".
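In other words (a hypothetical before-and-after of my own, not lifted from the spec):

<!-- Before: meaning carried by a class name -->
<div class="navigation"> … list of links … </div>

<!-- After: meaning carried by the element itself -->
<nav> … the same list of links … </nav>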

This is exciting, logical, and smart, but it is also controversial.

The controversy is best expressed in John Allsopp’s A List Apart article, Semantics in HTML 5, where he worries that the new elements may not be entirely forward-compatible, as they are constrained to today’s understanding of what makes up a page. An extensible mechanism, although less straightforward, would offer more room to grow as the web evolves, Allsopp argues.

We’re pretty sure Ian Hickson, the main force behind HTML 5, has heard that argument, but HTML 5 is proceeding along the simpler and more direct line of adding page layout elements. The WHAT Working Group Mr Hickson chairs has solicited designer and developer opinion on typical web page structures in order to come up with a short list of new elements in HTML 5.

nav is one of these elements, and its description in the spec originally read as follows:

The nav element represents a section of a page that links to other pages or to parts within the page: a section with navigation links. Not all groups of links on a page need to be in a nav element — only sections that consist of primary navigation blocks are appropriate for the nav element.

The perceived ambiguity was expressed by Bruce Lawson (AKA HTML 5 Doctor) thusly:

“Primary navigation blocks” is ambiguous, imo. A page may have two nav blocks; the first is site-wide navigation (“primary navigation”) and within-page links, e.g. a table of contents which many would term “secondary nav”.

Because of the use of the phrase “primary navigation block” in the spec, a developer may think that her secondary nav should not use a nav element.

Chairman Hickson has resolved the ambiguity by changing “primary” to “major” and by adding an example of secondary navigation using nav.
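I won’t reproduce the spec’s new example here, but the practical upshot, in markup of my own devising, is that both kinds of block now clearly qualify:

<!-- Site-wide ("major") navigation -->
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/archive/">Archive</a></li>
  </ul>
</nav>

<!-- Within-page ("secondary") navigation, such as a table of contents -->
<nav>
  <ul>
    <li><a href="#intro">Introduction</a></li>
    <li><a href="#conclusion">Conclusion</a></li>
  </ul>
</nav>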

Read more

  • Web Fonts, HTML 5 Roundup: Worthwhile reading on the hot new web font proposals, and on HTML 5/CSS 3 basics, plus a demo of advanced HTML 5 trickery. — 20 July 2009
  • Web Standards Secret Sauce: Even though Firefox and Opera offered powerfully compelling visions of what could be accomplished with web standards back when IE6 offered a poor experience, Firefox and Opera, not unlike Linux and Mac OS, were platforms for the converted. Thanks largely to the success of the iPhone, Webkit, in the form of Safari, has been a surprising force for good on the web, raising people’s expectations about what a web browser can and should do, and what a web page should look like. — 12 July 2009
  • In Defense of Web Developers: Pushing back against the “XHTML is bullshit, man!” crowd’s using the cessation of XHTML 2.0 activity to condescend to—or even childishly glory in the “folly” of—web developers who build with XHTML 1.0, a stable W3C recommendation for nearly ten years, and one that will continue to work indefinitely. — 7 July 2009
  • XHTML DOA WTF: The web’s future isn’t what the web’s past cracked it up to be. — 2 July 2009


XHTML DOA WTF

Firefox developers who were initially alerted to a problem on this page, please view the Firefox test page and the page that explains its use. — JZ

The web’s future isn’t what the web’s past cracked it up to be. 1999: XML is the light and XHTML is the way. 2009: XHTML is dead—kind of.

From the W3C news archive for 2 July 2009:

XHTML 2 Working Group Expected to Stop Work End of 2009, W3C to Increase Resources on HTML 5

2009-07-02: Today the Director announces that when the XHTML 2 Working Group charter expires as scheduled at the end of 2009, the charter will not be renewed. By doing so, and by increasing resources in the Working Group, W3C hopes to accelerate the progress of HTML 5 and clarify W3C’s position regarding the future of HTML. A FAQ answers questions about the future of deliverables of the XHTML 2 Working Group, and the status of various discussions related to HTML. Learn more about the HTML Activity. (Permalink)

Please note that this thread has been updated with useful comments and links that help make sense of the emergence of HTML 5, the death of XHTML 2.0, and what designers and developers need to know about the present and future of web markup.

Read more

  • Web Fonts, HTML 5 Roundup: Worthwhile reading on the hot new web font proposals, and on HTML 5/CSS 3 basics, plus a demo of advanced HTML 5 trickery. — 20 July 2009
  • HTML 5: Nav Ambiguity Resolved. An e-mail from Chairman Hickson resolves an ambiguity in the nav element of HTML 5. What does that mean in English? Glad you asked! — 13 July 2009
  • Web Standards Secret Sauce: Even though Firefox and Opera offered powerfully compelling visions of what could be accomplished with web standards back when IE6 offered a poor experience, Firefox and Opera, not unlike Linux and Mac OS, were platforms for the converted. Thanks largely to the success of the iPhone, Webkit, in the form of Safari, has been a surprising force for good on the web, raising people’s expectations about what a web browser can and should do, and what a web page should look like. — 12 July 2009
  • In Defense of Web Developers: Pushing back against the “XHTML is bullshit, man!” crowd’s using the cessation of XHTML 2.0 activity to condescend to—or even childishly glory in the “folly” of—web developers who build with XHTML 1.0, a stable W3C recommendation for nearly ten years, and one that will continue to work indefinitely. — 7 July 2009

[tags]W3C, XML, XHTML, HTML, HTML5, WTF[/tags]

NSFW tag in HTML 5

A “Not Safe for Work” tag has been proposed for HTML 5:

One of the most common descriptive notes people have to write using text when they post links or images to blogs, comments or anywhere in HTML is to say “this link is not safe for work” or simply “NSFW”. By adding the <NSFW> tag, this could be made much simpler and standardized. Browsers could then have an option to automatically hide all <NSFW> content. A tag is preferred to an attribute since it could then also be used around content and not just links.

Examples:
<nsfw><a href="http://www.example.com">Pics here!</a></nsfw>
<nsfw><img src="badkitten.jpg"></nsfw>

(Via Bruce Lawson)

Drew McLellan of The Web Standards Project thinks it’s a nice idea that won’t work:

@brucel we looked into #nsfw in microformats. It’s an unworkable minefield. #

it’s used when linking to something that you might want to save until you get home. e.g. http://ampleboobies.info (NSFW) #

So a browser could conceivably be configured not to follow links or display content tagged nsfw. Sounds a good idea, but unworkable. #

The use of tags (rather than CSS and JavaScript) to hide or show content is an intriguing and controversial aspect of HTML 5. It’s intriguing because using a standard tag—instead of writing custom CSS and JavaScript that someone else may someday have to maintain—potentially simplifies web development and maintenance, bringing advanced techniques of content presentation to more sites for less money. It’s controversial because it sticks presentation and behavior back in markup, after we all just spent a decade separating site structure and semantics from behavior and presentation.
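To see the trade-off in miniature, compare the hand-rolled approach the proposal would replace with the proposed markup. (The class name and script below are my own invention, not part of any spec, and older versions of IE lack getElementsByClassName.)

<!-- Today: a site-specific convention someone must document and maintain -->
<style>
  .nsfw { display: none; } /* hidden until the reader opts in */
</style>
<span class="nsfw"><a href="http://www.example.com">Pics here!</a></span>
<script>
  // Reveal flagged content when the reader opts in.
  function showNSFW() {
    var els = document.getElementsByClassName('nsfw');
    for (var i = 0; i < els.length; i++) {
      els[i].style.display = 'inline';
    }
  }
</script>

<!-- Proposed: one standard tag, so the browser itself can offer the preference -->
<nsfw><a href="http://www.example.com">Pics here!</a></nsfw>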

We’re going to be following these developments and trying to make buzzword-free sense of them for you.

[tags]standards, webstandards, HTML, HTML5, tags, NSFW, W3C[/tags]

ALA 275: Duty Now For The Future

What better way to begin 2009 than by looking at the future of web design? In Issue No. 275 of A List Apart, for people who make websites, we study the promise and problems of HTML 5, and chart a path toward mobile CSS that works.

Return of the Mobile Style Sheet

by DOMINIQUE HAZAËL-MASSIEUX

At least 10% of your visitors access your site over a mobile device. They deserve a good experience (and if you provide one, they’ll keep coming back). Converting your multi-column layout to a single, linear flow is a good start. But mobile devices are not created equal, and their disparate handling of CSS is like 1998 all over again. Please your users and tame their devices with handheld style sheets, CSS media queries, and (where necessary) JavaScript or server-side techniques.
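In outline, the techniques the article weighs look something like this (a minimal sketch with made-up file names; the article itself covers the trade-offs in depth):

<!-- A separate style sheet for devices that identify themselves as handheld -->
<link rel="stylesheet" media="handheld" href="mobile.css" />

<!-- A CSS 3 media query for browsers that report a narrow screen -->
<link rel="stylesheet" media="only screen and (max-width: 480px)" href="mobile.css" />

Inside mobile.css, the linearization might be as simple as:

/* Undo the floats that create the multi-column layout (selector names are hypothetical). */
#content, #sidebar { float: none; width: auto; }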

Semantics in HTML 5

by JOHN ALLSOPP

The BBC’s dropping of hCalendar because of accessibility and usability concerns demonstrates that we have pushed the semantic capability of HTML far beyond what it can handle. The need to clearly and unambiguously add rich, meaningful semantics to markup is a driving goal of the HTML 5 project. Yet HTML 5 has two problems: it is not backward compatible because its semantic elements will not work in 75% of our browsers; and it is not forward compatible because its semantics are not extensible. If “making up new elements” isn’t the solution, what is?

[tags]HTML5, mobileCSS, webstandards, alistapart, johnallsopp, W3C, Dominique Hazael-Massieux[/tags]

Real type on the web?

A proposal for a fonts working group is under discussion at the W3C. The minutes of a small meeting held on Thursday 23 October include a condensed, corrected transcription of a discussion between Sampo Kaasila (Bitstream), Mike Champion (Microsoft), John Daggett (Mozilla), Håkon Wium Lie (Opera), Liam Quin (W3C), Bert Bos (W3C), Alex Mogilevsky (Microsoft), Josh Soref (Nokia), Vladimir Levantovsky (Monotype), Klaas Bals (Inventive Designers), and Richard Ishida (W3C).

The meeting started with a discussion of Microsoft’s EOT (Embedded OpenType) versus raw fonts. Bert Bos, style activity lead and co-creator of CSS, has beautifully summarized the relevant pros and cons discussed.

For those just catching up with the issue of real type on the web, here’s a bone-simple intro:

  1. CSS provides a mechanism (the @font-face rule; see the sketch after this list) for embedding real fonts on your website, and some browsers support it, but its use probably violates your licensing agreement with the type foundry, and may also cause security problems on an end-user’s computer.
  2. Microsoft’s EOT (based on the same standard CSS mechanism) works harder to avoid violating your licensing agreement, and has long worked in Internet Explorer, but is not supported in other browsers, is not foolproof vis-a-vis type foundry licensing rules, and may also cause PC security problems.
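For those who have never seen it, the raw CSS mechanism from the first item looks roughly like this. (A minimal sketch; the font name and file paths are made up for illustration, and the licensing caveats above still apply.)

@font-face {
  font-family: "Example Serif";                            /* the name you will reference below */
  src: url("/fonts/example-serif.ttf") format("truetype"); /* IE expects an EOT file here instead */
}
h1, h2, h3 {
  font-family: "Example Serif", Georgia, serif;            /* falls back to Georgia if the download fails */
}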

The proposed fonts working group hopes to navigate the technical and business problems of providing real fonts on the web, and in its first meeting came up with a potential compromise proposal before lunch.

Like everyone these days, the W3C is feeling a financial pinch, which means, if a real fonts working group is formed, its size and scope will necessarily be somewhat limited. That could be a good thing, since small groups work more efficiently than large groups. But a financial constraint on the number of invited experts could make for tough going where some details are concerned—and with typography, as with web technology, the details are everything.

I advise every web designer who cares about typography and web standards—that’s all of you, right?—to read the minutes of this remarkable first gathering, and to keep watching the skies.

[tags]web typography, typography, standards, webstandards, W3C, fonts, embedded, @fontface, EOT, workinggroup[/tags]

Let me hear your standards body talk

Jeremy Keith’s “Year Zero” beautifully explains why the W3C needs our backs, not our bullets.

The W3C is maddeningly opaque and its lieutenants will sometimes march madly into the sea, but it is all that stands between us and the whirlwind.

Slow the W3C will always be. Slow comes with the territory. If you glimpse even a hint of the level of detail required to craft usable standards, you’ll understand the slowness and maybe even be grateful for it—as you’d be grateful for a surgeon who takes his time while operating on your pancreas.

But the secrecy (which makes us read bad things into the slowness) must and will change. To my knowledge, the W3C has been working on its transparency problems for at least two years and making real change—just very slowly (there’s that word again) and incrementally and hence not at all obviously.

Key decision makers within the W3C intend to do much more, but they need to get their colleagues on board, and consensus-building is a bitch. A slow bitch.

If designers and developers are more aware of the problems than of the fact that the W3C is working to solve them, it’s because the W3C is not great at outreach. If it were great at outreach, we wouldn’t have needed a Web Standards Project to persuade browser makers to implement the specs and designers and developers to use them.

Designers sometimes compare the slow pace of standards with the fast pace of, say, Flash. But it is like comparing the output of the United Nations to the laws passed by a small benevolent dictatorship. When a company owns a technology, it can move fast. When a hundred companies that mistrust each other need to agree to every detail of a technology that only exists insofar as their phones and browsers support it, surprise, surprise, the pace is quite slow.

The W3C is working on its speed issues, too. It’s been forced to work on them by outside groups and by the success of microformats. But detailed interoperability of profound technologies no company owns is never going to happen half as fast as we’d like.

You want instant gratification, buy an iPod. You want standards that work, help. Or at least stop shouting.

[tags]w3c, standards, webstandards[/tags]

The King of Web Standards

In BusinessWeek, senior writer for Innovation & Design Jessie Scanlon has just published “Jeffrey Zeldman: King of Web Standards.” By any standards (heh heh), it is an accurate and well-researched article. By the standards of technology journalism, it is exceptional. It might even help designers who aren’t named Jeffrey Zeldman as they struggle to explain the benefits of web standards to their bosses or clients. At the least, its publication in BusinessWeek will command some business people’s attention, and perhaps their respect.

Avoiding the twin dangers of oversimplification that misleads, and pedantry that bores or confuses, Scanlon informs business readers about the markup and code that underlies websites; what went wrong with it in the early days of the web; and how web standards help ensure “that a Web site can be used by someone using any browser and any Web-enabled device.”

Scanlon communicates this information quickly, so as not to waste a business reader’s time, and clearly, without talking down to the reader. This makes her article not merely a dandy clipping for my scrapbook, but a useful tool of web standards evangelism.

Contributing to the article with their comments are Jeff Veen, manager of user experience for Google’s web applications and former director of Hotwired.com; NYTimes.com design director, subtraction.com author, and grid-meister Khoi Vinh; and Dan Cederholm, founder of SimpleBits and author of Bulletproof Web Design. Dave Shea’s CSS Zen Garden features prominently as well, and rightfully so.

A right sexy slide show accompanies the article.

And lest a BusinessWeek article lull us into complacency, let us here note that the top 20 blogs as measured by Technorati.com fail validation—including one blog Happy Cog designed. (It was valid when we handed it off to the client.)

[tags]design, webdesign, standards, webstandards, webstandardsproject, WaSP, zeldman, jeffreyzeldman, veen, jeffveen, simplebits, dancederholm, bulletproof, khoivinh, subtraction, wired, hotwired, nytimes, happycog, zengarden, css, csszengarden[/tags]