There have been a lot more opportunities for just thinking outside the very strict design parameters that we usually work with. And thinking not just about designing for any particular device, but for a particular context. Because even if you are on a mobile phone, if you’re viewing something on a subway in the morning versus in the middle of the office at 10:30, your environment sort of dictates how you want to interface with a particular piece of content. That’s really what’s been firing me up as a long-term proponent of flexible designs, letting the user dictate the experience. I sort of see that as the next step, thinking beyond the desktop.
Ethan Marcotte and I talk about the third edition of Designing with Web Standards and discuss the future of the web—a podcast by Paul Boag, with full transcription.
THE DEATHS of Leslie Harpold and Brad Graham, in addition to being tragic and horrible and sad, have highlighted the questionable long-term viability of blogs, personal sites, and web magazines as legitimate artistic and literary expressions. (Read this, by Rogers Cadenhead.)
Cool URIs don’t change, they just fade away. When you die, nobody pays your hosting company, and your work disappears. Like that.
Now, not every blog post or “Top 10 Ways to Make Money on the Internet” piece deserves to live forever. But there’s gold among the dross, and there are web publications that we would do well to preserve for historical purposes. We are not clairvoyants, so we cannot say which fledgling, presently little-read web publications will matter to future historians. Thus logic and the cultural imperative urge us to preserve them all. But how?
The death of the good in the jaws of time is not limited to internet publications, of course. Film decays, books (even really good ones) constantly go out of print, digital formats perish. Recorded music that does not immediately find an audience disappears from the earth.
Digital subscriptions were supposed to replace microfilm, but American libraries, which knew we were racing toward recession years before the actual global crisis came, stopped being able to pay for digital newspaper and magazine subscriptions nearly a decade ago. Many also (even fancy, famous ones) can no longer collect—or can only collect in a limited fashion. Historians and scholars have access to every issue of every newspaper and journal written during the civil rights struggle of the 1960s, but can access only a comparative handful of papers covering the election of Barack Obama.
Thanks to budget shortfalls and format wars, our traditional media, literature, and arts are perishing faster than ever before. Nothing conceived by the human mind, except Heaven and nuclear winter, is eternal.
Still, when it comes to instant disposability, web stuff is in a category all its own.
Unlike with other digital expressions, format is not the problem: HTML, CSS, and backward-compatible web browsers will be with us forever. The problem is, authors pay for their own hosting.
(There are other problems: the total creative output of someone I follow is likely distributed across multiple social networks as well as a personal site and Twitter feed. How to connect those dots when the person has passed on? But let’s leave that to the side for the moment.)
A suggestion for a business. Sooner or later, some hosting company is going to figure out that it can provide a service and make a killing (as it were) by offering ten-, twenty-, and hundred-year packets of posthumous hosting.
A hundred years is not eternity, but you are not Shakespeare, and it’s a start.
In April of 2009, in a post every web designer, publisher, or business person should read, Joshua Schachter told how URL shortening services like TinyURL and Bit.ly came to be, and why the latest ones were so addictive. (Missing from Joshua’s account of their utility is the benefit URL shorteners can provide when sharing an otherwise obscenely long link on the printed page.)
The prescient post concludes that, despite their benefits, such services ultimately harm the web, decreasing clarity while increasing the odds of linkrot and spam:
[S]hortening services add another layer of indirection to an already creaky system. A regular hyperlink implicates a browser, its DNS resolver, the publisher’s DNS server, and the publisher’s website. With a shortening service, you’re adding something that acts like a third DNS resolver, except one that is assembled out of unvetted PHP and MySQL, without the benevolent oversight of luminaries like Dan Kaminsky and St. Postel.
There are three other parties in the ecosystem of a link: the publisher (the site the link points to), the transit (places where that shortened link is used, such as Twitter or Typepad), and the clicker (the person who ultimately follows the shortened links). Each is harmed to some extent by URL shortening.
There’s more, and you should read it all.
One of Joshua’s recommendations to minimize some of the harm is that websites do their own URL shortening instead of relying on middlemen. I’ve done some of that here, via the ShortURL plug-in for WordPress. Thus I use zeldman.com/x/48 instead of a much longer URL to notify my friends on Twitter about a new comment on an oldish thread. Likewise, zeldman.com/x/49 redirects to yesterday’s big post, “Write When Inspired.”
Rolling your own mini-URLs lessens the chance that your carefully cultivated links will rot if the third-party URL shortening site goes down or goes out of business, as is happening to tr.im, a URL shortener that is pulling the plug because it could neither monetize nor sell its service.
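Under the hood, a roll-your-own shortener is just a rewrite rule plus a lookup table. Here is a minimal sketch of how zeldman.com/x/48-style redirects could be wired up, assuming Apache with mod_rewrite and a hypothetical `shorturl` query parameter; the ShortURL plug-in registers its own rules, so this is just the idea, not its actual internals:

```apache
# Hypothetical .htaccess rules (assumes Apache with mod_rewrite enabled).
# Maps /x/48 to whatever long URL is stored under key 48, letting
# WordPress look up the key and issue a 301 redirect to the real post.
RewriteEngine On
RewriteRule ^x/([0-9]+)/?$ /index.php?shorturl=$1 [L]
```

Because the lookup happens on your own domain, the short links live exactly as long as your site does—no third-party middleman to go out of business.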
tr.im is now in the process of discontinuing service, effective immediately….
No business we approached wanted to purchase tr.im for even a minor amount.
There is no way for us to monetize URL shortening — users won’t pay for it — and we just can’t justify further development since Twitter has all but anointed bit.ly the market winner.
The Short URL Plugin for WordPress installs automatically. It provides simple statistics, telling you how many times a link has been clicked, sets up redirects automatically, allows you to choose a custom link style, and more. You’re not limited to shortening your own URLs, although that’s mainly how I use it; you can also shorten third-party URLs, turning your site into a mini TinyURL. I’ve used this plugin for months, with nothing but joy in its cleverness and usability.
Over the weekend, as thoughtful designers gathered at Typecon 2009 (“a letterfest of talks, workshops, tours, exhibitions, and special events created for type lovers at every level”), the subject of web fonts was in the air and on the digital airwaves. Worthwhile reading on web fonts and our other recent obsessions includes:
Responding to a question I raised here in comments on Web Fonts Now, for Real, Richard Fink explains the thinking behind Ascender Corp.’s EOT Lite proposal. The name “EOT Lite” suggests that DRM is still very much part of the equation. But, as Fink explains it, it’s actually not.
EOT Lite addresses the two chief objections to EOT by removing the features that caused them:
it bound the EOT file, through rootstrings, to the domain name;
it contained MTX compression under patent by Monotype Imaging, licensed by Microsoft for this use.
Essentially, then, an “EOT Lite file is nothing more than a TTF file with a different file extension” (and an unfortunate but understandable name).
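Since an EOT Lite file is byte-for-byte a TTF, serving a face to both IE (which reads EOT) and @font-face-capable browsers could look something like the following. A sketch only; the font family and file names are placeholders, not part of any proposal:

```css
/* Hypothetical font files; the .eot is the same TTF data renamed. */
@font-face {
  font-family: "ExampleSerif";
  src: url("example-serif.eot");           /* IE picks up the EOT (Lite) file */
  src: local("ExampleSerif"),
       url("example-serif.ttf") format("truetype"); /* everyone else */
}

body {
  font-family: "ExampleSerif", Georgia, serif; /* real fonts, graceful fallback */
}
```

The double `src` declaration is the familiar trick: IE stops at the first line, while browsers that understand the second line override it.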
A brief, compelling read on a proposal that might be the key to real fonts on the web.
Where does all of this net out? For @ilovetypography, “While we’re waiting on .webfont et al., there’s Typekit.”
(We announced Typekit here on the day it debuted. Our friend Jeff Veen’s company Small Batch, Inc. is behind Typekit, and Jason Santa Maria consults on the service. Jeff and Jason are among the smartest and most forward thinking designers on the web—the history of Jeff’s achievements would fill more than one book. We’ve tested Typekit, love its simple interface, and agree that it provides a legal and technical solution while we wait for foundries to standardize on one of the proposals that’s now out there. Typekit will be better when more foundries sign on; if foundries don’t agree to a standard soon, Typekit may even be the ultimate solution, assuming the big foundries come on board. If the big foundries demur, it’s unclear whether that will spell the doom of Typekit or of the big foundries.)
Applauding HTML 5’s introduction of semantic page layout elements (“Goodbye div soup, hello semantic markup”), author Jeff Starr shows how HTML 5 facilitates cleaner, simpler markup, and explains how CSS can target HTML 5 elements that lack classes and IDs. The piece ends with a free, downloadable goodie for WordPress users. (The writer is the author of the forthcoming Digging into WordPress.)
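The “div soup” point can be seen in a few lines. Here is a sketch of the kind of markup Starr describes (my own illustration, not taken from his article): layout elements with real names, styled by element selectors alone, no classes or IDs required.

```html
<!-- HTML 5 layout elements replace class-laden divs -->
<style>
  /* Element selectors target the new tags directly */
  header, nav, article, footer { display: block; } /* hint for older browsers */
  article { width: 60%; }
</style>
<body>
  <header><h1>Site title</h1></header>
  <nav><!-- site navigation --></nav>
  <article>
    <section><!-- post content --></section>
  </article>
  <footer><!-- colophon --></footer>
</body>
```

Where a 2009 site would write `<div class="header">` and style `.header`, HTML 5 lets both the markup and the CSS say what the thing actually is.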
Web Fonts Now, for Real: David Berlow of The Font Bureau publishes a proposal for a permissions table enabling real fonts to be used on the web without binding or other DRM. — 16 July 2009
Web Fonts Now (How We’re Doing With That): Everything you ever wanted to know about real fonts on the web, including commercial foundries that allow @font-face embedding; which browsers already support @font-face; what IE supports instead; Håkon Wium Lie, father of CSS, on @font-face at A List Apart; the Berlow interview at A List Apart; @font-face vs. EOT; Cufón; sIFR; Cufón combined with @font-face; Adobe, web fonts, and EOT; and Typekit, a new web service offering a web-only font linking license on a hosted platform. — 23 May 2009
HTML 5 is a mess. Now what? A few days ago on this site, John Allsopp argued passionately that HTML 5 is a mess. In response to HTML 5 activity leader Ian Hickson’s comment here that, “We don’t need to predict the future. When the future comes, we can just fix HTML again,” Allsopp said “This is the only shot for a generation” to get the next version of markup right. Now Bruce Lawson explains just why HTML 5 is “several different kind of messes.” Given all that, what should web designers and developers do about it? — 16 July 2009
Web Standards Secret Sauce: Even though Firefox and Opera offered powerfully compelling visions of what could be accomplished with web standards back when IE6 offered a poor experience, Firefox and Opera, not unlike Linux and Mac OS, were platforms for the converted. Thanks largely to the success of the iPhone, Webkit, in the form of Safari, has been a surprising force for good on the web, raising people’s expectations about what a web browser can and should do, and what a web page should look like. — 12 July 2009
In Defense of Web Developers: Pushing back against the “XHTML is bullshit, man!” crowd’s using the cessation of XHTML 2.0 activity to condescend to—or even childishly glory in the “folly” of—web developers who build with XHTML 1.0, a stable W3C recommendation for nearly ten years, and one that will continue to work indefinitely. — 7 July 2009
XHTML DOA WTF: The web’s future isn’t what the web’s past cracked it up to be. — 2 July 2009
Joshua Schachter explains how URL shorteners like TinyURL, bit.ly, etc., originally created to prevent long URLs from breaking in 1990s e-mail clients, and now used primarily as a means of monetizing someone else’s content, are bad:
They “add another layer of indirection to an already creaky system, [making what] used to be transparent … opaque,” slowing down web use by adding needless lookups, and potentially disguising spam.
Shorteners “steal search juice” from the original publishers. (For example, with the Digg bar and Digg short URL, your content makes Digg more valuable and your site less valuable; the more content you create, the richer you make Digg.)
“A new and potentially unreliable middleman now sits between the link and its destination. And the long-term archivability of the hyperlink now depends on the health of a third party.”
Anyone who creates web content should read Joshua’s post. I’m sold and will dial way back on my use of the zeldman.com short URL. The question remains, what to do when you need to paste a long, cumbersome link into a 140-character service like Twitter. (If you do nothing, Twitter itself will shorten the link via TinyURL.)
Uh, wait, the problem is that some comments are not thoughtful. And I guess you can’t turn off comments in blogging software (except in all the blogging platforms that are already out there in the marketplace). But all those many long-established blogging platforms aside, I guess, like maybe in some new as-yet unreleased blogging tool, you might not be able to turn comments off. And then you’d have to, like, endure that some comments might not be thoughtful. So clearly you should just use Twitter and not write a blog because people can’t respond to you on Twitter. (What? They can?)
Well maybe the reason you’re not supposed to blog is that you won’t get rich blogging, because Calacanis did, so I guess he used up all the rich. Sorry, no more rich to go around. I mean, what’s the point of expressing yourself if there is no immediate rich to be had?
In conclusion, Twitter, Flickr, Calacanis and Scoble. Which proves you can’t have a blog and also use Twitter. Or maybe you can have a blog and use Twitter but you shouldn’t because comments, Scoble, rich.
Paul Boutin is usually a good writer. I’m not sure what happened here. Paul, when do we stop talking about web content exclusively in terms of narrow platforms and shallow, self-interested goals? When do we stop saying x makes y irrelevant? When do we stop reducing the web to a vulgar and trivial competition between head boys, and start appreciating it as a maturing medium for real thought and expression?
Q. I’m searching for your archive. Whenever I find a really good blog, I like to start at the beginning so I can understand better some of what you’re talking about. And I can’t find any link to your archives.
A. Thanks for writing. I started my site in 1995. There weren’t blogging tools back then, hence there aren’t archives in the sense you are describing. I published via hand-coded HTML until around 2004, when I began using WordPress. All my pre-WordPress content is still online; you just have to keep hitting the “PREVIOUS” button to get to it. Sorry about that.
Q. i have been using your son of moto [blogger template] to build my blogspots. why do i have to have two empty, wide, side fields? pls take a look at the above reference blog. i have to put all the content in the middle, rather narrow field.
A. We regret that we cannot provide technical support for templates we designed in 2004. Please check Blogger’s Help pages and see if they answer your concerns.
Bastardized, corrupted versions of these templates—versions we did not design, based on our work but not done by us—show up all over the web. We don’t know if these bastardized, corrupted versions are authorized (i.e. we don’t know if the republishers paid a licensing fee to Google, who commissioned the templates in the first place). Millions of people use these templates, or unauthorized hacks of these templates. If you need help changing the templates to suit your needs, kindly contact your service provider.
The original templates are part of the 2004 standards-based redesign of Blogger on which we and others toiled. Google paid the least money any of us had ever received on a web design job. But we would have done the work for free. It was all about creating web-standards-based templates—about getting standards out there in a big way: a way only a product with as many users as Blogger, and an owner as powerfully influential as Google, could assure. Finances were beside the point. The reward was making standards-based stuff for millions of people to use and enjoy.
Four years on, we still get a warm feeling out of having worked on the project. But that’s not all we get. Several times a week, we get e-mails from people who want to alter our templates but lack technical know-how. We regret that we cannot debug the style sheets of the universe.
OUR PERSONAL SITES, once our primary points of online presence, are becoming sock drawers for displaced first-person content. We are witnessing the disappearance of the all-in-one, carefully designed personal site containing professional information, links, and brief bursts of frequently updated content to which others respond via comments. Did I say we are witnessing the traditional personal site’s disappearance? That is inaccurate. We are the ones making our own sites disappear.
Obliterating our own readership and page views may not be a bad thing, but let’s be sure we are making conscious choices.
Interactive art director Jody Ferry’s site is a perfect example of the deeply decentralized personal page. I use the term “page” advisedly, as Jody’s site consists of a single page. It’s a fun, punchy page, bursting with personality, as intriguing for what it hides as what it reveals. Its clarity, simplicity, and liquidity demonstrate that Jody Ferry does indeed practice what the site’s title element claims: Interactive Art Direction and User Experience Design. All very good.
It could almost be the freshened-up splash page of a late 1990s personal site, except that the navigation, instead of pointing inward to a contact page, resume, blog, link list, and photos, points outward to external web services containing those same things. Mentally insert interactive diagram here: at left is a 1990s site whose splash page links to sub-pages. Structurally, its site map is indistinguishable from an org chart, with the CEO at the top, and everyone else below. At right, to re-use the org chart analogy, a site like Jody’s is akin to a single-owner company with only virtual (freelance) employees. There is nothing below the CEO. All arrows point outward.
Most personal sites are not yet as radically personal-content-outsourced as Jody’s, and certainly not every personal site will go this way. (Jody’s site might not even be this way tomorrow, and, lest it be misunderstood, I think Jody’s site is great.) But many personal sites are leaning this way. Many so inclined are currently in an interim state not unlike what’s going on here at zeldman.com:
There are blog posts here, but I post Tweets far more frequently than I write posts. (For obvious reasons: when you’re stuck in an airport, it’s easier to send a 140-character post via mobile phone and Twitter than it is to write an essay from that same airport. Or really from anywhere. Writing is hard, like design.) To connect the dots, I insert my latest Tweet in my sidebar. I have more readers here than followers at Twitter, but that could change. Are they the same readers? Increasingly, to the best of my knowledge, there are people who follow me on Twitter but do not read zeldman.com (and vice-versa). This is good (I’m getting new readers) and arguably maybe not so good (my site, no longer the core of my brand, is becoming just another piece of it).
Like nearly everyone, I outsource discoverable, commentable photography to Flickr.com instead of designing my own photo gallery like my gifted colleagues Douglas Bowman and Todd Dominey. Many bloggers now embed mini-bits of their Flickr feeds in their site’s sidebars. I may get around to that. (One reason I haven’t rushed to do it is that most of my Flickr photos are hidden behind a “friends and family” gateway, as I mainly take pictures of our kid.) Photography was never what this site was about, so for me, using Flickr is not the same as outsourcing the publication of some of my content.
As I’ve recently mentioned, links, once a primary source of content (and page views) here, got offloaded to Ma.gnolia a while back. From 1995 until a few years ago, every time I found a good link, an angel got his wings and I got page views. My page views weren’t, brace yourself for an ugly word, monetized, so all I got out of them was a warm feeling—and that was enough. Now my site is, brace yourself again, monetized, but I send my readers to Ma.gnolia every time I find a link. Go figure.
I’m not trying to get rid of my readers, nor are you trying to shake off yours. In the short term, including Flickr, Twitter, and Ma.gnolia or De.licio.us feeds sends traffic both ways—out to those services, but also back to your site. (Remember when some of us were afraid RSS would cost us our readers? It did and it didn’t. With RSS, good writers gain readers while often losing traditional page views. But that’s another story.) I’ve certainly found new websites by going to the Twitter profile pages of people who write funny or poignant Tweets. Behind a great Flickr photo may be a great designer whose site you might not have found if not for first seeing that photo.
But outsourcing the publication of our own content has long-term implications that point to more traffic for the web services we rely on, and less traffic and fewer readers for ourselves.
This is not necessarily a bad thing. Not every person who designs websites needs to run a personal magazine on top of all their other responsibilities. If your goal in creating a personal site way back when was to establish an online presence, meet other people who create websites, have fun chatting with virtual friends, and maybe get a better job, well, you don’t need a deep personal site to achieve those goals any more.
But if world domination is your goal, think twice before offloading every scrap of you.
An authorized Belorussian translation of this article appears on designcontest.com.
The WCAG Samurai Errata for Web Content Accessibility Guidelines 1.0 are published as an alternative to WCAG 2. “You may comply with WCAG 2, or with these errata, or with neither, but not with both at once.” Published 26 February 2008. Read the intro first.
Free Mac OS X application lets you share files fast. Drag any file or folder onto the Dockdrop dock icon, then choose how you want to send it. Dockdrop uploads it and puts a URL for your upload on the clipboard, ready for pasting into an email, chat program or website.
Without my permission, Technorati has stuck my photo and its logo in the sidebar of my site’s front page.
Technorati, when it works, provides useful services to blogs and their readers, such as the ability to track third-party responses to a post. (Google Blog Search works the same street, and refreshes more frequently.)
Technorati also indexes “authority,” which is its word for popularity as determined by the number of Technorati users who mark your site as a favorite.
You could configure the script to show your picture and Technorati’s logo but you didn’t have to, and I chose not to.
Technorati called the script an “embed.”
In the last few days, Technorati apparently converted its “embeds” to “widgets.”
Widgets do more than embeds, and I’m sure they’ll delight some blog owners. But I am not delighted. I wasn’t asked, or even notified. Through investigation (AKA random clicking) I found the widgets page and “customized” my widget not to show my photo and Technorati’s logo (i.e. I manually opted out of something I had previously already opted out of).
Except the opt-out didn’t take. My photo and Technorati’s logo are still stuck in my front page’s sidebar.
I’ll give Technorati a few days to clear its cache (or its head). If there’s still junk in my sidebar come Monday, then it’s adios, Technorati.
IN 1995, I RECKONED everyone would teach themselves HTML and start homesteading on the web. When that didn’t happen, I spent three years on a free tutorial I figured would give the world the push it needed. It didn’t.
I was an early blogger and a late user of blogging software because, why did anybody need blogging software? Wrong. Always wrong.
In 2004, some colleagues and I contributed to the “new” Blogger. We were excited by the thought of bringing well-designed, easy-peasy, standards-compliant web publishing tools to millions of people. Now everyone can do this, we thought. And millions did.
But not everyone, it turns out, wants to blog. Blogging is hard. There’s, like, thoughts and stuff that you have to come up with, even if someone else handles the whole “what should my blog be like and what should it do and how should it be organized and what should it look like” part.
No, what most people were really looking for—or at least, what most people have responded to since such things became available—were web gizmos as easy as farting and as addictive as cigarettes. “Social software.” “Web 2.0.” Swimming pools, movie stars.
All this to preface the unremarkable yet (to those who know me) strange fact that yesterday I signed up for Facebook. And spent several hours messing with it. And checked it this morning before making coffee, before making breakfast for The Wife and me, before bringing The Child her strawberry milk.
Facebook is pretty. It works with Ma.gnolia. It works with Twitter. In theory it works with iLike, except that you can’t add an existing iLike account to Facebook, which is lame and sucks and iLike’s fault, and the fact that I care and am bothering to share such trivia shows how deeply assimilated I have become over the past 24 hours, eight of which I spent sleeping.
As when I joined Twitter, the first thing I noticed was how many of my friends and colleagues were already there ahead of me. Why none of them had invited me to join, bastards, I leave to their consciences, not that I’m bitter. They redeemed themselves by responding within an hour or less when I asked to be their “friends,” not that I’m keeping score.
I don’t need more friends and I don’t need more contacts. I avoided most of the first-generation social software that was all about Rolodex building, and only gave in to the main one everyone knows and which I shall not name when a beloved old client of mine invited me to join his network. Since I made that mistake, I get lots more mail, and lots more mail is something else I don’t need.
But I design interfaces so I’m supposed to know about this stuff. That’s the rationale behind my spending hours of billable time adjusting my Facebook preferences. The real reason, of course, for all this stuff, is that it provides a way to blow off work you should be doing, while creating the illusion that you are achieving something. At least in most offices, you can’t masturbate at your desk. But you can Tweet.