You can look at Twitter as text messaging or as micro-blogging.
If it’s text-messaging, of concern only to your closest friends, then content such as “Dude, where are you? We’re in the mezzanine” is perfectly appropriate, and “Fish tacos FTW nom nom nom” is practically overachievement.
If it’s micro-blogging, then you may be obliged, like any writer, to consider your reader’s need for value.
Writers inform and enlighten. They create worlds, ideologies, and brochure copy.
In 140 characters, a good writer can make you laugh and a great one can make you march.
You thought I was going to say “cry.” That, too.
Not everyone who blogs is Dostoevsky, and with ten Twitterers for every blogger, the literary riches are spread thin.
The good writers are easier to discover thanks to tools like Favrd. (The best thing about Twitter is its unfulfilled potential. Some developers reach their highest level of attainment creating some of the many features Twitter didn’t come with.) Tools like Favrd also change the discourse: writers write differently when they think someone is reading, and self-consciously clever Twitterers have responded to Favrd by posting stuff that’s more likely to get favored—like directors playing to critics.
But nobody just follows on Twitter. Sure, you follow, but you also create. And you might consider that an obligation to occasionally create meaning, color, and richness.
I don’t view HTTP as a medium for phone chatter. I don’t mean you can’t place phone calls over the internet—of course you can. I mean I’m old-fashioned enough (or have been doing this long enough) to view the web mostly as a publishing medium, with all the obligations that implies. So while I sometimes use Twitter as a homing device, I mainly try to think of it as the world’s smallest magazine, published by me.
Content precedes design
Content precedes design. Design in the absence of content is not design, it’s decoration. [Cit.]
Stick out your tongue
While employed at a famous New York advertising agency twenty years ago, a partner and I created a TV commercial touting an over-the-counter medicine client’s revolutionary new cold and flu remedy for young children.
Only when the shooting and shouting were over did we learn that the product did not, in fact, exist.
The commercial whose every creative detail we’d had to fight for was never going to run.
The client—the marketing side of a product development group—had a budget of $60,000 to spend. So they spent it, even though the R&D side of the product development group had not been able to deliver the product.
It was not a liquid medicine that needed to be measured. It was not a pill that needed to be chewed or swallowed. It was a pill that dissolved instantly on the tongue. Or would have been, if the engineers had been able to create it.
During weeks of presentation, the client rejected campaigns that would have caught the attention of the nation’s parents. The client bought a safe campaign that called less attention to itself, then set about systematically softening its edges. My partner and I wanted to cast like Fellini or Woody Allen. We brought in amazing children of various backgrounds, their faces rich in character. But the client picked cute blonde girls instead.
And so on. Every decision, however small, required approval. Everything was a fight. A ladies-and-gentlemanly fight. A fight that sounded like polite, mutually respectful discussion. A fight with invisible knives.
We won some and we lost some. For all the back-and-forth with the client, the resulting commercial wasn’t bad at all. The first few times anyone—even the guy delivering sandwiches—saw it, they laughed. Afterwards, they smiled. It could have been okay. It could have gotten my partner and me out of that agency and to a better one.
After the shoot was completed, the client told our account executive that the product did not exist and the commercial was never going to run.
The client had known this going in. So why didn’t they let us win more creative battles? Because they wanted something soft and safe to show the boss who had the power of life and death over their budget.
Why did the boss give them $60,000 to produce a commercial for a product that didn’t exist? Because that’s how corporations work. If they didn’t spend advertising dollars in 1988, they wouldn’t get ad dollars in 1989, when (in theory) they would finally have a product to advertise.
Governments, at least the ones I know of, work the same way. Since last night, the city of New York has been paving 34th Street in places it doesn’t need to be paved. Why do they do this? To justify the budget. In a better world, money set aside to pave streets that don’t need paving would be reassigned to something the city actually needs—like affordable housing, or medical care for poor or homeless people. But cities are corporations—that Mike Bloomberg is New York’s mayor merely confirms this—and few corporations are agile enough to rethink budgetary distributions on the basis of changing needs.
Last week, in an airport, on one of the inescapable widescreen TVs set to CNN (and always set to the wrong resolution), I saw a commercial for a revolutionary children’s medicine product that melts instantly on the tongue.
OUR PERSONAL SITES, once our primary points of online presence, are becoming sock drawers for displaced first-person content. We are witnessing the disappearance of the all-in-one, carefully designed personal site containing professional information, links, and brief bursts of frequently updated content to which others respond via comments. Did I say we are witnessing the traditional personal site’s disappearance? That is inaccurate. We are the ones making our own sites disappear.
Obliterating our own readership and page views may not be a bad thing, but let’s be sure we are making conscious choices.
Interactive art director Jody Ferry’s site is a perfect example of the deeply decentralized personal page. I use the term “page” advisedly, as Jody’s site consists of a single page. It’s a fun, punchy page, bursting with personality, as intriguing for what it hides as what it reveals. Its clarity, simplicity, and liquidity demonstrate that Jody Ferry does indeed practice what the site’s title element claims: Interactive Art Direction and User Experience Design. All very good.
It could almost be the freshened-up splash page of a late 1990s personal site, except that the navigation, instead of pointing inward to a contact page, resume, blog, link list, and photos, points outward to external web services containing those same things. Mentally insert interactive diagram here: at left is a 1990s site whose splash page links to sub-pages. Structurally, its site map is indistinguishable from an org chart, with the CEO at the top, and everyone else below. At right, to re-use the org chart analogy, a site like Jody’s is akin to a single-owner company with only virtual (freelance) employees. There is nothing below the CEO. All arrows point outward.
Most personal sites are not yet as radically personal-content-outsourced as Jody’s, and certainly not every personal site will go this way. (Jody’s site might not even be this way tomorrow, and, lest it be misunderstood, I think Jody’s site is great.) But many personal sites are leaning this way. Many so inclined are currently in an interim state not unlike what’s going on here at zeldman.com:
There are blog posts here, but I post Tweets far more frequently than I write posts. (For obvious reasons: when you’re stuck in an airport, it’s easier to send a 140-character post via mobile phone and Twitter than it is to write an essay from that same airport. Or really from anywhere. Writing is hard, like design.) To connect the dots, I insert my latest Tweet in my sidebar. I have more readers here than followers at Twitter, but that could change. Are they the same readers? Increasingly, to the best of my knowledge, there are people who follow me on Twitter but do not read zeldman.com (and vice-versa). This is good (I’m getting new readers) and arguably not so good (my site, no longer the core of my brand, is becoming just another piece of it).
Like nearly everyone, I outsource discoverable, commentable photography to Flickr.com instead of designing my own photo gallery like my gifted colleagues Douglas Bowman and Todd Dominey. Many bloggers now embed mini-bits of their Flickr feeds in their site’s sidebars. I may get around to that. (One reason I haven’t rushed to do it is that most of my Flickr photos are hidden behind a “friends and family” gateway, as I mainly take pictures of our kid.) Photography was never what this site was about, so for me, using Flickr is not the same as outsourcing the publication of some of my content.
As I’ve recently mentioned, links, once a primary source of content (and page views) here, got offloaded to Ma.gnolia a while back. From 1995 until a few years ago, every time I found a good link, an angel got his wings and I got page views. My page views weren’t, brace yourself for an ugly word, monetized, so all I got out of them was a warm feeling—and that was enough. Now my site is, brace yourself again, monetized, but I send my readers to Ma.gnolia every time I find a link. Go figure.
I’m not trying to get rid of my readers, nor are you trying to shake off yours. In the short term, including Flickr, Twitter, and Ma.gnolia or De.licio.us feeds sends traffic both ways—out to those services, but also back to your site. (Remember when some of us were afraid RSS would cost us our readers? It did and it didn’t. With RSS, good writers gain readers while often losing traditional page views. But that’s another story.) I’ve certainly found new websites by going to the Twitter profile pages of people who write funny or poignant Tweets. Behind a great Flickr photo may be a great designer whose site you might not have found if not for first seeing that photo.
But outsourcing the publication of our own content has long-term implications that point to more traffic for the web services we rely on, and less traffic and fewer readers for ourselves.
This is not necessarily a bad thing. Not every person who designs websites needs to run a personal magazine on top of all their other responsibilities. If your goal in creating a personal site way back when was to establish an online presence, meet other people who create websites, have fun chatting with virtual friends, and maybe get a better job, well, you don’t need a deep personal site to achieve those goals any more.
But if world domination is your goal, think twice before offloading every scrap of you.
An authorized Belarusian translation of this article appears on designcontest.com.
Last year, a month or two before SXSW, I went on a movie star diet, all tiny portions of unseasoned unsucculent nothingness. I lost five pounds and wanted to murder the world.
This year I decided to skip desserts instead of dieting.
It’s amazing how many sweets you’re exposed to as the parent of a young child. Even if you don’t stuff your own larders with sugary treats, every weekend it’s some kid’s birthday party, where the cakes and ice cream flow like apple juice. In an environment where all that sugar and flour is normal, you partake without thinking.
So I started thinking.
Rejecting dessert soon became second nature. No birthday cake at little Johnny’s birthday bash. No fabulous pear thing when Grandma visited. No red velvet cake at the place in our neighborhood where it’s to die for. No exquisite little French pastries at the business lunch bistro. No little tin bowl of mango raisin coconut whatever at the best little vegetarian Indian place in Curry Hill. None for me, thanks. Not having any. It looks delicious, but no.
Man is a fallen creature and the devil weaves endless snares. I stuck to my no-dessert program through an onslaught of spectacular temptations. And then, like a fool, I succumbed.
Yesterday, the mother of the tot celebrating his third birthday came around with cupcakes baked into ice cream cones. Sugary vanilla frosting, M&M crumble topping, ordinary packaged cake batter, stock stubby cone—not even a sugar cone.
“No thanks,” I said, waving her away, but smiling to show that I appreciated the offer and did not judge anyone.
A minute later she came back, revolving them a few inches from my lips. “I made extras,” she said perkily.
“No thanks—well, okay,” I said, grabbing one of the things.
I wolfed it down. It was entirely as expected: an initial burst of pleasure followed by disappointment and regret. An absolutely ordinary child’s treat. Nothing special. No depth. Dutifully, no longer enjoying, I finished it all, even the dry, frostingless part deep in the little cone’s bottom.
It was like throwing away a marriage over a one-night stand with someone you met at a bus station.
Four years ago today, Tantek Çelik and Kevin Marks gave a presentation on real-world semantics. Working backwards from HTML extensions like XFN (created by Tantek, Matt Mullenweg, and Eric Meyer), the paper showed how designers and developers could add semantics to today’s web rather than starting from scratch or waiting for a “purer” markup language to bring us an “uppercase semantic web.”
As with ‘most all great ideas, the principles were simple and, in hindsight, profoundly obvious. Do what designers were already doing. Instead of toiling over new languages that might or might not get adopted, use existing (X)HTML attributes such as rel and class, and agree on such things as common class names for simple things like relationship definitions.
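In practice, the idea asks for nothing designers weren’t already writing. A minimal sketch (the name and URL here are illustrative; the rel values are genuine XFN terms, and vcard/fn are the agreed-upon hCard class names):

```html
<!-- XFN: the existing rel attribute describes the author's
     relationship to the person being linked to -->
<a href="http://example.com/tantek/" rel="friend met colleague">Tantek</a>

<!-- Shared class names layer semantics onto ordinary elements -->
<p class="vcard">Designed by <span class="fn">Jane Doe</span></p>
```

No new language, no new parser: any tool that understands HTML can read the relationships out of markup that renders fine in every browser today.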
On behalf of all web designers and developers, thank you, Tantek and friends, and happy birthday.
I’m afraid this is another of those entries outlining bizarre design decisions and perplexing usability quirks in the otherwise brilliant world of Apple computers and phones. The problem is sync. It can be done, but it often goes wrong, even for smart people who understand computers, haven’t hacked their equipment or broken the law, and are kind to dogs, cats, and children.
Here’s a particular setup: .Mac account. Tiger laptop at home, Leopard iMac at office.
On both Macs, you need to refresh your subscriptions (Calendar: Refresh All) before you sync for the first time at that location. Otherwise, sync deletes the subscribed calendars’ information. Just wipes it clean away.
And even if you Refresh All first, sync may wipe away your data, just because.
Fortunately, after sync erases your data, hitting Calendar: Refresh All again reinstates it, downloading saved data from .Mac.
Why does syncing on either Mac remove all the calendar events from subscribed calendars? It’s the opposite of what any user could possibly want. There’s not even a conceivable edge case where a user would expect “sync” to mean “I’m bored with my life. Surprise me. Make my calendar data disappear.”
One doesn’t sync to lose data. Losing data by syncing is the exact opposite of what a user expects—which also makes it the opposite of what the Macintosh experience promises and usually delivers.
.Mac sync is either partly broken; or correctly designed, but to absurdly limited scenarios; or designed so counter to a user’s expectations that it should only be run with instructions, which Apple does not provide.
Apple does not provide instructions because instructions imply a learning curve, and Apple’s pitch is that its stuff just works. One nevertheless expects at least a slight learning curve when using, say, GarageBand or Keynote. But not with sync. “Sync now” seems pretty self-explanatory, and no user doubts what’s supposed to happen.
Sync does give you a warning before dumping your data, and that warning provides a clue to what’s going wrong. It tells you that syncing will remove x number of items from your calendars, and even lists which items they are. In Leopard, it goes further, and shows you before/after views of items that will change.
Significantly, there is generally no change at all between the before and after views. Probably the “change” is to a part of the database that the user doesn’t see, and has to do with differing file formats or differing time-stamp conventions between Tiger and Leopard. A less buggy or better conceived interface would hide this non-information from the user instead of asking her to think about it.
Do I really need to see that “Lunch with Jim at 1:00” is going to “change” to “Lunch with Jim at 1:00?” Probably not, since, from my perspective as a human, the two items are identical. It’s lunch. With Jim. At 1:00.
If “Lunch with Jim at 1:00” is “different” from “Lunch with Jim at 1:00” to my Macintosh because Leopard and Tiger encode or store calendar items differently, or because Leopard and Tiger time-stamp event creation dates differently, that’s not information I need to know and it’s not a before/after view I need to see.
Before/after seems cool, and probably is if your data is actually changing. For instance, if you’ve changed one of your friend’s photos, it would be nice to compare the before and after views and decide which photo you prefer. But I’ve never seen before/after work that way. Changed photos just get changed. Before/after only seems to come into play on my networks when “Lunch with Jim at 1:00” is changing to “Lunch with Jim at 1:00.”
The irrelevancies I’ve just described must be endured, and the sequence (Calendar: Refresh All, then Sync, then Calendar: Refresh All again if data was lost during sync) must be performed in the order described, before syncing the iPhone. If, in a moment of derangement, you plop your iPhone onto its dock before doing the herky-jerky data dance I’ve just described, you will lose data not only from your iPhone, but also from .Mac, and then you will never get your data back.
Your mileage may vary.
There are always 100 people for whom everything works correctly, and some of them are always moved to tell me it works for them, and to imply that I’m somehow to blame for the obvious usability problems I’m clearly describing.
They are followed by a dozen Apple haters who want to believe that the lengthy and detailed description of a specific usability problem proves Apple makes bad products, and anyone who claims to enjoy using Apple’s hardware and software is a “fanboy.” Juvenile homophobic and misogynist name-calling often accompanies these messages of hope.
Here’s what I am actually saying.
On my two-Mac setup where one is on Tiger and the other on Leopard, I can make sync work, but I must carry out actions in exact sequences, and know the tricks to undo the damage that syncing inflicts on my data due to bizarre design decisions on Apple’s part.
A few times I have irretrievably lost data, although I was able to manually recreate it by emailing colleagues and asking, “When are we meeting?”
It reminds me of running an old analog mixing board in a dirty, smoky recording studio. Everything’s cool if you know which faders you must never touch, which inputs are dead, and how far to the left you can pan a sound source before shorting out the system.
There’s genius in the concept of sync, and it works magnificently when you’re, for instance, syncing just one iPod to just one Macintosh, always the same iPod and Macintosh.
It gets weird when syncing from home to office via .Mac across operating systems, and weirder when you throw hot iPhone action in.
How should sync work? Just like you think it should work. Just like the two arrows circling in on each other (sync’s icon) imply that it does work. Hitting sync at any time on any networked device should cause all the latest changes to be stored on .Mac and downloaded back to whichever connected device you’re using.
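For the sake of argument, the behavior a user expects can be sketched as a trivial last-write-wins merge—my illustration, not Apple’s code, with made-up event names and timestamps:

```python
# Illustrative sketch of user-expected sync: last edit wins,
# and nothing is ever silently deleted.
def sync(local: dict, server: dict) -> dict:
    """Merge two calendars keyed by event id.

    Each value is a (timestamp, data) tuple; the newer timestamp wins.
    Events present on only one side are kept, never wiped.
    """
    merged = dict(server)
    for event_id, (stamp, data) in local.items():
        if event_id not in merged or stamp > merged[event_id][0]:
            merged[event_id] = (stamp, data)
    return merged

home = {"lunch": (2, "Lunch with Jim at 1:00")}
office = {"lunch": (1, "Lunch with Jim at 12:00"), "demo": (1, "Product demo")}
print(sync(home, office))
```

The point of the sketch: under any sane merge, the only possible outcomes are “kept” or “updated to the newer version.” “Deleted because you synced” isn’t in the vocabulary.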
There’s a whole other discussion to be had on why the iPhone is supposed to sync to only one machine. (Sure, iPods do that because of DRM restrictions; but competitive PDAs can sync to any computer: home, office, you name it. Likewise with digital cameras. The iPhone is a phone, an iPod, a digital camera, and a PDA, but it syncs like an iPod, not like a digital camera or PDA, and that’s just dumb.) But we’ll save that one for a rainy day.
Sync long and prosper.
Addendum: Another crazy thing is that subscribed iCals from Basecamp don’t update upon refresh in Leopard. In iCal in Tiger, subscribed Basecamp iCals correctly refresh automatically when one selects Calendars: Refresh All. But in iCal in Leopard, subscribed Basecamp iCals do not refresh, period, no matter what one does. In order to “sync” Basecamp iCals in Leopard, one must delete the calendars every day, and subscribe to fresh copies. When one does this, one gets fresh calendar data, but sync fails due to “conflicts” that do not load in the frozen Conflict Resolver and thus cannot be resolved. It is, by any reasonable measure, an idiotic and self-defeating system. The basest ape would not design such a system. Obviously it is not operating the way Apple intended. How does one fix it? Apple isn’t telling.
Comments are now off, but you can read what others had to say when comments were open.
Everyone a writer, everyone a publisher, everyone a citizen journalist.
Everything that could be digital would be. Content wanted to be free. Then we had to get paid. But animated smack-the-monkey ads were so déclassé. Ch-ching, Google AdSense, ch-ching, The Deck advertising network, ch-ching your ad network here.
Everyone a writer, everyone a publisher, everyone a citizen journalist, ch-ching.
First the writers and designers did the writing. Then the non-writers who had something to say did it. Then the people with nothing to say got a MySpace page and the classy ones switched to Facebook.
And ch-ching was heard in the land. And the (not citizen) journalists heard it, and it got them pecking into their Blackberries and laptops.
And then the writers and designers, ashamed at rubbing shoulders with common humanity, discovered the 140-character Tweet and the Tumblr post. No stink of commerce, no business model, nothing that could even charitably be called content, and best of all, no effort. Peck, peck, send.
When you’ve flown that far from Gutenberg, the only place to travel is back.
Enter Lulu, all slinky hips and clodhoppers. Self-publishing is the new blogging. No more compromises. No more external deadlines. No more heavy-handed editors and ham-fisted copyeditors. No more teachers, lots more books.
You don’t need distribution, you’ve got PayPal. You don’t need stores: there’s only two left, and nobody buys books there, anyway. You don’t need traditional marketing. Didn’t we already prove that?
Apple and Microsoft and Netscape and Sun and Opera have been suing each other since the W3C started. What lawyers do has never stopped developers at Apple and Microsoft and Netscape and Sun and Opera from working together to craft W3C and ECMA specs.
And even if this time is different—even if, just this once, the existence of a lawsuit will stop a working group from working—I’m not sure it’s practical or advisable to cut browser makers out of the equation. For one thing, have you seen what the W3C comes up with when browser developers aren’t involved?
I can’t comment on the merits of Opera’s legal action because it is a legal action and I’m not a lawyer, let alone a lawyer versed in European antitrust law.
Going by history, I don’t think the lawsuit will prevent the members of the CSS working group from doing their jobs. If it does, then the title of your post will be borne out, and Bert Bos, as group leader, will take action.
The web standards movement needs leaders who are passionate, but their leadership must also make sense. Proposing change when the change makes sense is good. Proposing change because you are disappointed and frustrated isn’t good enough. Anger can be brilliantly motivating; but anger is not a strategy.
Facebook Considered Harmless
IN 1995, I RECKONED everyone would teach themselves HTML and start homesteading on the web. When that didn’t happen, I spent three years on a free tutorial I figured would give the world the push it needed. It didn’t.
I was an early blogger and a late user of blogging software because, why did anybody need blogging software? Wrong. Always wrong.
In 2004, some colleagues and I contributed to the “new” Blogger. We were excited by the thought of bringing well-designed, easy-peasy, standards-compliant web publishing tools to millions of people. Now everyone can do this, we thought. And millions did.
But not everyone, it turns out, wants to blog. Blogging is hard. There’s, like, thoughts and stuff that you have to come up with, even if someone else handles the whole “what should my blog be like and what should it do and how should it be organized and what should it look like” part.
No, what most people were really looking for—or at least, what most people have responded to since such things became available—were web gizmos as easy as farting and as addictive as cigarettes. “Social software.” “Web 2.0.” Swimming pools, movie stars.
All this to preface the unremarkable yet (to those who know me) strange fact that yesterday I signed up for Facebook. And spent several hours messing with it. And checked it this morning before making coffee, before making breakfast for The Wife and me, before bringing The Child her strawberry milk.
Facebook is pretty. It works with Ma.gnolia. It works with Twitter. In theory it works with iLike, except that you can’t add an existing iLike account to Facebook, which is lame and sucks and iLike’s fault, and the fact that I care and am bothering to share such trivia shows how deeply assimilated I have become over the past 24 hours, eight of which I spent sleeping.
As when I joined Twitter, the first thing I noticed was how many of my friends and colleagues were already there ahead of me. Why none of them had invited me to join, bastards, I leave to their consciences, not that I’m bitter. They redeemed themselves by responding within an hour or less when I asked to be their “friends,” not that I’m keeping score.
I don’t need more friends and I don’t need more contacts. I avoided most of the first-generation social software that was all about Rolodex building, and only gave in to the main one everyone knows and which I shall not name when a loved old client of mine invited me to join his network. Since I made that mistake, I get lots more mail, and lots more mail is something else I don’t need.
But I design interfaces so I’m supposed to know about this stuff. That’s the rationale behind my spending hours of billable time adjusting my Facebook preferences. The real reason, of course, for all this stuff, is that it provides a way to blow off work you should be doing, while creating the illusion that you are achieving something. At least in most offices, you can’t masturbate at your desk. But you can Tweet.