Designed by Happy Cog and launched today, The Amanda Project is a social media network, creative writing project, interactive game, and book series combined:
The Amanda Project is the story of Amanda Valentino, told through an interactive website and book series for readers aged 13 & up. On the website, readers are invited to become a part of the story as they help the main characters search for Amanda.
The writing-focused social media network is designed and written as if by characters from the Amanda novels, and encourages readers to enter the novel’s world by joining the search for Amanda, following clues and reading passages that exist only online, and ultimately helping to shape the course of the Amanda narrative across eight novels. (The first Amanda novel—Invisible I, written by Melissa Kantor—comes out 22 September.)
The site developed over a year of intense creative collaboration between Happy Cog and Fourth Story Media, a book publisher and new media company spearheaded by publishing whiz Lisa Holton. Prior to starting Fourth Story, Lisa was President, Scholastic Trade Publishing and Book Fairs; managed the publication of Harry Potter and the Deathly Hallows; and oversaw development of The 39 Clues. Before that she spent nearly a decade developing numerous bestselling, franchise-launching series at Disney.
Happy Cog’s New York office developed this project. The team:
Equally vital to the project’s success were Fourth Story’s leaders and partners, including:
Lorraine Shanley, Principal Advisor
Ariel Aberg-Riger (website, Twitter), Creative Development & Marketing Manager
JillEllyn Riley, Editorial Director
Dale Robbins, Creative Director
David Stack, Director, Digital Partnerships
Melissa Kantor, Writer
Peter Silsbee, Writer
Polly Kanevsky, Art Director
Sam Gerstenzang, Technology Consultant
Today’s launch is not the end of our relationship with Fourth Story Media. The Amanda Project will continue to evolve, and Happy Cog will remain an active partner in its direction and growth. We thank our brilliant collaborators and congratulate them on today’s milestone.
Back in 2000, CSS co-creator Bert Bos set out to explain the W3C’s design principles—“to make explicit what the developers in the various W3C working groups mean when they invoke words like efficiency, maintainability, accessibility, extensibility, learnability, simplicity, [and] longevity….”
Eventually published in 2003, the essay, although ostensibly concerned with explaining W3C working group principles to the uninitiated, actually articulates the key principle that separates great design from the muck we normally wade through. It also serves as a warning to Bert’s fellow W3C wizards not to seek the dark magic of abstract purity at the expense of the common good. Tragically for these wizards, and for those of us who use their technologies, it is a warning some developers of W3C specifications continue to overlook.
Design is for people
In his introduction, Bert summarizes the humanistic value that is supposed to be at the core of every web standard:
Contrary to appearances, the W3C specifications are for the most part not designed for computers, but for people. … Most of the formats are in fact compromises between human-readability and computer efficiency….
But why do we want people to read them at all? Because all our specs are incomplete. Because people, usually other people than the original developers, have to add to them….
For the same reason we try to keep the specifications of reasonable size. They must describe a useful chunk of technology, but not one that is too large for an individual to understand.
Over the succeeding 25 web pages (the article is chunked out in pamphlet-sized pages, each devoted to a single principle such as “maintainability” and “robustness”), Bert clearly, plainly, and humbly articulates a series of rather profound ideas that are key to the web’s growth and that might apply equally admirably to realms of human endeavor beyond the web.
The Web now runs on HTML, HTTP and URLs, none of which existed before the ’90s. But it isn’t just because of the quality of these new formats and protocols that the Web took off. In fact, the original HTTP was a worse protocol than, e.g., Gopher or FTP in its capabilities….
And that fact shows nicely what made the Web possible at all: it didn’t try to replace things that already worked, it only added new modules, that fit in the existing infrastructure. …
And nowadays (the year 2000), it may look like everything is XML and HTTP, but that impression is only because the “old” stuff is so well integrated that you forget about it: there is no replacement for e-mail or Usenet, for JPEG or MPEG, and many other essential parts of the Web.
He then warns:
There is, unfortunately, a tendency in every standards organization, W3C not excluded, to replace everything that was created by others with things developed in-house. It is the not-invented-here syndrome, a feeling that things that were not developed “for the Web” are somehow inferior. And that “we” can do better than “them.” But even if that is true, maybe the improvement still isn’t worth spending a working group’s resources on.
Shrinkage and seduction
In his gentle way, Bert seems to be speaking directly to his W3C peers, who may not always share his and Håkon’s humanism. For, despite what designers new to CSS, struggling for the first time with concepts like “float” and the box model, may think, Bert and Håkon designed the web’s layout language to be easy to learn, teach, implement, maintain, and (eventually) extend. They also designed CSS not to overwhelm the newcomer with advanced power at the cost of profound complexity. (“CSS stops short of even more powerful features that programmers use in their programming languages: macros, variables, symbolic constants, conditionals, expressions over variables, etc. That is because these things give power-users a lot of rope, but less experienced users will unwittingly hang themselves; or, more likely, be so scared that they won’t even touch CSS. It’s a balance.”)
This striving to be understood and used by the inexperienced is the underlying principle of all good design, from the iPhone to the Eames chair. It’s what Jared Spool would call usability and you and I may consider the heart of design. When anything new is created, be it a website, a service, or a web markup language, there is a gap between what the creator knows (which is everything about how it’s supposed to work), and what you and I know (which is nothing). The goal of design is to shrink this ignorance gap while seducing us into leaping across it.
What were once vices are now habits
You can see this principle at work in CSS, whose simplicity allowed us to learn it. Although we now rail against the limitations of CSS 1 and even CSS 2.1, what we are really complaining about is the slow pace of CSS 3 and the greater slowness with which browser makers (some more than others) adopt bits of it.
Note that at one time we would have railed against browser makers who implemented parts of a specification that was still under development; now we admire them. Note, too, that it has taken well over a decade for developers to understand and browsers to support basic CSS, and it is only from the perspective of the experienced customer who craves more that advanced web designers now cry out for immediate CSS 3 adoption and chafe against the “restrictions” of current CSS as universally supported in all browsers, including IE8.
If CSS had initially offered the power, depth, and complexity that CSS 3 promises, we would still be designing with tables or Flash. Even assuming a browser had existed that could demonstrate the power of CSS 3, the complexity of the specification would have daunted everyone but Eric Meyer, had CSS 1 not come out of the gate first.
The future of the future of standards
It was the practical simplicity of CSS that enabled browser engineers to implement it and tempted designers to use (and then evangelize) it. In contrast, it was the seeming complexity and detachment from practical workaday concerns that doomed XHTML 2, while XHTML 1.0 remains a valid spec that will likely still be working when you and I have retired (assuming retirement will be possible in our lifetime—but that’s another story).
And yet, compared to some W3C specs in progress, XHTML 2 was a model of accessible, practical, down-to-earth usability.
To the extent that W3C specifications remain modular, practical, and accessible to the non-PhD in computer science, they will be adopted by browser makers and the marketplace. The farther they depart from the principles Bert articulated, the sooner they will peter out into nothingness, and the likelier we are to face a crisis in which web standards once again detach from the direction in which the web is actually moving, and the medium is given over to incompatible, proprietary technologies.
Joshua Schachter explains why URL shorteners like TinyURL and bit.ly, originally created to prevent long URLs from breaking in 1990s e-mail clients and now used primarily as a means of monetizing someone else’s content, are bad:
They “add another layer of indirection to an already creaky system, [making what] used to be transparent … opaque,” slowing down web use by adding needless lookups, and potentially disguising spam.
Shorteners “steal search juice” from the original publishers. (For example, with the Digg bar and Digg short URL, your content makes Digg more valuable and your site less valuable; the more content you create, the richer you make Digg.)
“A new and potentially unreliable middleman now sits between the link and its destination. And the long-term archivability of the hyperlink now depends on the health of a third party.”
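The indirection Joshua describes can be sketched in a few lines: a shortener is just one extra lookup table standing between the reader and the real destination, and when the table’s owner disappears, every short link dies with it. The following is an illustrative toy, not any real service’s API; the `sho.rt` domain and class names are hypothetical.

```python
# A URL shortener is, at heart, one extra lookup between click and content.
# Hypothetical in-memory shortener, for illustration only.

import hashlib

class Shortener:
    def __init__(self):
        self.table = {}  # short code -> long URL

    def shorten(self, long_url):
        # Derive a short code from a hash of the URL (first 6 hex chars).
        code = hashlib.sha256(long_url.encode()).hexdigest()[:6]
        self.table[code] = long_url
        return f"http://sho.rt/{code}"

    def resolve(self, short_url):
        # The added layer of indirection -- and the single point of
        # failure: if this middleman vanishes, so does the destination.
        code = short_url.rsplit("/", 1)[-1]
        return self.table.get(code)  # None means the link is dead

svc = Shortener()
short = svc.shorten("https://example.com/a/very/long/article-url")
print(svc.resolve(short))  # the destination, one hop later
svc.table.clear()          # the middleman goes out of business...
print(svc.resolve(short))  # ...and the short link resolves to nothing
```

What was once a transparent, direct hyperlink now depends entirely on the health of that lookup table, which is exactly the archivability problem Joshua raises.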
Anyone who creates web content should read Joshua’s post. I’m sold and will dial way back on my use of the zeldman.com short URL. The question remains: what do you do when you need to paste a long, cumbersome link into a 140-character service like Twitter? (If you do nothing, Twitter itself will shorten the link via TinyURL.)
IA is about selling ideas effectively, designing with accuracy, and working with complex interactivity to guide different types of customers through website experiences. The more your client knows about IA’s processes and deliverables, the likelier the project is to succeed.
Agile development was made for tough economic times, but does not fit comfortably into the research-heavy, iteration-focused process designers trust to deliver user- and brand-based sites. How can we update our thinking and methods to take advantage of what agile offers?
About the magazine
A List Apart explores the design, development, and meaning of web content, with a special focus on web standards and best practices. Issue No. 273 was edited by Krista Stevens with Erin Kissane and Carolyn Wood; produced by Erin Lynch; art-directed by Jason Santa Maria; illustrated by Kevin Cornell; technical-edited by Aaron Gustafson, Ethan Marcotte, Daniel Mall, and Eric Meyer; and published by Happy Cog.
As in finance, so on the web: self-regulation has failed. Nearly ten years after specifications first required it, video captioning can barely be said to exist on the web. The big players, while swollen with self-congratulation, are technically incompetent, and nobody else is even trying. So what will it take to support the human and legal rights of hearing impaired web users? It just might take the law, says Joe Clark.
When broken links frustrate your site’s visitors, a typical 404 page explains what went wrong and provides links that may relate to the visitor’s quest. That’s good, but now you can do better. With Dean Frickey’s custom 404, when something’s amiss, pertinent information is sent not only to the visitor, but to the developer—so that, in many cases, the problem can be fixed! A better 404 means never having to say you’re sorry.
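As a rough sketch of the idea (not Dean Frickey’s actual code, and the function and field names here are invented for illustration), a smarter 404 handler captures what was requested and which page linked to it, reports that to the developer, and still shows the visitor something helpful:

```python
# Minimal sketch of a "smart 404": report broken-link details to the
# developer while showing the visitor a friendly page. Names are
# illustrative, not any framework's or Dean Frickey's actual API.

def handle_404(requested_url, referrer, notify):
    # Pertinent information for the developer: what was requested,
    # and which page contained the broken link (if any).
    report = {
        "missing": requested_url,
        "linked_from": referrer or "(direct hit or typed-in URL)",
    }
    notify(report)  # e.g., email or log it so the link can be fixed

    # The visitor still gets an explanation and a way forward.
    return (
        "Sorry, the page you requested doesn't exist. "
        "We've been notified and will fix the broken link."
    )

reports = []
message = handle_404("/old-article", "/archive", reports.append)
print(reports[0])  # the developer's copy: missing URL plus referrer
print(message)     # the visitor's copy: a friendly explanation
```

The referrer is the key detail: a 404 with a referrer usually means a fixable broken link on a real page, while one without usually means a typo or a stale bookmark.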
A proposal for a fonts working group is under discussion at the W3C. The minutes of a small meeting held on Thursday 23 October include a condensed, corrected transcription of a discussion between Sampo Kaasila (Bitstream), Mike Champion (Microsoft), John Daggett (Mozilla), Håkon Wium Lie (Opera), Liam Quin (W3C), Bert Bos (W3C), Alex Mogilevsky (Microsoft), Josh Soref (Nokia), Vladimir Levantovsky (Monotype), Klaas Bals (Inventive Designers), and Richard Ishida (W3C).
The meeting started with a discussion of Microsoft’s EOT (Embedded OpenType) versus raw fonts. Bert Bos, style activity lead and co-creator of CSS, has beautifully summarized the relevant pros and cons discussed.
For those just catching up with the issue of real type on the web, here’s a bone-simple intro:
Microsoft’s EOT (Embedded OpenType), based on the same standard CSS @font-face mechanism used to link raw font files, works harder to avoid violating your licensing agreement, and has long worked in Internet Explorer, but is not supported in other browsers, is not foolproof vis-à-vis type foundry licensing rules, and may also cause PC security problems.
The proposed fonts working group hopes to navigate the technical and business problems of providing real fonts on the web, and in its first meeting came up with a potential compromise proposal before lunch.
Like everyone these days, the W3C is feeling a financial pinch, which means, if a real fonts working group is formed, its size and scope will necessarily be somewhat limited. That could be a good thing, since small groups work more efficiently than large groups. But a financial constraint on the number of invited experts could make for tough going where some details are concerned—and with typography, as with web technology, the details are everything.
I advise every web designer who cares about typography and web standards—that’s all of you, right?—to read the minutes of this remarkable first gathering, and to keep watching the skies.
High-speed access for NYC internet professionals
I’m home watching a sick kid and waiting for Time Warner Cable to come make a third attempt to install a cable modem. If you’re good at math, that means Time Warner Cable, the market leader in my city, has twice failed to install the correct cable modem in my home.
Because the web never sleeps, even web professionals who work in an office need reliable high-speed access when they are at home. Speakeasy provided that service via DSL in our old apartment (our previous DSL provider having been wiped out, literally, on September 11, 2001), but, as documented in old posts on this site, it took two months of comedic mishap for Speakeasy to get our home DSL working. And after Best Buy bought Speakeasy, it became harder and harder to contact the company’s technical support people to resolve service problems—of which there were more and more. By the time we moved out of our old apartment in December 2007, frequent gapping and blackouts made our 6Mb Speakeasy DSL service more frustrating than pleasant to use.
The monopoly wins the bid
So when we moved to the new apartment, we decided to immediately install cable modem access as a baseline, and then secure reliable DSL access for redundancy. Time Warner Cable had set up a deal with our new building, and no cable competitor was available to service our location (you read that right), so Time Warner got the gig. They came quickly and the system worked immediately. The digital HD cable fails once a week, probably due to excessive line splitting, but that’s another story, and we don’t watch much TV, so it doesn’t bug us, and it isn’t germane here.
Unwilling to repeat the failures and miscommunications that marked our Speakeasy DSL installation, I went ahead and had Time Warner Cable set up the wireless network. It costs extra every month, and Time Warner’s combination modem/wireless/Ethernet hub isn’t as good as the Apple Airport devices I own, but it makes more sense to pay for a system that’s guaranteed to work than to waste billable hours debugging a network.
Due to the thickness of our walls, the wireless network never reached our bedroom, but otherwise everything was hunky-dory. Within a few days of moving in, we had reliable, wireless, high-speed internet access. Until Time Warner told us otherwise.
Last spring we received a form letter from Time Warner stating that they’d installed the wrong modem, and that we were not getting the service we’d paid for. Apparently this was true for all customers who chose the service. Some of our money was refunded, and we were advised to schedule a service appointment or come to the 23rd Street office for a free replacement modem.
I went to the 23rd Street office, took a number, and within about fifteen minutes I was sitting in front of a representative. I showed him the form letter and requested the new modem.
He asked me for my old modem.
I said I hadn’t brought it, and pointed out that I hadn’t been instructed to bring it.
We both reread the form letter.
“It’s implied,” the rep said.
“Implied?” I said.
“Sure,” he said. “If we’re going to give you a new modem, of course we’ll want your old modem.”
I guess it was implied. But it wasn’t stated. And when you charge an installation fee, a hardware fee, and a monthly service fee, and then give people the wrong modem, you probably shouldn’t rely on inference in your customer support copy. To avoid compounding your customer’s frustration, you should probably be absolutely explicit.
I didn’t say these things to the rep, because he didn’t write or approve the copy or send the wrong modem to all those homes. I left empty-handed and continued to use the modem we had. There didn’t seem to be anything wrong with it. Whatever the poorly written form letter had to say about it, as a customer, I didn’t have a problem with the modem.
A visit from a professional
As summer ended, Time Warner Cable sent me a new form letter. This time I was told, rather darkly, that if I failed to replace my modem, I definitely would not get the service I was paying for. Indeed, my service level would somehow be lowered, although it appeared that I would continue being billed a premium price.
So I called Time Warner, arranged a service visit, and spent the day working at home.
Around the middle of the service window, a Time Warner Cable authorized technician showed up with a regular DSL modem (not a wireless modem).
“You have wireless?” he asked in amazement.
“Yes,” I said. “Doesn’t it say that on your service ticket?”
“Hey, I’m just a consultant. I don’t work for Time Warner Cable,” he helpfully informed me.
“So are you going to get a wireless router from your truck?” I offered after a pause.
“I don’t have those,” he said.
We looked at each other for a while, and then he said, “Besides, you don’t need to replace your modem. There’s nothing wrong with it.”
Then he called someone to inform them that he hadn’t swapped modems.
Then he asked me to sign a form.
“What am I signing?” I asked. “That you didn’t do anything?” I said it more politely than it reads.
“You’re signing that I was here,” he said. So I did.
That evening, as I was bathing my daughter, Time Warner Cable called to ask if I was satisfied with the experience.
I said frankly I was confused why I’d had to stay home all afternoon for a service visit on a modem that didn’t need to be replaced.
The nice lady said she would talk to her supervisor and run some tests.
I was on hold about five minutes, during which my daughter found various ways of getting water out of the tub and onto me.
The nice lady came back on and said, “I’m sorry, sir, but we just ran tests, and you do have the wrong modem. We’ll need to send someone out.”
So here I am, two weeks later, waiting for a technician to come try again. Will this one bring the right hardware? The suspense is awesome.
Although New York is a leading creator of websites and digital content, the town’s home and office internet connectivity lag behind that of practically every other U.S. city. Two factors account for it:
An aging infrastructure. It’s hard to deliver first-rate internet service over a billion miles of fraying, overstretched, jerry-rigged copper line.
Monopoly. How hard would you try if you had no real competitors?
In future installments, I’ll discuss our adventures securing high-speed access to our studios at Happy Cog New York, and discuss the pros and cons of Verizon home DSL.
It’s back, it’s improved, and it’s hungry for your data. It’s A List Apart’s second annual survey for people who make websites.
Last year nearly 33,000 of you took the survey, enabling us to begin figuring out what kinds of job titles, salaries, and work situations are common in our field.
This year’s survey corrects many of last year’s mistakes, with more detailed and numerous questions for freelance contractors and owners of (or partners in) small web businesses. There are also better international categories, and many other improvements recommended by those who took the survey last year.
Please take the survey and encourage your friends and colleagues who make websites to do likewise.
[Comments off. Pings on.]
CSS Menu Writer debuts
Launched today, WebAssist Professional’s CSS Menu Writer™ for Dreamweaver takes the pain out of creating standards-compliant horizontal or vertical navigation menus with nested fly-outs.
I got to spend an hour with the program prior to its release, and was impressed with its flexibility and extreme ease of use. For instance, creating primary and secondary menu levels is as simple as pointing to your files and folders. If the client changes the approved site structure after you’ve already created your page templates, no problem: just drag files and folders to their changed locations and CSS Menu Writer will update your navigation.
The program comes with four horizontal and four vertical menus, each in 12 different color schemes—96 menus to start—with unlimited sub-levels. You can easily create Doug-Bowman-style “sliding doors” effects, as well as doing all the obvious stuff you’d expect to be able to do, like changing menu width, height, margin, and padding; swapping backgrounds and images; and saving custom creations as new presets to reedit or share with colleagues. The program also integrates easily with Eric Meyer’s CSS Sculptor.
CSS Menu Writer costs $99.99, but if you buy before May 27, it’s just $74.99.