Of Google and Page Speed

Our visually and behaviorally rich sites are about to lose precious Google juice, WebSiteOptimization.com reports in a new piece titled Page Speed Factored into Google Search Rankings:

Google’s addition of a page speed signal to its search rankings algorithm officially links performance with search engine marketing. The loading speed of a web page affects user psychology in a number of ways, and now it can affect its rankings as well.

This back-to-basics message catches us at a funny time in web design history.

“Make more of less” has long been the norm

Most of us who’ve designed sites for quite a while, and who consider ourselves user- and standards-focused, have traditionally designed sites that loaded faster than the competition. We did it by exploiting caching (external CSS instead of table layouts, linked instead of inline JavaScript, and so on). For many, many years, we also did it by keeping images to a minimum, using system fonts instead of pictures of type, CSS colors instead of faux backgrounds, and so on.

As the web audience grew, heavily trafficked sites became even more restrictive in their decorative flourishes, whether they cared about web standards or not. Thus Google, while happily using bad CSS and markup, exerted monk-like discipline over its designers. Not only were images out; even details as small as rounded corners were out, because the tiny images needed to produce rounded corners prior to CSS3 added a tenth of a kilobyte to page weight, and a tenth of a kilobyte multiplied by a billion users was too much.
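
To make the scale concrete, here is a back-of-the-envelope calculation in TypeScript; the billion-page-views figure is illustrative, not an actual Google statistic:

    // Cost of one "tiny" decorative image at enormous scale.
    // The page-view figure below is an assumption for illustration only.
    const assetKB = 0.1;                       // a ~100-byte rounded-corner image
    const dailyPageViews = 1_000_000_000;      // suppose a billion page views a day
    const extraKB = assetKB * dailyPageViews;  // 100,000,000 KB of extra transfer
    const extraGB = extraKB / (1024 * 1024);   // roughly 95 GB per day
    console.log(`${extraGB.toFixed(0)} GB of extra transfer per day`);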

Of late, we have grown fat

Yet in the past few years, as broadband became the norm, every mainstream site and its brother started acting as if bandwidth didn’t matter. Why use 1K of web form when you could use 100K of inline pseudo-Ajax? Why load a new page when you could load a lightbox instead?

Instead of medium-quality JPEGs with their unimportant details painstakingly blurred to shave KB, we started sticking high-quality PNG images on our sites.

As these bandwidth-luxuriant (and not always beautiful, needed, or useful) practices became commonplace on mainstream sites, many advanced, standards-focused web designers were experimenting with web fonts, CSS3 multiple backgrounds, full-page background images, and other devices to create semantic, structurally lean sites that were as rich (and heavy) as Flash sites.

So now we face a dilemma. As we continue to seduce viewers via large, multiple background images, image replacement, web fonts or sIFR, and so on, we may find our beautiful sites losing page rank.

Read the report and watch this space.

33 thoughts on “Of Google and Page Speed”

  1. I think Google will only penalize the criminally slow, so that even websites that don’t follow all good practice will still ‘do okay’. I quote: “fewer than 1% of search queries are affected by the site speed signal in our implementation”.

  2. Personally, being an accessibility person and a developer, I really don’t have to worry about this issue. My website is built with XHTML and CSS. Once in a great while I will add an image to one of my blog posts.

  3. Well said. It’s going to be tricky now choosing speed over looks and functionality. One other big dilemma: what to do about all those sites we built during our ‘growing fat’ period? Oh dear! ;) I’m going to really miss high-quality PNGs!

  4. I’m with Ian. Based on what Google has said, I think this is going to be a real-world issue for hardly anyone.

    That having been said, I do think we’ve gotten a bit lazy about making our sites fast, and it’s kind of sad, because the newer technologies and techniques you mention (AJAX, better image formats, CSS3, etc.), as well as many you don’t (everything from memcached to nginx to NoSQL to YSlow to Comet), actually set us up for having much, much faster websites than in days past. We have all the tools; we’re just often lazy about using them.

  5. Hmm, I think AJAX could actually help the situation. Sure, you have a bit of a hit for the scripts loading, but if they are minified that is minimal.

    The thing about AJAX is that the page is technically loaded before it does anything. Do you have a large table or interactive search that happens after a button click? Rather than spending time querying and building tables on the backend (thus slowing down the initial load), simply retrieve the data when the user needs it (see the sketch at the end of this comment).

    It’s a good thing overall for Google to push people in this direction. But like Ian points out, it will probably only matter for extreme cases. Really, I believe Google is in the business of giving results that people like. Many times, people like an interactive experience (especially video). Content is king too, regardless of whether it takes 2 seconds to load instead of 1 second.
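
    A minimal sketch of that fetch-on-demand pattern (shown here with the modern fetch API; the /search-results endpoint and element IDs are made up):

        // Load the heavy result set only when the user asks for it,
        // so the initial page stays small. Endpoint and IDs are hypothetical.
        const button = document.querySelector<HTMLButtonElement>('#search-button');
        const resultsContainer = document.querySelector<HTMLElement>('#results');

        button?.addEventListener('click', async () => {
          const response = await fetch('/search-results?query=widgets');
          if (!response.ok) return;                  // fail quietly in this sketch
          const html = await response.text();
          if (resultsContainer) resultsContainer.innerHTML = html;  // inject the table on demand
        });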

  6. Yeah, I agree with Jason P: content has to be king, and load time doesn’t make it any more relevant. In fact, I fail to see the connection at all – it seems more like Google trying to steer the direction of SEO and overall site construction. Thankfully they don’t prefer table layouts or something ridiculous, because with these kinds of motives anything is possible.

  7. I’d like to see how Google handles Flash sites in this respect. The HTML/CSS/JS container site for your average big ad agency Flash site is generally tiny – all the painfully slow loading happens inside Flash. I’d hate for us to be penalised simply because we don’t hide our page load inside a SWF.

    Or worse still, I’d hate for these sites to be rewarded because the part that Google sees (“You need Flash to see this site”) is so quick to load.

    Vaughan –

    Google is simply trying to steer users towards relevant and satisfying results. A page that is painfully slow is not relevant (because you can’t load it), nor satisfying (for obvious reasons). I think it makes perfect sense for Google to put less weight on sites which are critically slow. Google is only penalizing those which are criminally slow, not sites which are simply a bit pokey.

    I feel a mass-freakout coming on. Please, let’s be rational. Google began using speed in its results a few weeks ago, and no one noticed the change. There is absolutely no evidence to back up the concern that your pages are suddenly going to drop in their results because they aren’t fast enough, so there’s really no need for freakouts or cries of Google being evil.

    Google’s business is built on giving users good results for their search queries. They’re damn good at it, and they have been for years. They’re constantly making changes to their algorithms — some we hear about, and some we don’t. And yet, their engine continues to give great results.

    There’s nothing to worry about, here. Relax, everyone.

  9. This feels like a problem that Google is universalizing onto everyone. Sure, 0.1K x Google’s daily traffic = a major difference, but how many of us design sites at that scale? I certainly don’t.

    For our business’s purposes, the added weight of a few more graphics and scripts produces a better interactive product. This furthers our business’s ends. I already author my sites with clean, tidy, efficient practices. Could I minify code, heavily compress images, and strip out superfluousness at every turn to shave a few KB and cut load times by a human-imperceptible amount? Absolutely. Should I? I tend to think not — it seems counter-productive.

    I understand the surface-level rationale for this, and bravo to Google for that. Even so, I can’t help but think that Google is making its problem mine. To that end, I hope Ian Parr and Jeff Croft are right that this won’t affect most rankings significantly.

    Thanks for sharing this, Zeldman.

    (Unrelated aside: How great is the Web that a nobody like me can share my thoughts in the same space as somebody like a Jeff Croft, whose work I greatly admire? The Internet is a wonderful place.)

  10. I am glad to hear load time will be factored into page rank. While we have all these technologies that can make our pages heavy, used well they can add so much to a site without adding much weight.

  11. If you build sensible/user centered websites, you have nothing to worry about. If you build image heavy bandwidth monsters disguised as websites, shame on you, and now you must be punished.

  12. I don’t think this is so much an issue of “maybe we should pull back on the use of developing technologies” as it’s a case for reduced use of resource-intensive tools such as Flash. Like Ian, I would think that for the majority of standards compliant websites – even those that use newer methods such as jQuery and CSS3 – this won’t have that much of an impact. In fact, it may allow them to stand out as models of more efficient development, and present another bargaining chip for advocating standards based design to clients. I think it’s too early to gauge what real effect this will have on websites, though.

    Your points regarding ‘laziness’ definitely contain a grain of truth, though. Thus far, I’ve experimented here and there, but have always used newer techniques sparingly, although it’s been tempting to deliberately cater to a certain demographic in order to tailor my blog to be less friendly to Internet Explorer.

  13. Adding performance to the Google algorithm is actually a great thing – study after study is finding that users really care about how quickly a site loads. Even if Google weren’t using page load time as a factor, it’s something we should pay attention to.

    And I have to agree with Jeff – I think a lot of people have the misconception that a site that loads quickly and performs well cannot also have beautiful graphics and beautiful effects, and they are therefore on the verge of “freaking out”. This simply is not the case.

    Take the Happy Cog site (though it’s admittedly not a very image-heavy site) for example. A quick run through Smush.it shaves ~15% off the image sizes, and running the two unminified scripts through a compressor shaves ~30% off their combined size.

    In addition, Gzip could be enabled and the JavaScript could be moved to the footer (and combined into one file if you wanted). Those are just a few quick changes that would already significantly improve the performance of the site – and the appearance would not be affected one bit.
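
    As a rough sketch of the Gzip-and-caching idea, here is what it might look like on a Node server using the express and compression packages (an assumed stack for illustration, not what Happy Cog actually runs):

        // Gzip responses and serve static assets with long-lived cache headers.
        // The Node/Express stack here is an assumption for illustration only.
        import express from 'express';
        import compression from 'compression';

        const app = express();
        app.use(compression());                               // gzip HTML, CSS, and JS responses
        app.use(express.static('public', { maxAge: '30d' })); // far-future caching for static files

        app.listen(3000, () => console.log('listening on :3000'));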

  14. While Google is going after the morbidly obese in their search results (and kudos), users are demanding quicker page downloads like we did in the modem days, especially because of the explosion in mobile devices like the iPhone and now the iPad. Users want much the same experience on a device pulling from a crappy AT&T 3G network. Standards-based sites that are flexible enough for mobile devices (like this site – it looks great on the iPad I’m typing this on) are going to win out. I hope Google weighs page load speed even more heavily for search results served to mobile devices.

    I love the clients who come to me and ask why their sites are slow or don’t look good on the iPhone or now the iPad, and I show them the iPad-optimized sites being featured on the Apple site, or better, a site that works on multiple devices. “Remember when I was going on about creating a standards-based website and you went with that high-priced interactive firm to create your Flash-based site? This is what I was talking about!”

    This Google report just gives us standards-loving designers yet another arrow in our quiver when we educate our clients on making better business decisions and more flexible design decisions.

  15. I think we need to be careful not to conflate two different things under “page speed” here. Page ‘speed’ can mean slow web servers resulting in long HTTP requests (which is what the Googlebot would encounter), and I assume that’s the major issue Google is measuring with regard to speed — the speed pages are served at, not the speed they render images/JS/whatever.

    There’s mention of Google using Google Toolbar data too, but it’s unclear exactly what this means. It’s possible this has more to do with page rendering speed, as opposed to page serving speed — anyone have more info on this?
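
    For what it’s worth, the two measurements can be compared in the browser’s Navigation Timing API (a later addition than the tools under discussion); a rough sketch:

        // Serving speed (time to first byte) vs. full load time for the current page.
        window.addEventListener('load', () => {
          // loadEventEnd is only recorded after the load handlers finish,
          // so read the timing entry on the next tick.
          setTimeout(() => {
            const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];
            if (!nav) return;
            const ttfb = nav.responseStart - nav.startTime;     // how quickly the server answered
            const fullLoad = nav.loadEventEnd - nav.startTime;  // how long until everything loaded
            console.log(`TTFB: ${ttfb.toFixed(0)} ms, full load: ${fullLoad.toFixed(0)} ms`);
          }, 0);
        });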

  16. I think some clear standards are necessary. Usually page speed has a more Darwinian effect on your traffic (especially the bounce rate), but adjusting your ranking is kind of harsh.

    I factor bandwidth requirements into every site I build, but I base it on the targeted audience. If you are surfing the web for luxury cars, I might assume you have bandwidth and serve you up nice full-screen, high-res images, and some nifty Flash or jQuery stuff too.

    Take Audiusa.com as an example. It rates 71 out of 100 on the Firebug Page Speed plugin. Does this mean some garage that services Audis might rank higher on the strength of a plain-Jane site?

  17. Seriously, if the new YouTube site design is any indication of what Google would like to see around the web, I’m frightened by this new element of their algorithm. Google is notorious for taking simplicity to the extreme. Simplicity is good, even great, in its own right, but it too can be taken too far. Apple has a much better grasp of the balance that is needed between simple and elegant.

    Apple’s focus is on heuristics, whereas Google, I fear, is too focused on stringent rule-following and fanatical about mathematical approaches to human problems. Their “simplify” chorus may sound good on paper, but in the real world (as Douglas Bowman has indicated) it can get a team of professional elites mired in endless testing of useless endeavors. http://bit.ly/google-duh

    Has anybody thought of this, another angle for Google’s concern in this area: if Google is providing pipe to major cities, free broadband or some other variety, they may very well be trying to pre-emptively conserve resources for their own profit. The lighter the pages, the lower the impact on their service. The heavier the pages, the lower they get ranked, so they are less of a burden on Google’s infrastructure. Just a thought.

    Either way, I agree with Vaughan Magnusson. I don’t see the relevance, at least not to the level some others have stated. Now, if this is merely aimed at sites that take an extreme amount of time (say 2-3 minutes) to load, then I suppose I’m for the change as well. But if we’re getting into a war between 2-second and 4-second page load times, I call foul on Google.

    Jeff, I can appreciate your point: a painfully slow page is unquestionably unpleasant for the user and should be discouraged.

    The point I’m making, however, is this: say, for example, that flowers and gentle waterfall sounds in the page background gave better readings for blood pressure or perceived credibility – should that be taken into account when ranking results? Clearly not, and while this is a ridiculous example, the same reasoning is at play.

    I suppose my question is, if a user responds more favourably to a certain operating environment, is this more important than the accuracy of the result? Google’s move would indicate that they believe this “operating environment” further refines the accuracy of the result – but I’m not so sure.

    Also, I’m with you on that one, Ron: there are other, more commercial interests that could be at play here with Google. Surely data/bandwidth is a very real factor in their bottom line.

  19. I’m in agreement that we’ve gotten “fat”. I often have to argue against requested designs or features being too large on sites I’m to design and build, only to hear that “it doesn’t matter anymore” from folks who have no clue about bandwidth in the first place.

    It’s simple – mobile web use is up and growing while mobile networks are notoriously saturated (the iPhone’s especially). Small, lean websites are in demand again for this reason, even if you’re pushing separate mobile versions of your “desktop” pages.

    Faster is always better for any web page, and the optimizations often pay off in reduced hosting costs and, hopefully, easier-to-edit source.

  20. I don’t want to say “I told you so,” but I wrote about this issue in the discussions at ALA before, and it seems not everyone in the community wanted to believe that loading times and website size still matter. In fact, it looks like we have all forgotten where we came from. And there are still some people, whole nations even, on slow connections. While we have been very strict about browser compatibility in the past, our behavior with regard to download speed has been very generous.

    But we shouldn’t now forget everything we’ve gotten used to in the past two to five years. Styling a website for a better, more living experience of the web is still necessary. Using every damn gimmick, like some did in the late ’90s, isn’t in any way helpful for evolving the web.

  21. @SeanJA OK, that was an overly simplistic example; that page wouldn’t rank. But think of a progressively enhanced Flash page with proper fallback content (i.e. an HTML version of the site sitting underneath for non-Flash users).

    The HTML version might be a perfect, standards-based, fast-loading HTML/CSS site, and this is what Google will see (and, I assume, use to measure a page’s loading time).

    But the Flash version that is shown to most users might not be as fast-loading – it has all the bandwidth-heavy bells & whistles required by a multinational’s marketing department. The wrapper SWF that’s actually embedded on the page can be tiny, but is used to load heavier content in.

    Will Google penalise a 50 KB page with an embedded SWF that loads another 10 MB of content more than a 300 KB pure HTML/CSS/JS page? Hope so…

  22. Hm, that’s been rumored for a while, but it seems it’s getting closer to reality.
    Those of us using Google Webmaster Tools will find a Site performance option under Labs. I discovered today that it now takes cacheable, Google-served JavaScript into account. Gzip seems to be held in high esteem as well.

    The report is something along these lines:

    On average, pages in your site take 3.2 seconds to load (updated on Apr 10, 2010). This is faster than 52% of sites.

    I’d be very interested in how Google treats Flash, and in what happens when there’s a graceful fallback.

  23. Regarding “of late we have grown fat”: I think we all need to remember the statement (I cannot remember who said it originally) that just because broadband is 10x faster doesn’t mean an image (read: website) should be 10x bigger; users expect 10x the speed, so the image (website) should stay the same size.

  24. That looks like a good example of a de facto monopoly forcing changes and decisions on a whole industry.
    Basically, Google thinks every website on the planet should look like their homepage?
    ;-)

  25. Fantastic… something else to slap my web design students around the head with as they yet again ignore my pleas to optimise images!

  26. A sidebar would be to think about the data side of the house. I have found that Steve Souders – http://stevesouders.com/ – explains the impacts of bloated websites and the issues from a data-delivery perspective. You know the benefit of sprites: one 10 KB image can render a page faster than a bunch of tiny images loaded individually. But this also has an impact on the server(s) that serve the content. Each request requires the server to do something, so being smart can reduce server load. The other point, more pertinent to this discussion, concerns large sites serving lots of content. Consider the bandwidth savings, storage savings, power consumption, additional server clusters, etc. No need for Nielsen-like text-only approaches, but being smart about imagery, JS, CSS, etc. can lead to efficient and fast sites.

  27. I don’t believe a site should be ranked lower because it uses more images; I personally really like nice, sharp images. As long as sites are not penalized for having extravagant designs, there will be no problems. Improving the efficiency of the internet as a whole is a great idea: the faster and more responsive websites are, the better. Perhaps a move in this direction will stimulate the growth of better technologies for loading beautiful websites faster. I also like the idea that existing sites will now need further SEO work, perhaps creating a few more clients for people like us.

    Google’s power over the internet does worry me though. We have seen how such power can hurt the internet [IE6].

  28. Designers and developers really should be re-educated about bandwidth and page speed, but not for the reason you mention. Page rank will not be affected for 99% of all sites, so the “dilemma” is not for the sake of SEO.

    The real dilemma is that internet access is still spreading to emerging countries with poor connections, and even in developed countries usage is shifting to mobile. On top of that, existing users expect snappy web pages, where before they’d happily wait a minute for that wonder, the internet, to load a page. Those three combined are the reasons speed matters, not this 1% SEO thing.

  29. Ian and Jeff are 100% correct. This isn’t about sites that may be a little slow. It should have zero impact on sites that are served with reasonable speed. It’s about the behemoth sites served on slow connections.

    Has anyone considered that it may partially be marketing? Google knows the influence they hold over the web community. If they want to push the trend toward more optimized sites (which leads to a better experience for everyone), announcing policies like this is a good way to do so.

    Also, by de-emphasizing slower sites I’m sure they can save — or at least better utilize — precious CPU cycles when spidering sites as well. Even lowering the request time-out threshold by fractions of a second could lead to sizable efficiency gains. Then there is the bandwidth play Ron mentioned.

    Bottom line: Just continue to make clean, semantic, optimized sites. You’ll be fine.

  30. I have to be honest: this is the most ridiculous debate. Being a web designer/developer and a professional musician, I find the majority of websites out there extremely boring and mediocre. Most follow the same bland, uninteresting formats, and visually they are pretty pathetic. I understand that enterprise sites must be what they are, that small corporate sites must follow what their customers want, and so on. However, in the music and visual arts industries we WANT sites that are pleasing to the eye AND interactive. The sites are rich in photographic content, musical content, and now video content. If a site takes 2 to 3 seconds to load behind a preloader… who cares? Usually, and I repeat, usually, the sites are beautiful, full of vibrant content, and even exciting at times. We in this industry – music, film, fashion, art – don’t want to read pages and pages of some clown’s pseudo-witty drab. We want to celebrate our creations and others’ creations: music, art, photography, film, and personal galleries. I read a lot of books, so I do not find it necessary to read pages and pages of other people’s opinions. For me and hundreds of thousands of people in the performing arts world, the web is there to look at beautifully created websites and their contents. Yuppies over there and artists over here. We should not be penalized for content. Have a little patience, for God’s sake. Google should be “bitch-slapped”. If 3 to 4 seconds is too long for you to wait while the preloader loads a content-rich site, well… there’s something very wrong with you.

  31. I definitely felt this slap from Google. After it went live my rankings tanked across the board. It took me 6 weeks to figure out. I cut my template size in half and now my rankings are climbing back up. My site was rated slow on the Webmaster Tools graph; now it touches the fast rating on the Google graph. No content or link changes, only reduced template size and image sizes. I have heard of about 6 other people having the same issue.

Comments are closed.