Monday, August 18, 2008
I just got through a bit over half of Bryan Eisenberg and John Quarto-vonTivadar's Always Be Testing: The Complete Guide to Google Website Optimizer on the plane from Toronto to San Jose. In the last hour of the flight, I quickly scanned the remaining chapters. Funny and all too familiar story: I left the book in the seat-back pocket when I got off the plane, so I expect United Airlines personnel, or the next passenger, to begin feverishly improving their online presence any day now.
The "hard" sections of the book are the deep underpinnings of consumer motivations: personality types, goals, personas, etc. Anyone who has attended an Eisenberg conference session will have glimpsed these. These are insights worth digesting carefully - and are the most difficult to put into practice. Professionals only, please.
The inspiring sections come early on, where the authors simply do a great job of making the case for testing at all. In my mind, they're rivaled only by Seth Godin in subtly shaming marketers for letting organizational inertia stop them from testing. I particularly liked the suggestion that you can get your testing motor running by choosing a key landing page and driving paid search traffic to it. (Google Website Optimizer will measure conversions from all types of traffic, but it's clear you can accelerate your testing towards profitable conclusions by sending relevant paid traffic to the test page in a fast spurt. So while Google doesn't tie use of its free product to use of Google AdWords, they certainly stand to benefit from increased advertiser confidence.)
The actionable sections are sprinkled throughout. In the first half of the book you get a nice tactile sense of what you can test right now: the key drivers that can vault a small company's conversion rates up 100%, or a large company's page performance up by 25% -- and if the page sucked in the first place, the improvements might be even greater.
There's also an interesting discussion of offbeat types of testing that measure outcomes other than simple conversions: divergent paths, specific clicks, time on site, and so on. Although web analytics folks have often churned through such data and pontificated in the general direction of management, it's safe to say few have kicked it up the required notches to turn those stats into actionable tests. Marketers like you, me, and the authors are evidently going to be stretching the capabilities of Google Website Optimizer well beyond its initial build. The good folks at Google Analytics and Google Website Optimizer have produced a robust initial product, but Eisenberg et al. won't just pat them on the back and leave it there. This amazing free tool is no doubt going to gain a whole bunch of new capabilities on top of its existing solid core.
Cool examples abound. The small world of conversion science already holds the keys to much improved e-commerce performance, in a kind of database of ideas (not certainties -- ideas about what you can try). Take Dell changing the phrase "Learn More" to "Help Me Choose," and then revamping some of the subsequent content accordingly. Which approach do you think works better to close a sale?
There are reasons testing aficionados will continue to run up against organizational resistance. Implicit web developer assumptions about information architecture often stop at pleasing, but ultimately non-closing, user patterns. Eisenberg et al. are no slouches at information architecture -- indeed there is a meaty section on designing better categorizations in this book. But the default initial build (or three) of a large company's site might still tilt too much toward: "put our information out there, install a cart system, and hope they buy." Conversion science is about asking for the sale, in the granular context of particular site visitors and their needs. And no, it doesn't always have to be a sale. But if it's not some kind of measurable event, then it's gossamer (ain't it?).
Comprehensive "catalog style" sections on the elements and minute sub-elements you could test serve as a nice complement to the tactile "here are some basic ways to test" sections. Not to hype ya, but this little catalog of testing ideas could be worth tens or hundreds of thousands of bucks to your company. Eisenberg has generously open-sourced it.
I bumped into Bryan just minutes after getting off the plane, and he stressed that while the catalog-style section of testing elements is overwhelming on the surface, the book is meant to be the type of reference that sits on your desk to be used whenever it's needed. I'll certainly have one on mine, and copies for my team... after I replace the one I just left on the plane, of course!
Labels: bryan eisenberg, conversion rates, google website optimizer
Wednesday, July 02, 2008
Through the magic of dynamic keyword insertion in the ad title, GM's AdWords ad serves up the headline "2009 Chevy Cobalt" when you type "2009 Chevy Cobalt SS." When you click, you're taken to the page for the 2008 Chevy Cobalt, of course, since they're still trying to clear those out. Time to hit the back button: less than a second.
General Motors can sort of afford the lost click (at $2-3 a pop) misleadingly spent on their own brand term, but for advertisers who can't absorb this type of waste, the causes of poor conversion and high bounce rates are often just this simple.
Labels: conversion rates
Tuesday, May 01, 2007
Looking for a white hat SEO or SEM firm to help out, company X lists their URL in their request for proposal. The site, specializing in heating & air conditioning products, has a panoply of ugly-ass, older-generation link farm links at the bottom. They're way off topic. "DSL brokers." Etc. Obviously the poor company has been hoodwinked by a link farm vendor.
But that red flag is going to make it hard for them to find a good vendor. They probably need to clean up their site of their own accord, lest reputable helpers shun them like the plague, wondering what other skeletons may lie in the closet.
If a potential vendor is this shy about your home page, imagine how it must look to a customer. Conversion rate woes? Think about how credible you look to an unbiased third party... or even a biased one!
Labels: conversion rates, link farms
Wednesday, April 04, 2007
Now that Google Website Optimizer is out of beta, more businesses will begin testing their landing pages to improve conversions.
As I was fairly familiar with the basic product features already, Google's Tom Leung and I had the chance to talk informally about some of the benefits of putting this product in many hands.
One issue I raised was how to weigh "advice sessions" and "clinics" and the like. The analogy is a bit like American Idol... it would be very entertaining to see someone donning a Simon Cowell wig and blurting: "That's rubbish! That page will never convert! I mean just look at how small that search box is, and the abominable use of tables. And that paragraph about shipping. So trite. In short, I got nothing out of this and I'm wondering right now why I even bother to sit through this." In other words, Simon's usability advice would be hit-or-miss.
Compare that with the flipside: a distinctly un-Cowell-like Talent Optimizer that would input various pitches, intonations, arm lengths, dance gestures, wardrobe elements, and facial expressions into a virtual performer... and measure the correlations of each element to positive responses from the paying audience. ("Dr. Clark, it appears the optimal arm length for Celine Dion is a full foot shorter than we've been using! Egads! And look at that fingernail data! Midnight blue is kicking butt!")
Everything in its place. Just as we don't really quite want a Talent Optimizer (though record labels and boy band promoters probably have something close to that in the underground lab) judging American Idol, we need to move beyond Cowell-like subjectivity in our ecommerce efforts. A multivariate testing process is exactly that: a way of replacing subjective judgment with measured audience response.
I managed to get a lengthy riff out of Leung on why Google doesn't recommend Taguchi optimization. That was something I noticed right away in the Website Optimizer literature, and Leung provided more color on it. The upshot: you really do need to test all potential combinations rather than a reduced combination set, because Taguchi's fractional designs assume the variables don't interact -- a poor assumption for landing page elements. You can semi-Taguchify your process by hand. I won't bore you with the details, but rest assured that whether you go with zero Taguchi, semi-Taguchi, or Taguchi on a Taco, this will likely be a step ahead of a simple A/B test, and many steps ahead of not testing at all.
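To make the full-factorial vs. Taguchi contrast concrete, here's a quick sketch (my own illustrative element counts, not the book's) showing how fast the combinations pile up when you test every permutation of page elements:

```python
from itertools import product

# Hypothetical landing-page elements (illustrative numbers, not from the book)
headlines = ["H1", "H2", "H3", "H4"]                   # 4 headline candidates
hero_images = ["img_a", "img_b", "img_c"]              # 3 image candidates
cta_buttons = ["Buy", "Learn", "Try", "Go", "Start"]   # 5 call-to-action labels

# Full factorial: test every combination of every element
full_factorial = list(product(headlines, hero_images, cta_buttons))
print(len(full_factorial))  # 4 * 3 * 5 = 60 combinations

# A Taguchi-style fractional (orthogonal) design would sample only a subset
# of these 60 -- cheaper in traffic, but it assumes the elements don't
# interact, which is the very assumption Google advises against here.
```

The point of the full grid is that it captures interactions (maybe headline H2 only wins when paired with img_b), which a reduced combination set can miss.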
Because testing is almost always better than not testing - and usually so much better that it even compensates for the risk of "messing with" a home page that ranks well organically - it's hard to see a significant downside.
Tom and I scratched our heads a bit trying to come up with an answer for the question: roughly speaking, can you make any serious errors running such tests? Setting aside technical snafus and things you might do to ruin your website by misinstalling code (your problem), the answer is basically no. The biggest "error" would be to pick the wrong things to test -- in other words, not improving as much as you could if you did it better.
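One quiet way to err, though, is declaring a winner before the numbers back it up. A minimal sketch of the standard two-proportion z-test (a textbook check, not Website Optimizer's internal math, and with made-up numbers) shows how a seemingly clear lead can fall short of significance:

```python
import math

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for comparing conversion rates.
    A standard statistics sketch, not Website Optimizer's internal math."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical data: original converts 50/1000, variation converts 70/1000.
# The variation looks 40% better, but...
z = z_score(50, 1000, 70, 1000)
print(round(z, 2))  # 1.88 -- just shy of 1.96, so not yet significant at 95%
```

Which is exactly why letting the tool grind through the traffic beats eyeballing an early lead.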
The maximum number of variations the Optimizer product will allow for a single landing page test? Leung said 10,000. I recently completed a test that involved 16 combinations, and am running one with 24 now. Had I run those particular tests with 10,000 combinations, they'd reach statistical significance around the year 2258. Coincidentally, that's about the time they'll have finally perfected Celine Dion.
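The back-of-envelope math behind that joke is simple. With assumed numbers of my own (the traffic figures below are invented for illustration), you can see why the combination count dominates how long a test takes:

```python
# Back-of-envelope: why 10,000 combinations is hopeless for most sites.
# Assumed numbers (mine, not the book's): 2,000 test-page visitors per day,
# and very roughly 1,000 visitors per combination before a winner can emerge.
visitors_per_day = 2000
visitors_needed_per_combo = 1000

def days_to_significance(combos):
    """Crude linear estimate of test duration in days."""
    return combos * visitors_needed_per_combo / visitors_per_day

for combos in (16, 24, 10000):
    print(combos, round(days_to_significance(combos)))
# 16 combos: ~8 days; 24: ~12 days; 10,000: ~5,000 days (roughly 14 years
# even with these generous numbers -- with thinner traffic, 2258 it is)
```

Test duration scales linearly with the number of combinations, which is why picking a handful of high-leverage elements beats testing everything at once.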
Labels: celine dion, conversion rates, ecommerce, google website optimizer, simon cowell, taguchi