Getting credit for an online conversion -- and giving due credit to all recent influences -- has been one of the hottest topics in digital marketing over the past couple of years. The urgency has grown as media costs -- especially click prices on paid search keywords -- have risen.
Marketers have been so hungry for better attribution of "keyword assists" (or simply, for the first click in the sequence toward purchase not to be overridden, whether that sequence spans hours or many months) that they've been willing to explore cumbersome customizations in a variety of analytics platforms, including Google Analytics.
But if you simply want to analyze the contribution of paid Google Search keywords that preceded the keyword leading directly to a sales conversion (aka "assists"), you'd rather see all that data rolled up conveniently within Google AdWords itself, in handy formats that make it easy to change your bidding patterns. In particular, earlier-stage keywords (typically, those before a last-click brand search) would now be revalued in your model; you'd bid them higher in cases where they made assists.
Earlier, when I defended the merits of the "last click" as an attribution method, I pointed to data from Marin Software showing that 74% of etail conversions have only one associated click -- even counting assists. Moreover, Marin's approach bucketed prior clicks categorically, arguing that if a prior click was very similar in intent or style to the last click, the extra information wouldn't be enough to change your bidding patterns anyway. That knocked the share of truly "assist-powered" conversions (ones you could actually attribute properly) down to 10% or less.
This is where Google's new reporting needs close scrutiny. In your individual case it could be quite valuable, but in the case studies Google currently has on hand, anywhere from 70-95% of conversions involve only a single click. If Marin's logic above is even close to sensible, that underscores the limits of assist data: there will be some value attributable to assist keywords in roughly 10% of conversions, give or take. That's actionable, but not earth-shattering. Of course, this will be most valuable to advertisers who have many prior influencer clicks hiding behind conversions currently attributed to a last click on the brand name.
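To make the arithmetic concrete, here is a minimal sketch of the click-path bucketing described above. The conversion paths are invented for illustration; a real analysis would read them from AdWords or analytics exports.

```python
from collections import Counter

# Hypothetical click logs: each conversion is the ordered list of paid-search
# clicks (keyword strings) that preceded it. All data here is invented.
conversion_paths = [
    ["brand name"],
    ["brand name"],
    ["milk alternative", "brand name"],
    ["almond milk calories", "milk alternative", "brand name"],
    ["brand name"],
]

# Bucket conversions by click-path length, Marin-style.
path_lengths = Counter(len(path) for path in conversion_paths)
single_click = path_lengths[1] / len(conversion_paths)
assisted = 1 - single_click

print(f"single-click conversions: {single_click:.0%}")
print(f"assisted conversions:     {assisted:.0%}")
```

In this toy data, 60% of conversions are single-click; only the remaining 40% could even theoretically benefit from assist-based rebidding, which is the point the Marin figures make at scale.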
To pump up the role of prior keywords, it might be fair to also point to assist impressions -- views on Google Search where the ad was shown but not clicked. But in those cases, was the ad really seen? Perhaps not, though there may be some value in knowing which search keywords got the searcher's research motor running. Perhaps they clicked on a competitor's ad. Google is offering impression assist data with this release as well, which is sure to delight trivia buffs, AdWords junkies, and Google's accountants alike.
Remember, we're not just talking about multiple searches all done in a single day, or in one session. Google is logging the time and date of every search by that user prior to a purchase/lead, and when a conversion happens, full funnel information is available as to the time lag between clicks and before the conversion.
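The time-lag reporting described above amounts to simple arithmetic on a timestamped event log. A sketch, with an invented log for one converting user:

```python
from datetime import datetime

# Hypothetical event log for one converting user: (timestamp, event).
# Timestamps and keywords are invented for illustration.
events = [
    (datetime(2011, 3, 1, 9, 15), "click: almond milk calories"),
    (datetime(2011, 3, 8, 20, 40), "click: milk alternative"),
    (datetime(2011, 3, 9, 20, 5), "click: brand name"),
    (datetime(2011, 3, 9, 20, 12), "conversion"),
]

first_click = events[0][0]
conversion = events[-1][0]
lag = conversion - first_click
print(f"time lag from first click to conversion: {lag.days} days")
```

Aggregated across many converters, this is the kind of "time lag" funnel view the new reports surface: how far ahead of purchase the research-stage clicks actually happen.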
Adding impression assists to the mix, we may see past search query information for up to 20-25% of conversions in some advertiser accounts. Again, while not stupendous, that is material to how you approach keyword value.
The ease of sorting by frequency of conversion by assist keyword helps you see the keywords in question; with the "keyword transition path" view, you can also see which last-click converters they preceded, to better understand the consumer mindset. The screen shot below is a canned Google example while the program is still in beta. In my briefing I saw a more typical and valuable case showing the frequency of paths like (fictitious examples to replace the ones I saw) "almond milk calories" > planethealthnut or "milk alternative" > planethealthnut. Whereas the brand might have received disproportionate credit for such conversions in the past, keywords like [milk alternative] or [almond milk calories] might now attract higher bids -- even more so if you experiment over time, allowing for more repetitions of your "research stage keywords" over many months.
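The transition-path report itself is essentially a frequency count over keyword sequences. A minimal sketch, reusing the fictitious keywords from above:

```python
from collections import Counter

# Hypothetical conversion paths (ordered keyword sequences); all invented.
paths = [
    ("almond milk calories", "planethealthnut"),
    ("milk alternative", "planethealthnut"),
    ("milk alternative", "planethealthnut"),
    ("planethealthnut",),
]

# Count "keyword transition path" strings, most frequent first.
path_counts = Counter(" > ".join(p) for p in paths)
for path, n in path_counts.most_common():
    print(f"{n}x  {path}")
```

Sorting by frequency like this is exactly what makes the report actionable: the research-stage keywords that most often precede the brand term float to the top, and those are the bid candidates.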
In my opinion, "paths" work fairly well as a metaphor here and are not too misleading because the "funnel" steps tend to be relatively coherent and causal in practice. They aren't necessarily so, however. The reason these reports can look sensible is because they're drawn from a narrow universe of high-intent keywords that advertisers are avidly bidding on. You're not going to see a paid search keyword funnel path like "drawbridge in mexico" > james mcbleckr phone 415 > nike > air jordans used > nike.com largely because Nike doesn't have most of the keywords in that path in their paid search account. Truly generating causal paths out of all the things someone does online prior to a conversion is likely to be incredibly messy, but that's a much longer story.
Long story short: life is indeed a lot simpler when viewed through the prism of an AdWords account. And today, advertisers are getting what they desperately seek: easy-to-use information about paid keyword search attribution so that the last click doesn't override all other attribution data.
"This is not about measurement, but about organizations and their capacity to manage this changing real world of reputation. They find it hard to do this, because they're stuck in an old-world broadcast model."
The speaker: Bryan Eisenberg. The setting: a past SES London conference, at an All-Star Analytics panel.
This year's SES London is, once again, particularly heavy on Analytics all-stars. As it should be. Search marketing is accountable, performance is king, and any digital marketer can improve their lot by doing a better job with the analytics toolkit.
But some people and some companies will confuse that importance with a blind faith that the measurement gurus can solve their larger strategy problems -- even the esoteric ones, like the question that came before the panel, which roughly speaking asked: "Hey super smart panelists, can you tell me some metrics that will help us decide whether our social media is WORKING?"
The inside-the-box answer is: measure this, measure that, and take the same approach to social media as you do to other channels. If you "scored," it's "working." Of course, you can measure a lot of these things -- just not with Omniture. Remember the old public relations world, where a "positive mention" in the Washington Post was something you could count? You don't need Jim Sterne or Steve Rubel to tell you that. Nor can these experts help your organization get good at all the things it needs to do to get there.
The out-of-the-box (Bryan's) answer to the social media measurement question is: sure, we'll get around to the measurement piece -- but if you're looking to "hit targets" with your social media spend, or to measure whether "it worked," maybe your organization has given you the wrong marching orders. We don't need more statisticians in this realm: we need more companies who are willing to fundamentally transform the ways in which they communicate.
I say it again (or actually, the panelists said it last year): "Can you put a dollar value on a conversation?"
Sure, you can. But let's start by getting your organization aligned with the idea of a conversation first. The only social-media-savvy company initiatives that typically hit short-term targets are those architected on a broadcast model -- and those defeat the key purposes of public relations: to change perceptions, to put specific content into the public's awareness of you, and to position your organization to carry on conversations that lead to business results, throughout the organization, over time, as a matter of course. Setting up your campaigns around thin measures of short-term success might actually spur more negative conversations than positive ones! Or just not get you anywhere fast. You can measure that you're not getting anywhere fast. Great.
Reputations are built over time. You'll never get there if your organization has a bias for shutting down the conversation channel early because "it isn't working," or if you're insulting members of your community by being too goal-directed in online conversations because you've incentivized your community manager with a bonus for warm leads or upsells.
Am I saying you can't or shouldn't measure PR 2.0? Of course not. But if the milieu is vastly different, then you may have a lot of trouble measuring the impact, and you should probably be measuring something very different. Something that might not even be readily available in today's Google Analytics platform. "Engagement" can't just be about spending 3:28 on a website, or deciding whether someone visited the "About Us" page... as important as those may be in the ordinary course of your marketing planning.
Or to be blunt: before going out to hunt for a next-generation tool to measure "how you're doing out there," you should be actually getting out there, and doing it. If you're not? There are a bunch of free tools like Yahoo Site Explorer, Backtweets, and Google Search itself that will tell you in a couple of nanoseconds if nobody's linking to you, and nobody's talking about you.
I'm most of the way through Seth Godin's great new book, Tribes.
It's a book about leadership, and the need for leadership at every level to achieve meaningful change in any setting.
It's also about distinguishing between a passionate group that can "go places" and achieve something together, as opposed to a loose aggregation of people who don't really care enough to matter.
Great changes can be achieved by leaders with faith in their vision, followed by a relatively small number of people who help make it happen. One of many examples Godin gives in the book is the animal shelter activist who was able to make San Francisco into a No Kill zone (typically, 70% or more of healthy animals in shelters are eventually destroyed). He went on to replicate the feat in upstate New York. In this case, many people in established agencies actively resisted the initiative, and many others in these locales simply didn't care. The ones who did care made the difference.
Thinking about this, I can see many parallels in business and online community, in things I see or do every day. Take Avinash Kaushik's common-sense exhortation that "aggregate" is never the name of your website visitor. If you get bogged down in aggregate statistics, you might be overwhelmed by just how many loosely-engaged, "valueless" clickthroughs come to your website every week. Yes, but why not make an exercise out of ignoring the 80% of people who aren't connecting with you and zero in on the 20% who do? Study their characteristics. Build and grow with them. (And it's easier than ever to study them. This week, using Google Analytics' custom segment features, I hand-built a segment called "engaged quintile" for the 20% of visitors on a client's site who stayed a long time and viewed many pages. By definition, guess what the "bounce rate" was for that segment? Yes: 0%! It's heartening and inspiring when you look at life through that lens.)
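The "engaged quintile" idea is easy to sketch outside of Google Analytics too. Here is a toy version with invented visit records, showing why the segment's bounce rate is 0% by construction:

```python
# Hypothetical visit records: (seconds on site, pages viewed). Invented data.
visits = [(3, 1), (10, 1), (45, 2), (120, 4), (300, 8),
          (15, 1), (600, 14), (90, 3), (5, 1), (420, 12)]

# Rank visits by engagement (time on site, then pages) and keep the top 20%.
ranked = sorted(visits, key=lambda v: (v[0], v[1]), reverse=True)
quintile = ranked[: max(1, len(ranked) // 5)]

# A single-page visit counts as a bounce; the engaged quintile, selected for
# long stays and many pageviews, contains none by definition.
bounces = sum(1 for seconds, pages in quintile if pages == 1)
print(f"bounce rate in engaged quintile: {bounces / len(quintile):.0%}")
```

The point isn't the metric itself -- it's that filtering to this segment first, then studying its characteristics, beats staring at the aggregate.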
I've also been noticing how some online communities take off, and others have much more trouble doing so. I think that's because we often wildly overestimate the importance of numbers, and underestimate the importance of engagement, as Godin says. Even in communities that are supposedly in a niche, we wind up with a mass, semi-engaged group that never takes anything and runs with it. One online community of 2,000 people (sounds like a niche, right?) spins its wheels and goes nowhere, while another of the same exact size takes off. The second one doesn't take off because 2,000 people are doing stuff. It takes off because the core leadership is making this place their main mission in life, and an outer core of maybe 20-30 enthusiasts make it their mission, too. Most everyone else joins in over time because of that passion displayed by 20-30 people. Momentum starts with leadership, and a tightness and clarity of mission that breeds passion.
The first group is doing what Godin calls sheepwalking. The second is on a real mission.
This is also why broad online content plays like Yelp take so much money to get off the ground (and why InsiderPages, Judy's Book, OurFaves, and maybe MojoPages are fighting an uphill battle, or have already run aground). They're very wide, so nothing is happening in a lot of parts of the network. Yelp managed to roll ahead despite their breadth because of the passions of Yelpers, and the relentless effort to build community offline as well as online.
But other, similar communities just have a much tougher time of it. It goes back to another Godin recommendation, from Unleashing the Ideavirus or perhaps somewhere else, to pick a hive you can dominate (read: a genuine, small, focused niche). Weak passion spread across a wide field provides little sense of mission.
TripAdvisor got lucky, and maybe fooled quite a few copycats into thinking they succeeded because they were in a "niche". Sure, travel was a "niche" in 2001. But it's actually far too wide for most aspiring community builders to attempt today.
If you talk to a venture capitalist about a user-generated content "idea," they might ask which major cities you plan to roll out in. I hope you expect to raise rounds B and C to the tune of $20 million or more in total, because on a shoestring budget ($1-2 million) you can dominate maybe one or two major markets, if you throw all of your passion and resources at them. Angie's List started in Columbus, OH, and expanded slowly, staying in business for 17 years before some Web 2.0 guys decided to pony up that big $20 million+ to help the business "go big." But the majority of the action is in the hives Angie's List was able to dominate: perhaps six or seven midsized markets, few large ones. Many markets remain weak, with few homeowner reviews, memberships, etc. And Angie's List is considered wildly successful relative to competitors.
What if you were actually starting on a shoestring as Angie Hicks did, with less than $1 million in capital? Probably you would be best off finding passionate groups of homeowners in midsized markets, and trying to get passionate leadership growing those markets. Try doing the same, even, in a half dozen smaller markets -- metro areas of half a million people. That's roughly how Angie did it, and that's how she grew passionate followings (tribes) of homeowners in the markets she does dominate.
Eventually, building a brand in that way, national dominance is indeed possible. But for the time being, forget New York... unless you really know what you're doing.
Amazingly, Angie's recent round of investors expected some of the proceeds to be used for European expansion. Insane. If you're the size of eBay or Amazon, that makes sense. It makes no sense for a company of mid-sized American tribes like Angie's List.
Meanwhile, I'm working with a niche (very small) ecommerce site that also focuses on community. It covers only one segment of travel, evaluating a particular type of vacation. Even though the site is open about the fact that it's trying to sell you a trip, visitors are extremely engaged. Passionate people are writing reviews, and the "engaged quintile" is spending 10-15 minutes on site, viewing 14-15 pages on average (regardless of geographic locale or other characteristics). (And did I mention that 0% bounce rate in that quintile?) I believe the reason for the surprising level of engagement with this startup is that they truly understand what a passionate niche is today, and they mean to provide true leadership to that niche.
A niche like "travel," or "bars and restaurants" sounds like a "niche" to the uninitiated, but unless you've got bottomless pots of money, remarkable and unflinching leadership, a core of 20-30 passionate co-leaders, and a really different hook, those "niches" today are just far too broad to maintain and grow a passionate tribe.
It's interesting that we were able to reach all of these conclusions without a single mention of technology, or "feature sets," isn't it? Makes ya think. At Yelp, Angie's List, Epicurious, Craigslist, and Flickr, the technology ranges from clunky to cutting-edge. But it's always in service of the tribe and its passions.
I just got out of one of the main panels of the day - the "All-Star Analytics Team" Orion Panel (get it, they're stars), which included Bryan Eisenberg and Jim Sterne and several other distinguished voices in the field of "Measuring Success."
Chatting with our multiloquous co-chair Kevin Ryan (and moderator of the session) afterwards, I sensed that he is concerned about the entertainment value of SES sessions, especially the big ones held in the Auditorium here at the Business Design Centre in Islington. Analytics... entertainment... tall order. Still, he gave it a mighty shot with an opening reference to a deal: he'll go back to America and convince Britney to stop speaking in a fake British accent when in shops, if London folks will stop the terrible fixation with 80's fashion on today's runways. Kevin, might I add, you might want to add a sweetener: talk that PBS-tote-bag carrying "Britophile" from the midwest out of crazy conversations with owners of local laundromats (viz. June 2007, Keswick) -- "Do you still serve teeeaaa? Isn't that graaand!!!" (Imagine this in a Chicago accent, at the top of her lungs.)
Anyway, off topic, as they say on the Internet.
The analytics all-star squad made a number of interesting points.
On accuracy: (In the sense of getting your ad network stats to match your analytics reports exactly, etc.) The consensus around the table (at least, in the easy chairs) was that "total accuracy can't happen." A range of user issues, including increasing privacy concerns (users deleting cookies, for example), drives difficulty in measurement. Beyond this, panelists (particularly Eisenberg) added later that the problem is a "lack of standards about what constitutes a session," or a click, or a visit.
On action: After all these years, corporations are stuck in the "collecting a lot of data, but not acting on it" mode. This theme came out repeatedly. Eisenberg made the point that if he were made to choose only one tool - Google Website Optimizer or Google Analytics - he'd choose Optimizer, because it means you're actively doing something to improve the user experience on the site. I share Bryan's enthusiasm for Optimizer.
On the whole: I have to say it doesn't look too good for third-party analytics providers. All panelists made polite noises that there is still a role for more costly analytics tools, but doing the obvious math -- Google and Microsoft offering a free product, typical users not coming close to using the full feature sets of advanced products, and anecdotal complaints about the lack of support from third-party vendors -- most weren't shying away from the conclusion that your "dollars" are best spent on a free product.
On social media measurement: This was a highly prosaic conversation, repeatedly returning to the nitty gritty of how to measure the traffic and sales impact of mentions in social media. In the midst of a discussion of tools that can measure the influence of key bloggers -- even to the extent of their highly-trafficked discussions creating more discussions -- a panelist interjected: "What's the dollar value of a discussion?" Perhaps agreeing with the implied answer, the panel went silent on this point.
However, another argument was made that companies need to start doing a better job of tracking their online reputations - figuring out whether what's being said about them online is positive or negative. Eisenberg wisely argued: "This is not about measurement but about organizations and their capacity to manage this changing realm of reputation. They find it hard to do this because they're stuck in an old-world broadcast model."
News just crossed my desk that Quantcast has secured $20 million in Series B financing. They're in the web audience measurement space that broadly includes leading players like Hitwise and comScore, but also upstarts like Alexa and Compete.com. Seems like a lot of analytics players want you to slap a little code on your website, and there isn't much of a revenue stream there yet, so isn't that a pretty big vote of confidence for yet another player in this space?
Well, they've got a lot of sites signed up to put that code on their site. So, in spite of the apparently speculative nature of business models like this, those putting in the cash must be doing so because they're betting on the value of rich user data. With Microsoft's new analytics platform aiming to add demographic info to the usual analytics mix, and players like Quantcast coming onstream, it looks like the business of deep web audience measurement is really heating up. That can only be good news for advertisers and publishers, but it will also raise ongoing privacy concerns, no doubt. For example, we have slapped the Quantcast code on this blog, but I'll bet you didn't know it. Well, now you do. A lot of bloggers like us don't spend a lot of time updating privacy policies and such, but at least that much, I've disclosed. For more, click "View...Page Source" on your browser.
If you're wondering about the development of the web analytics industry, and how we got so far, so fast, you might want to take a look at Jim Sterne's History of the E-Metrics Summits (reaching back to the proto-days, all the way to his traveling the world talking about the Internet in 1993). The explosion of this gathering into a multi-track, three-day event has really opened my eyes -- I'll be speaking here in October, but at least as exciting for me is looking forward to attending many sessions.
The trick will be, how to manage the rapid growth of the field while still maintaining that feeling of "a few pioneers in the lobby bar talking about the future". At this rate, we might have to kick a few people out of the lobby bar just to maintain that feeling.
Got this promo for an upcoming conference exhibit:
With a suite of search-related tools, and a dedicated group of industry solutions analysts, Nielsen//NetRatings can help you determine if your search strategy is working for you. Find out:
Your click-through rates on sponsored and organic search links
Your buyer conversion rate
Who's driving the most traffic to your Web site
Can you say: any basic logfile analyzer (free), or a more sophisticated product along those lines (ClickTracks)... to the tune of hundreds of vendors dating back to the mid-1990s? Surely you'll have to explain why yours is better? Different?