Wednesday, May 13, 2009
As widely reported in connection with its Searchology event, Google has introduced a feature that lets you restrict results to the "past 24 hours" (similar to the recency you get searching Twitter, or Google News). As the screen shot below shows, that same pane opens up a variety of other advanced search options. Once you're viewing recent results, you can switch the sort order from "by relevancy" to "most recent."
In Marissa Mayer's and Jack Menzel's post about the new features, they also allude to increased use of "rich snippets" to display, for example, review content graphically (similar to Yahoo's SearchMonkey initiative).
These developments open up a new type of search behavior, further solidifying the notion that different users will increasingly see different results pages, with content ordered differently. Although it isn't unfolding exactly as described long ago in these pages, the principle of users taking charge of the "algorithm" (or at least becoming more comfortable with displaying search results in a form that is more useful to them) is gradually taking hold.
An excerpt from Traffick's post in 2004:
So that's the future as this glassy-eyed pundit hopes to see it: a search engine that works like a sophisticated flight simulator, with a bunch of dials and instruments formerly available only to classified personnel. But to the extent that your settings become comfortable to you, it would be a flight simulator operated largely on autopilot. Now that would be one sweet ride!
Keep in mind, though, that at that time, Marissa Mayer flatly stated in a Q&A at SES that hardly any users wanted advanced features. What seems to have happened is that Google believes behavioral data about what people actually do, rather than chasing possibilities that don't show up in straightforward tests. As such, Google can be a reactive company despite its labs and vast resources. No one would dispute that the current focus on real-time search and rich snippets was moved up on Google's agenda by the popularity of Twitter and Yelp, and by flashes of innovation from competitors like (yes) Yahoo.
Reactive business, arguably, is smart business. Who, after all, could have predicted the Twitter phenomenon?
Takeaways for search marketers, aka business owners thinking about their search visibility:
1. Marissa is right: only a small percentage of users access advanced search features, and that will continue to be the case. 85% or more of searchers will keep typing words into toolbars, the search box, or the address bar and "playing around"; only a few will use the advanced features.
1A. As such, fresh content and the like matter, but don't obsess over the idea purely for the algorithm's sake. Not 100% of users are going to switch on the Twitterizer when they perform a search on Google.
2. That said, Google may begin to infer your preferences and turn those features on for you, or flash different types of content in oneboxes and so on. SERPs will look different for every user, and the notion of where your company ranks on certain keywords becomes ever more fluid. More sophisticated assessment of search referral analytics is a must to gain insight into your users' behavior (see the sketch after this list); then again, you'll also need to think about how to gain insight into the users who aren't finding you, and figure out why.
3. Diverse content production and community-facing PR strategies are needed to be seen by a variety of searchers. Algorithm-literal SEO strategies are dying; comprehensive SEO strategies are on the rise. And despite the awkwardly named Orion Panel we're putting together for SES Toronto in June (Is PageRank Broken? The Future of Search), PageRank is not broken per se. It's just becoming increasingly irrelevant. (The name of the panel is my fault.)
4. The more things change... the more they stay the same: the ads always seem to stay prominent, don't they? Google doesn't seem to be sweating the revenue impact of making changes to how search functions.
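On point 2 above, here is a minimal sketch, not from the original posts, of what digging into search referral data can look like. It assumes a hypothetical list of referring URLs and the "q" parameter Google passed in referrer strings at the time; the helper name and sample URLs are illustrative only, and any real analytics package does this work for you.

from urllib.parse import urlparse, parse_qs
from collections import Counter

def extract_google_query(referrer_url):
    """Return the search query from a Google referral URL, or None.

    Assumes the 2009-era referrer format, where the query travels in
    the 'q' parameter of the referring search URL.
    """
    parsed = urlparse(referrer_url)
    if "google." not in parsed.netloc:
        return None
    terms = parse_qs(parsed.query).get("q")
    return terms[0] if terms else None

# Hypothetical referrer log entries, for illustration only.
referrers = [
    "http://www.google.com/search?q=airline+tickets&hl=en",
    "http://www.google.ca/search?q=cheap+flights+toronto",
    "http://www.google.com/search?q=airline+tickets&start=10",
]

# Tally which queries actually sent visitors, rather than fixating on
# where you "rank" for a handful of head terms.
query_counts = Counter(q for q in map(extract_google_query, referrers) if q)
print(query_counts.most_common())

The point of the exercise: the interesting signal is in the long tail of queries that actually refer visitors, not in a fixed rank check on a few head terms.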
Not a takeaway, but fascinating anyway: it's even more interesting today to ponder who is going to acquire Twitter (Microsoft or Google), or whether they can really stick it out alone. Does Google not want Twitter? Does Twitter not want Google? Does the world not want Twitter inside Google? Are they haggling on price or not talking? Are you as curious as I am?
Labels: algorithm, google
Thursday, March 05, 2009
Blogging hiatuses were made to be violated.
There's a lot of noise in the search space that I don't pass along, but this news speaks to real substance in ranking philosophy.
Google has made an algorithmic tweak -- Matt Cutts dubs it "Vince's change". The upshot, as I interpret the back-and-forth between experts like Aaron Wall and experts at Google, is that Google does find it difficult to accurately assign trust and authority across this vast digital universe. (For some debate of that issue, check out the panel we've just posted for the upcoming SES Toronto show -- Is PageRank Broken? The Future of Search -- on June 9, 2009.)
The tweak, we assume, bumps trusted sites up slightly in many hotly contested ranking showdowns. So as an example: VW.com is going to get a bit more traffic next week and next month in the aggregate, because many of its internal pages are going to outrank pages from lesser-known, less trusted sites more often than before.
Matt says that Google doesn't think brand when it thinks about quality and authority ("if we did, you'd see Mitsubishi Eclipse ranking #1 for [eclipse]"), but this is disingenuous. Indirectly, in that VW example, they are thinking brand when they take a shortcut that treats the VW.com domain as "known information" and puts a higher "track record required" threshold on pages from sites that aren't as known and trusted.
I believe this trend has already been in force, and it's good that Google is making it only a *slight* change, because -- particularly on sites loaded with user-generated content -- there is the potential for less useful and even spam pages to get ranked too highly by opportunists exploiting "trusted domains."
Whether this small tweak consciously focuses on trusting brands, or whether that is just the end result, is inconsequential. What stands out is that this -- as ever -- is essentially a workaround: a response to the problem that Google cannot possibly have enough information to rank pages correctly in all cases. It's a pragmatic way of making sure that algorithmic judgments are slightly more correct (or more satisfying to searchers), more often.
Danny makes an excellent addition to this story by recommending rankpulse.com as a way of checking whether key brands did see major ranking improvements on core terms like "airline tickets". If a brand "comes out of nowhere" to rank well, that's not quite as minor a change as Google suggests.
Among other things, this may have practical search referral implications for naming conventions and URLs at large companies, for microsite creation, and for multi-brand strategies. It's a perennial question: should we consolidate on fewer domains, or create more focused sites and interlink them? The debate just heated up.
Labels: algorithm, google search, pagerank