Sunday, October 28, 2007
Every so often, I sort out and clean up, er, "archived" notes, schwag, and mementoes from past conferences and trips.
Doing this today, I came across a pack of mints from "AOL Search." Beside that, a lanyard from a company called Claria. Who are these people? Why would someone do business with them? It doesn't immediately spring to mind.
Yet another reminder that what seems important in the Internet economy at any given moment may not merit the breathless anticipation it attracts at the time.
Labels: amnesiobilia, schwag-stalgia
Thursday, October 25, 2007
There's an unwritten rule in search marketing: when a Google update knocks the stuffing out of a bunch of sites that were unfairly ranking too high, you're not supposed to gloat if you came out unscathed. But for the Grace of GOOG, there go I, etc.
Even when the infamous Florida update had webmasters scurrying in circles, and we noticed a sharp uptick in interest in paid search opportunities, I only gloated mildly.
That's why I almost considered letting this latest assault by Google, aimed at link buying, link farming, and business models that amount to premeditated interlinking schemes by their very nature, slide by without comment. But the lessons of this latest episode cannot be emphasized enough. It's time to stop ignoring these things, or treating them as episodic examples of Google's high-horse madness, and to begin realizing that Google takes aim at rank improvement "schemes" in its role as consumer advocate, attempting to reflect legitimate real-world authority and usefulness, just as it does with its increasingly tough rules on the paid search side.
Sounding every bit like a woman with a clear conscience, Jill Whalen gloats a bit in her recent commentary about these developments, and resolves to get out the popcorn to watch things unfold.
Put plainly, the majority of the search marketing world responds so ineffectually to such issues because of tunnel vision. At the most general level of professionalism, many in the "agency world" will advocate "integrated marketing," "brand management," and other long-term views of marketing strategy. That is the furthest thing from the minds of many SEO hacks.
And granted, that's too high-concept and not appropriate to the work many search marketers do. I'd propose, however, that to be effective, the hyper-focus on the details of ranking tactics needs to be pulled back into a mid-level focus. More on that later in this post.
The fallout of Google's latest rejiggering has been fairly severe, if you go by PageRank. On one of the PageRank checkers I use, you can see the multiple datacenters, so you see the "old" PageRanks and the "new." Traffick.com I had nary a worry about, because of the long-term, stable way we have gathered external mentions since launching in 1999. We're stable at 7.
A number of the blog networks have been hard hit, with sites like AutoBlog losing one or two notches in PageRank. I would have to assume that this would take a direct hit out of the pockets of blog network owners such as Nick Denton. Organic traffic from search engines is a free lunch to many private entrepreneurs like this. Quality content deserves search visibility, of course. The question is really how much. There is only so much search traffic in a given month, so every Google reassessment of ranking and weighting methods amounts to a zero-sum game of "who gets the available free referrals."
One well known search industry site, Search Engine Guide, clocks in with a drop from 6 to 4, at least if you believe this PR checker. In the old days, it often came in with an 8, which is very high. I'm not saying the current drop is justified - Google decides that. But it is probably the case that the 8 was too high.
And yes, I realize that PageRank is only a rough guide to Google's opinion of your site's authority, after weeding out phony forms of authority as best they can. Some of those hard hit claim they see "no drop in traffic" and speculate that "Google is putting on a show." In denial to the end? Could traffic drops be coming soon?
Dropping two notches in PageRank may not sound like much, but it could constitute a severe penalty because the scale is logarithmic. The difference between 4 and 6 is really significant, as any experienced site owner will tell you.
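To see why a logarithmic scale makes a two-notch drop so painful, here's a toy sketch. Google has never published the base of the toolbar PageRank scale, so the base used below is purely an illustrative assumption, not a known figure:

```python
# Toy illustration of a logarithmic toolbar-PageRank scale.
# ASSUMPTION: each toolbar notch multiplies "raw" authority by some
# base B. Google has never disclosed B; 6 here is a made-up example.
BASE = 6

def relative_link_authority(toolbar_pr, base=BASE):
    """Rough raw authority implied by a toolbar PageRank value."""
    return base ** toolbar_pr

# Under this assumption, dropping from PR 6 to PR 4 means losing
# base**2 of your implied authority, not "two points."
drop = relative_link_authority(6) / relative_link_authority(4)
print(f"PR 6 implies about {drop:.0f}x the raw authority of PR 4")
```

Whatever the real base, the shape of the math is the same: each notch down is a multiplicative loss, which is why experienced site owners treat a 6-to-4 slide as far worse than it sounds.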
But the point here is not to single any one actor out. It's to point out that search marketing as a profession became so popular so fast that many actors assumed that popularity or prevalence equated to real marketing expertise. The momentum of the industry translated into an "everything's fine here" sensibility, and an insular view that more experienced marketers had nothing to teach us. We're taking over! (With bought links, keyword research, and meta tags, you're taking over?)
What it looks like, to draw upon some of the beautiful photos I picked out to dress up this post, is that search marketers were so sure of themselves that they turned into the fuzzy, wool-bearing little creatures below. Matt Cutts says bought links are bad? Selling PageRank is bad? Baaaa humbug! 1,000 of my friends agree with me, so Matt must be wrong, and possibly evil.
The thing about it is, if you're in high school and well compensated (if not well dressed), you and your peers don't look like sheep to one another. It's such a cool gig to have, you actually look like Heathers...
No, I did not mean those heathers, and plus there are still sheep grazing nearby... can someone please...
Right, that's better. Heathers.
These Heathers were so compellingly popular, it seemed sometimes like you'd want to do anything to be like them. Even if you were the aloof, independent, and lovely Winona Ryder. Stop, Winona! You're better off without them!
So, I'm here to make the case for a mid-level focus, rather than the close-up view in which a narrow set of tactics in a toolkit supposedly gives you a clear guide to improving your company's long-term reach and connection with prospects. Like Goldilocks (or Winona, before she began shoplifting), I'll suggest that for many search-focused professionals, the idea of "integrated marketing" is too high level to be practical, whereas pure old-school SEO tactics always get you into the same mess eventually. You're not VP Marketing at a Fortune 500, or any other type of lifer; nor are you going to get far in life if your only skill is tweaking an H1 tag. In between, there needs to be an integrated understanding of what makes customers and markets tick today, and how to put that together with a search visibility strategy. That entails a lot of detail work, but deciding on the appropriate types of campaign work will be more effective if it's done within a structured framework that recognizes Google, and other sites for visibility online, as the complex consumer advocates they are. Call it "integrated online attention-getting," if you will.
So whereas for a couple of years on the Page Zero site we joked that we don't do SEO at all, we realize that what we've developed for some clients (at the moment we call it SEO 2.0) is something that, yes, we actually do, and will continue to do. A long-term focus on "integrated online attention-getting" means a sustained strategic implementation, with particular action items leading to detail work performed by the appropriate party (sometimes us). When we launch the new version of our consulting site in a couple of weeks, that's what we'll make clear to current and prospective clients.
If that's gloating, well... unwritten rules were made to be broken. The way we see it, our clients are not in business to sit around wondering when they'll be "Google-slapped."
Update: Nick Denton, publisher of Gawker Media, responded to this piece to note that "the Google demotion of link farms has only hit offending blog networks. Engadget and other Weblogs Inc blogs have taken a hit of a couple of points of PageRank. But Gawker sites, which are much more sparing in their linking, are unaffected by the latest change."
Labels: online pr, pagerank, pr, sem, seo
Tuesday, October 23, 2007
Low quality scores and other keyword delivery issues can be, to paraphrase Jackie Chiles, "exasperatin', disingenuous, disrespectful, and borderin' on mischievious!".
As expected, Google has announced increased transparency on what's causing low quality scores in the form of additional keyword information. Don't get too excited. The inner workings remain safely guarded.
Many of the examples of poor quality scores come with the diagnosis "this keyword isn't highly relevant." That's not too compelling, especially when a multi-word query involving the word "cell phone" is rated "OK" but the identical one using "PDA" is seen as "Poor." (The full phrases I'm referring to are equally relevant.) It's like a science experiment in progress based on limited data.
However, I think this probably can be a good starting point, in the sense that you might be able to rule out or rule in landing page or website issues as the source of your problem.
That is, if the messages you're seeing are 100% accurate. Can I get a tool to diagnose the diagnosis? I have trouble believing certain keywords aren't relevant when they describe exactly what the service in question is, and match up well with the ad text. It's all a little too mysterious to be truly helpful, especially when the advice given alongside the diagnosis is "delete this keyword" (yes, that's really what they say).
In another example, the keyword diagnosis isn't functional because the form of geographic targeting I'm using for that account (a radius around a large city metro area) isn't supported. So I'm left to wonder why "crêpes" is "OK" but "crepes" is poor. Is Google a spelling snob!?
Increased transparency takes courage, and does encourage gripes like the one you're reading. In that sense, I applaud Google for rolling this out. Ship early and often is still a good policy in the software world... in spite of the uncharitable responses it sometimes elicits.
Labels: quality score
Wednesday, October 17, 2007
OK, maybe I'm not about to call for its "death" as I did with the keyword meta tag, but Duane Forrester's fine piece about Big SEO and automation just triggered a couple of morbid thoughts about our old friend "title tag".
Let's say you have a million pages. So you say you need SEO, eh? That sounds like it's going to be a mighty big job. Forrester correctly points out there isn't very much you can do manually. Although I would counter that you can work on between 500 and 2,000 pages to cover some pretty impressive ground, search-frequency-wise, if you're so inclined.
So what is on-page SEO, exactly? Is it adding appropriate titles, heading tags and headings, meta keyword tags, and description tags to all pages, thereby increasing their rank potential?
Let's work through the logic here. You're going to make sure certain "core" keywords appear multiple times in the document, "amplifying" their weight. But doesn't that just take us back to keyword density?
If the automation process involves "a way to automate the insertion of meta tag based on the actual content of a given page," as Forrester writes, then let's be clear on what's really happening: you're taking what's already on the page, and copying and pasting it into another page element.
If you do something similar for titles, the logical principle is no different.
Let's be honest. These various page elements and approaches to ranking content were mostly invented for a manual world. Logically speaking, if all you're doing to try to rank better (on a million pages at once) is to replicate some existing words within other elements of the page, you're adding only slight value, and zero additional meaning. It might be a good idea, but it's hardly life-changing for the user.
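To make the circularity concrete, here's a minimal sketch of the sort of automation Forrester describes: deriving a title and meta description from the page body itself. The function names and heuristics are mine, purely for illustration; the point is that every word the automation "adds" was already on the page:

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "for", "is", "on", "our"}

def auto_title_and_meta(body_text, max_keywords=5):
    """Generate a title and meta description from the page body.
    Note: nothing new is created; existing words are just replicated
    into other page elements."""
    words = [w for w in re.findall(r"[a-z]+", body_text.lower())
             if w not in STOPWORDS]
    top = [w for w, _ in Counter(words).most_common(max_keywords)]
    title = " ".join(top).title()
    meta_description = body_text.strip()[:155]  # first ~155 chars, verbatim
    return title, meta_description

title, meta = auto_title_and_meta(
    "Discount hot tubs and hot tub covers. Our hot tubs ship free.")
```

Run that over a million pages and you've "amplified" keyword weight, but contributed zero additional meaning for the user, which is exactly the issue.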
There is still some minimal value left. Well-labeled pages are easier to find and respond to, in that page titles appear in SERPs and in the browser.
You'll need to automate correctly to put keywords and meaning-related cues in the URL structure of the site, as well... but largely because this seems to matter to search engines.
But if that's all we've got, it's not clear that such pages should rank higher than their equals with less zealous automated efforts at keyword densification/replication on any given search query. In the case of scraper sites that are super good at this kind of automation, of course their well-constructed pages should not rank at all.
It's little wonder that based on such characterizations of SEO, many businesses view it as a purely technical function. It is not.
It's certainly a sad thing that a good CMS deployment (for example) can improve your overall level of search referrals, as compared with a bad one. Sad or not, it's a practical reality that companies need to study, at least until search engines get even smarter.
Still, there are plenty of other elements of information architecture that tend to get lost in such discussions. Should we use breadcrumb navigation or not? What's the right number of links in the nav bar to aid navigation? What approach should we take to site search? Should we add interactive capability to the site?
Overdo your efforts to please search engines alone, and you might not allocate the time and budget you need to please users. And happy users are the ones that spread the word so well, giving you the off-page love that is a prerequisite to high reputation and thus standing in the search engines.
Labels: cms, seo, title tags
Greg Sterling (via Matthew Ingram) provides full coverage of a brouhaha about a cafe in Oakland that stipulated "no Yelpers"! I agree with Greg, of course, that online reviews are here to stay. Yet some business owners seem unable to deal with it.
In the field of online reviews I'm deeply involved with -- home renovations -- I don't know if any of you saw the 20/20 episode about the bad contractor in Maryland. He even went ballistic about the private online reviews shared among the membership of Angie's List. This contractor, who had defrauded a bunch of homeowners 16 years previously before being banned from doing business in a county, switched counties and began racking up complaints again. When customers began banding together and expressing their opinion, he became threatening, figuring that his bluster was going to turn out to be bigger than the whole phenomenon of consumer reviews. All that did was land him on national television, painted into a corner.
For businesses that want it all to go one way, there is hope. OurFaves.com, a Toronto-based Yelp-ish creation, encourages users to stick to the positive. At first I was sceptical. But you know what? It works. Most of what I want to post about local businesses is in fact positive, and the ones that go the extra mile, be it the drycleaner who undoes the problems the previous drycleaner foisted on me; be it the great unsung Persian restaurant at Richmond and Spadina, or 1,000 other great local spots... they need all the help they can get from customer advocates. OurFaves.com keeps it light and positive... and I admit, it is growing on me.
Labels: homestars, ourfaves, reviews, ugc, yelp
Tuesday, October 16, 2007
It'll take a bit of doing, but if you read this post by Yegge and this review by me of Calacanis talking to Danny Sullivan, you'll get it.
(What you should be looking for -- the fact that Calacanis makes "choo choo noises" when thinking.)
Labels: happy fun slander
Saturday, October 13, 2007
Some weekend (or Monday morning) reading: a fresh interview (the latest in our Innovators series) with John Marshall, formerly CEO of ClickTracks and now starting a new venture called Market Motive.
Labels: innovators, john marshall
Wednesday, October 10, 2007
Seen in the online comments accompanying a big media news story:
"John Tory destroyed himself. There is no one else to blame, so stop using the liberal media as your escape goat."
It's tough to believe in the wisdom of crowds sometimes.
Monday, October 08, 2007
Mark Simon considers the probability of a "doomsday scenario" if Google badly misses a quarterly earnings estimate, which would hammer their stock price, casting a pall over the entire online advertising sector.
This is unlikely to happen, in my opinion, because Google are hedgehogs. (*) They continue to carefully mine their core field of advertising, and enjoy considerable diversification in that revenue stream due to the nature of the auction for keywords, etc. In that context, many of their other "failed" ventures are red herrings - not all that costly, and not threatening management's solid recognition about where their bread is really buttered.
Even if they do miss a quarter or two, it won't be by that much, for this reason. In fact, like many, I do (finally) expect to see some inevitable softness in Google revenues based on current economic events affecting the United States.
What if the stock price gets cut in half? Which seems unlikely, but let's say it does. That's hardly catastrophic in terms of overall valuation. That would "reduce" the market capitalization to $98 billion. The forward P/E would sink below 18, making it almost a value stock.
This particular doomsday scenario, then, is unlikely to unfold. In 1999, the marketplace for online advertising looked very different.
More unsettling could be that Google management decides to sink a huge amount of money into the infrastructure required to bet the company on something else. That kind of adventurism would make Google's financials look bad, but that would largely only affect shareholders in a game of musical chairs. This wouldn't carry over to other players, and might in fact create investment in really cool things that improve the overall standing of online companies by comparison with more traditional players.
(*) - reference: Jim Collins, Good to Great
Unfortunately for folks looking for loopholes to trick AdsBot, yep, it's the same thing on the paid side. They can only pay a few hundred Googlers to sit around thinking about your "intent," not a few thousand, so when it comes to assessing landing page quality, much of this has to be done by automated means. But despite limited resources, there is no guarantee AdsBot or an editor won't catch your intent. An editor can look at your account, its history, how it was set up, and who knows what else. They can look around at your site, your business history... and the cut of your jib. The ad program is a smaller universe. Google staff actually have time to pay attention to all that stuff, across that known universe.
Sometimes I'll look at the combination of an AdWords ad and the offer page or landing page, and think to myself: if I *didn't* work for this client or company, I'd think it looked pretty spammy, and the intrusiveness of the ad formats on the landing page (or some other factor) just *might* be getting us flagged by AdsBot.
It is much, much easier for Google to "watch" its advertisers than it is on the unpaid side. So again, while there is a huge need for automation and Google couldn't do its job without it, the real reason behind low quality scores that derive from poor site quality or landing page quality... is largely editorial.
Gray areas abound. Sleeping dogs sometimes lie. Sometimes you wake them, and they bark. Stay tuned.
Labels: paid search
Eric Enge's extensive discussion of hidden text and its dangers illustrates a key issue for anyone working on a search marketing strategy. (Hat tip seroundtable.com)
As much as you "might" escape sanctions from the Google indexing gods if you construct pages that "just look like" other, more spammy, pages, the reality is, if you go into the forest dressed up like a duck... it may not matter if you even quack like one, your danger rating goes up.
Basically: my personal philosophy on the SEO side is to dial back on excessive "on-page tactics" intended to give rankings that "extra boost." There are other ways to rank.
A particular SEO bugaboo for me is that "text way below the fold" technique. Fine if it's somewhat below the fold and it's navigational in nature. But not fine if it just looks cheesy and spammy. What "respectable" site would do that?
Search marketing is marketing first, and that involves a consistent, professional process for communicating with readers and customers. A comprehensive, analytical, patient approach *does* work. Creating more useful content *does* work. And above all, off-page stuff does the heavy lifting of enhancing your reputation and standing in the engines.
So back to why you'd use hidden text in the first place. Oh, I'm sure we can dream up all kinds of "legitimate" scenarios. Not pretending I play in this particular sandbox, the "illegitimate" scenarios involve low quality content being "thrown at" the search index while showing users something else. Testing gibberish pages users actually see against gibberish hidden from users, then gathering data on which of the two ranks in spite of Google's vigilance, and which leads to conversions to sales of porn or hot tubs... that would be the daily existence of the professional index spammer and the amateur spammer-dabbler. If you're a real company, isn't it nice not to have to worry about those kinds of calculations? So if you are real, don't hire the amateur index spammer/dabbler! A little knowledge residing in the brain of the business owner's nephew who built the site and knows "a lot about SEO" can be a dangerous thing.
The bottom line? Quibbling about whether Google does or does not allow some specific sub-technique is not the way to go. It's not like they can give you "license" to work some "loophole". They use automated methods on both the paid and unpaid sides to flag violations. This in turn may trigger some human review, which can and will exercise editorial judgment as to intent. And as we've seen of late on the paid side, Google even makes official comments on "business models to avoid."
Google has been talking about intent for years. The spamsters don't want to hear it.
The webmaster forums may be loaded with folks trying to figure out how best to spam Google with hidden text tricks Google doesn't mind, or can't catch. But this misses the entire point. A human rater can look at your site and decide, based on criteria, that it falls into some category that is low quality in users' eyes, such as "thin affiliate." This can lead to low rankings, penalties, and banning. Even this system is highly imperfect because it still gives too much advantage to serial spammers and sophisticated cheaters. Something new is needed to rebalance things in favor of quality sites, even more so than today.
Creators of quality content will increasingly be rewarded through new ranking methods, in my opinion.
Site developers commenting on several legitimate uses of hidden text techniques (see the comments in Eric's seomoz post) just serve to emphasize the point that certain sites might fall into an *automated* net that flags certain deceptive techniques, but they do not deserve to. That just increases the load of human judgment on Google, or the importance of other (off-page) factors indicating quality and relevancy. Spammers *will* find ways of hiding text that Google simply does not want to work too hard to find algorithmically, as it would create too many false positives in any case.
Related: Matt Cutts on "The role of humans in Google search"
Funnily enough, then, after looking at it from all angles, the presence or absence of any but the most one-sidedly spammy hidden text techniques would appear to be a very weak signal of quality; one that Google cannot realistically weight very heavily for ranking purposes.
Labels: hidden text, seo
Sunday, October 07, 2007
Bless you, Steve Rubel. Interesting point that the big guys probably still win even if they can't build a competing social network. I get Facebook notifications through GMail, for example.
Labels: portals, social networking
Friday, October 05, 2007
Gone are the days when my mental picture of the community "out there" was shaped by the hand-picked Letters to the Editor in the print newspaper.
Ever since newspapers like the Globe and Mail have admonished everyone to "join the conversation," I've noticed that the immediacy of online has led to a proliferation of sophomoric, hurtful, crude, and otherwise unhelpful contributions. And comment boards degenerate into flame wars. It's enough to remind one that our politicians are relatively polite and relatively intelligent when they yell at one another.
It looks like the media companies are encouraging this type of conversation, on one hand, and then pulling back on it when it gets out of hand.
A story on alleged sexual assault charges laid at a middle school started out with a long conversation thread on the National Post website. Subsequent syndicated versions of the story online seem to show no obvious conversation thread. The Globe and Mail, which features a "join the conversation" sidebar on dry stories like "Jobless Rate Hits 33-Year Low," is careful to avoid conversations about sensitive topics like this, apparently -- though positively giddy to encourage "debate" about whether thigh-high boots are appropriate office wear. But bending over backwards to avoid the conversation doesn't end the embarrassment for the Globe: the AdSense ads appearing next to the sexual assault story seem irrelevant and/or in poor taste.
Are the media companies picking and choosing when it's appropriate to converse?
Labels: online journalism
Wednesday, October 03, 2007
Followup to my recent article on Google AdWords' website quality policies. Although the majority of rank and file advertisers I chatted with favor Google's stances against, for example, "arbitrage sites that are designed for the sole purpose of showing ads," not all go along with the Google take on things.
One respondent, CEO of a midsized technology company, missed my deadline but took the trouble to call and leave a detailed voice message. His position explores the case for being "pro-arbitrage," on several counts:
- It's worrisome that our rights as advertisers to try different business models can shrink not because Google cares about users, but because Google is acting anti-competitively. What can be an official curb on "sites that are designed for the sole purpose of showing ads" today could in future bleed into banning "sites that show ads that Google just doesn't like, or competes with."
- So-called arbitrage sites are at the leading edge of user testing. Often, they convert better to a sale than so-called high quality sites, albeit requiring an extra click. In essence, arbitrage sites are the purest form of exploiting inefficiencies in the worth of media exposure. Remove this from the equation, and only less efficient forms of exploitation remain in the mix. This potentially weakens the rest of the herd, which is now being propped up by enforcement rather than economic superiority.
- Google directly benefits from the ads showing on things like parked domains, many of which show nothing but ad links. So Google listens to user complaints about being directed to such sites from a paid ad, but then again, they aren't above directly earning revenue from such sites through partnerships (DomainSense). It's a question of mixed messages, and also a holier-than-thou message in the sense that companies other than Google, who like Google profit from sites that pretty much just show ad links, will be hurt by the negative rhetoric surrounding "arbitrage" while Google, in fact, continues to earn revenue from stumble-in traffic to sites that look just like the ones they are supposedly protecting us from.
The solution proposed by the observer taking the pro-arbitrage position? I'm not sure. It seems more like a general reminder that Google's positions can be one-sided, and that they only selectively protect users from negative experiences.
Labels: ad quality, click arbitrage
Tuesday, October 02, 2007
Google's North American president of advertising, Tim Armstrong, is quoted in a Globe and Mail article today as seeing a proliferation of ad agency jobs as a result of Google's dominance of the online ad market. Armstrong is in town to "meet with his Canadian team," according to the article.
Elsewhere in the article, Armstrong is quoted or paraphrased saying that "Google is working with retailers such as Home Depot Canada to try to boost its advertising presence by showing it ways that it can pitch more of its products for longer periods of time on the Internet."
No word from Armstrong on any agencies he might also see as qualified to deliver that message to the client.
Regardless, Armstrong agrees with Hotchkiss (and myself) that in Canada, "we're underinvested compared to what the opportunity is."
Meta-question: how does a private talk, an internal speech by a VP or President of a division, to a regional sales team, get full-length article treatment from a retailing reporter in the Globe? It's not a public speech, so someone got invited to be a fly on the wall, and to make it public. That's Microsoft territory. But as we all know... Google is indeed the new Microsoft.
Edit - Oct. 5: I'm told the interviews with the Globe and Mail were arranged outside the Google offices with Armstrong, following Google's internal meetings. Still, the interviews were talking about the private meetings (selectively).
Labels: google canada, tim armstrong
Making recent headlines has been a heated debate between a click fraud auditing vendor, Tom Cuthbert of Click Forensics, and Shuman Ghosemajumder, head of click quality at Google. Who's giving us the straight goods? Cuthbert, who has recently spearheaded an industry group called the Click Quality Council, claims click fraud is growing, and continues to stick by numbers like 10% and 15%. Ghosemajumder has repeatedly presented much lower numbers.
Both parties get into a bit of a side debate about the unique gclid modifier attached to every Google paid click. But to an outside observer, this does little to illuminate the patterns going on inside individual accounts, especially around Google's claim that they are proactively refunding virtually all invalid clicks.
You almost feel like you need a fourth-party auditor to help you audit the independent auditors.
The data I am seeing show that Google may (still) be closer to telling the truth than Cuthbert is. Google does proactively refund clicks; clicks in many accounts appear quite normal a high percentage of the time; there are some gray areas.
Interestingly, the data also show there are ways of managing a campaign (professionally vs. haphazardly), and choices you can make about which parameters and techniques to use, that will run you into more or less trouble.
And even when it comes to well-managed, normally running accounts, things can go awry. It's important to have some kind of monitoring tool for this - although you'd get a good cut at it with some of the other third-party tools (the ones oriented to campaign management rather than click auditing). A hard-working analyst using good web analytics could gain strong indications from a granular assessment of "bounce rates" or time spent on site by keyword or ad group, but that methodology is weak: it really only indicates that traffic is potentially untargeted. Better than "bounce rates," as counseled by John Marshall, formerly of ClickTracks, is to look at "very short visits" (not the same as "bounce rate").
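A minimal sketch of the "very short visits" idea follows. The 5-second threshold and the record format are my assumptions for illustration, not anything prescribed by Marshall or by any particular analytics package:

```python
# Sketch of a "very short visits" metric per keyword.
# ASSUMPTION: any visit under 5 seconds counts as "very short";
# both the threshold and the input format are illustrative.
VERY_SHORT_SECONDS = 5

def very_short_visit_rate(visits):
    """visits: list of dicts with 'keyword' and 'duration_sec' keys.
    Returns {keyword: share of that keyword's visits below the threshold}."""
    totals, short = {}, {}
    for v in visits:
        kw = v["keyword"]
        totals[kw] = totals.get(kw, 0) + 1
        if v["duration_sec"] < VERY_SHORT_SECONDS:
            short[kw] = short.get(kw, 0) + 1
    return {kw: short.get(kw, 0) / n for kw, n in totals.items()}

sample = [
    {"keyword": "hot tubs", "duration_sec": 2},
    {"keyword": "hot tubs", "duration_sec": 1},
    {"keyword": "hot tubs", "duration_sec": 90},
    {"keyword": "spa covers", "duration_sec": 45},
]
rates = very_short_visit_rate(sample)
```

The difference from bounce rate is that a one-page visit lasting two minutes (someone who read the page and left satisfied) doesn't inflate the number; only near-instant abandons do, which is a cleaner hint of untargeted or suspicious traffic.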
To help sort through the claims, I have been involved in beta testing with PPC Assurance, a product being developed by Richard Zwicky's team at Enquisite, a startup in the search analytics space.
In their interface, various screen types are available that illustrate your campaign patterns intuitively using a color coding scheme.
- First, there is "green." These are clicks you paid for that fall within the terms of service you agreed to. The vast majority of clicks in any account fall into this category. There may be some irritating gamesmanship (competitors manually clicking, etc.) and some poor-quality traffic inside that green area, to be sure, but Google says they also try to filter for that stuff. Ultimately the ROI on your campaign will tell you whether "green" is putting enough green in your trousers.
- "Red" is bad. These are clicks you paid for that fell outside the terms of service you agreed to with Google. For some reason, even on this simple definition, many accounts have between 1% and 10% of this type of traffic. If this gets up close to 4-5%, you may need to seek a refund. More importantly, you can use a tool like PPC Assurance to see when spikes occurred, on which keywords, and from which geographic locales or problem IPs. The information is so well packaged in the interface already, says Zwicky, that soon you'll be able to send a refund request with the associated data in a single click.
- "Yellow" is central to this whole debate. These are clicks that fell outside the terms of service you agreed to, but that Google (Yahoo support is coming soon in PPC Assurance, Zwicky assures us) did not charge you for. The first key to the yellow area is that you're getting fairly accurate information that dovetails with Google's own claims: they really are proactively refunding a lot of questionable clicks. You can also gain insight into click fraud patterns generally, without much effort, by looking at the "yellow" click data click by click (if you have time) to see what kind of wacky behavior is going on out there. But no, you didn't pay for it.
My next point should be reassuring to anyone who manages campaigns for a living. We compared a professionally managed campaign, one we have been working on for three years for a UK retailer, with a well-meaning but amateurishly managed campaign. See the screen shots below. (These were not hand-picked to make this point -- they were just two early sites in the PPC Assurance beta.)
The two screen shots below show first a "normal" campaign that had some click quality problems. Some underlying reasons for this include poor keyword selection and misunderstanding geographic targeting. It may also include reckless use of content targeting. Setting geographic targeting very tightly also places a difficult onus on the provider of the clicks, so campaigns that are local in nature can often run into apparent click quality problems because by definition you're asking for something the provider cannot deliver as accurately.
The next shot shows a "perfect ppc" or at least optimized paid search campaign. In these cases, the campaign was organized on sound principles of granularity, long tail, testing and managing to ROI objectives, careful control of content bidding, and an understanding of basic parameters and settings.
As you can see, the picture as shown by the second chart is not too bad, and unsurprisingly, the "perfect ppc" campaigns have made this client a lot of money over the past three years.
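The kind of spike analysis a tool like PPC Assurance performs on "red" clicks can be approximated with a toy grouping exercise. The field names, dates, keywords, IPs, and the threshold below are all hypothetical, purely for illustration:

```python
from collections import defaultdict

# Hypothetical export of clicks already flagged as outside the terms of
# service ("red"); fields and values are made up for illustration.
red_clicks = [
    {"date": "2007-09-14", "keyword": "widgets", "ip": "203.0.113.7"},
    {"date": "2007-09-14", "keyword": "widgets", "ip": "203.0.113.7"},
    {"date": "2007-09-14", "keyword": "widgets", "ip": "203.0.113.7"},
    {"date": "2007-09-15", "keyword": "gadgets", "ip": "198.51.100.2"},
]

def spikes(clicks, threshold=3):
    """Group red clicks by (date, keyword, ip) and flag heavy repeaters."""
    buckets = defaultdict(int)
    for c in clicks:
        buckets[(c["date"], c["keyword"], c["ip"])] += 1
    return {key: n for key, n in buckets.items() if n >= threshold}

print(spikes(red_clicks))
# {('2007-09-14', 'widgets', '203.0.113.7'): 3}
```

Packaging exactly this sort of grouping, with the supporting click detail attached, is what would make a one-click refund request feasible.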
Disclosure: I have never been paid to talk about Enquisite or PPC Assurance. Their sister services firm, a search marketing firm called Metamend, is functionally separate from Enquisite, but Zwicky founded Metamend and remains involved. My firm, Page Zero, focuses on optimizing paid search campaigns, sometimes refers "SEO" business to Metamend, and vice-versa, and in some cases we have collaborated on client projects. Metamend staff have imbibed with members of my company in the lobby of the San Jose Fairmont (I won't tell you what they drink -- it's embarrassing). If I continue to believe that PPC Assurance is the best click-auditing solution for clients, I will consider reselling the product for a commission in the future.
Labels: click fraud
Monday, October 01, 2007
Reggie Davis, the man in charge of click quality at Yahoo, posts an update, including the news that advertisers will be able to block unwanted domains from Yahoo's partner network. This mirrors a feature Google currently has in place.
As an aside, it's worth noting that Yahoo is now offering more proactive refunds based on poor quality partner traffic. Some of my clients saw some solid ones for August, bringing ROI into line where it should have been.
Yahoo is clearly coming around to the view that if you get ahead of these issues, you make more money and create less hassle in the long run. It's all about accountability breeding confidence in the channel.
Henry Blodget outlines how Skype has bombed and thinks maybe they should offload the asset to a company that gets it.
Well, I don't know. From a financial standpoint, Skype is clearly a failure on paper compared with some near-term targets that were set. But then again, so were many telecommunications giants that larded on debt and, in some cases, were reduced to junk-bond status before (sometimes) gaining powerful monopolistic advantages and turning the corner.
One of the key benefits of acquiring this popular technology - aside from the stated official synergies with eBay's business - turned out to be that it kept Google out of the "Skype business" a little longer. Google was doing a pretty bad job of succeeding at the "YouTube business," recall, so they just acquired YouTube. Problem solved.
Users in my circle of communicators are actually being pulled away from other IM applications towards Skype, because of the quality of its features on chat alone, to say nothing of it being the app of choice for many phone calls.
No doubt Google would *want* to take Skype off someone's hands at this point. Is that a good reason to sell it, or a good reason to hang onto it? Selling Skype now would only compound eBay's error, if indeed it was an error.
The biggest reason to (eventually) sell Skype is that it is non-strategic to eBay's business. Are they really in the communications field or building a kind of overarching dashboard of web functionality (as Microsoft, Google, and Yahoo are trying to do), or is this just sort of surface-level synergy?
So yes, I agree with Henry that these companies are a much more logical fit for Skype. eBay risks bombing again if they sell too hastily, though.
This story is uncharacteristically popping up on a few blogs that don't even publish that often, so I think it must be significant.
Gabe Rivera's Techmeme is going to publish a new Top 100 Blogs leaderboard, and this is being interpreted as a direct shot at Technorati, which has tried to rank blogs by authority (links).
Looking at some of the top sites on the Technorati list does confirm that the ranking methodology is kind of stale. The point that you can easily buy, beg, borrow (maybe not steal) links means that there needs to be a more subtle way of ranking a blog's standing. I'm not sure if Techmeme's method is going to be a huge advance, but it's the man of the hour for now.
I'd read this changing of the guard back into the discussion of how useful link analysis (aka PageRank) is to an overall approach to ranking websites or content in general. Overall I think the PageRank concept has degraded with time, and the final phase of rampant link buying and Googlers scolding people for link buying (and link buyers scolding Google right back) is silly season. The fact that I overhear leading SEO firms saying privately and cynically that "80% of what we do is buy links, for huge sums," pretty much guarantees that SEO won't look like that in a year's time.
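For reference, the link-analysis idea being gamed here can be illustrated with a toy power-iteration version of PageRank. The graph, damping factor, and page names are all illustrative; real engines blend many more signals.

```python
# Minimal PageRank power iteration over a toy link graph. This is a
# sketch of the core idea only, not any engine's actual algorithm.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                # Each page passes a damped share of its rank to its targets.
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:
                # Page with no out-links: redistribute its rank evenly.
                for t in pages:
                    new_rank[t] += damping * rank[page] / n
        rank = new_rank
    return rank

# A purchased link from a well-linked hub visibly lifts the buyer's score,
# which is exactly why link buying distorts the signal.
graph = {"hub": ["a", "buyer"], "a": ["hub"], "buyer": ["hub"]}
ranks = pagerank(graph)
print(ranks)
```

The mechanics make the problem obvious: the computation cannot tell an earned editorial link from a bought one, so a market in links degrades the whole signal.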
To anyone other than a short-term tactician, stuff like this "boost your Technorati rank bootcamp" article is just plumb irritating. Must monetize blog, must get d-listers to link to me, must come up with nouveau version of link farm... arggghhhh.... have fun!!!
But what happens when Gabe's the new sheriff in town and you can't splog your way to easy cash?
I'll keep going out there and building authority for the sites that matter to me, but "thin" link building tactics have seen better days. It's interesting that a site like Techmeme, and its attempt to gain market leadership over Technorati, does such a good job of hammering that point home.
Labels: linking, pagerank, techmeme, technorati
As with a housing bubble, I'm seeing signs that the Facebook Over-35 Demographic's hypergrowth may be reversing itself.
I watched with interest (and I admit a vague sense of panic) as dozens of colleagues signed up to dozens of new groups on Facebook, apparently prepared to fritter away even more of their time on "nice to have" interactions of a business-relevant but semi-social nature. I think a few of these will actually stick. For example, if the time is ripe for a renewal of your interest in the Electronic Frontier Foundation, vegetarianism, or The Syrahs of Chile, then Facebook's as good a place as any to do it.
But the flurry of "unsubscribes" is interesting to watch.
Looks like the Professional Marketers I consort with were in there kicking as many tires as they could, to get a sense of the platform and the possibilities. Then, the Time Management Thing kicked in, and they got, well, tired. And the hell out.
Labels: facebook fatigue