
Once upon a time, you could make thousands of dollars a month in search engine marketing simply by finding the “long tail” keywords that your competitors had overlooked. This usually meant one of three things:

1. Misspellings: “mortage rates”

2. Run-ons: “mortgagerates”

3. Multi-word phrases: “san mateo county bad credit mortgage rate loan offers”

These words were veritable goldmines. While your competitors fought tooth and nail just to show up on the first page of search results for “mortgage rates”, you often found yourself alone (and with a bid of $.10) on the tail.

As a result of this phenomenon, SEM experts spent a lot of their time focusing on keyword research, perhaps at the expense of ad copy optimization, landing page optimization, and process automation. But who could blame them? They were taking advantage of market inefficiencies and making a lot of profit!

These days, the value of the tail has diminished, if not disappeared entirely. There are three primary reasons for this:

1. The Search Engines are Smarter: All of the search engines now have “broad matching” or “advanced matching” features. This means that a big competitor that bought the keyword “mortgage rates” will still likely show up when a user types in “best mortgage rates” or “discount online mortgage rates.” The rationale behind this – from the search engine perspective – is fairly obvious; by populating obscure long-tail searches with results from very competitive keywords, the bids increase dramatically (no more $.10 clicks on the long-tail).

It used to be that the search engines tried to balance their drive for additional revenue with user experience concerns by looking at “token matching.” Think of a token as a word. A phrase like “bad credit mortgage loans” has four tokens. The old rule was as follows: if a generic phrase matches at least 50% of the tokens in a long-tail phrase, the search engine would consider that relevant enough to show the broad match results on the long-tail phrase.

So, in the “bad credit mortgage loans” example above, “mortgage loans” and “bad credit loans” would be broad-matched, but “mortgage” and “loans” would not (because each matches only one of the four tokens, or 25%).
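The old rule is simple enough to sketch in a few lines. Here is a rough illustration (the 50% threshold and the example phrases come from the discussion above; the function names are mine, not anything the search engines published):

```python
def token_match_ratio(broad_phrase, long_tail_phrase):
    """Fraction of the long-tail phrase's tokens that also appear in the broad phrase."""
    broad_tokens = set(broad_phrase.lower().split())
    tail_tokens = long_tail_phrase.lower().split()
    matched = sum(1 for token in tail_tokens if token in broad_tokens)
    return matched / len(tail_tokens)

def would_broad_match(broad_phrase, long_tail_phrase, threshold=0.5):
    """Old-style rule: broad-match if at least 50% of the tail's tokens match."""
    return token_match_ratio(broad_phrase, long_tail_phrase) >= threshold

tail = "bad credit mortgage loans"
for broad in ["mortgage loans", "bad credit loans", "mortgage", "loans"]:
    print(broad, token_match_ratio(broad, tail), would_broad_match(broad, tail))
```

Running this reproduces the example: “mortgage loans” (50%) and “bad credit loans” (75%) clear the threshold, while “mortgage” and “loans” (25% each) do not.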

Based on what I see in the market today, this is no longer the case. The search engines have basically moved away from the “token-matching” system and are getting closer and closer to a “category-based matching” approach. In this model, a broad phrase like “mortgage rates” could be matched not only on “bad credit mortgage rates” but on “home loan rates”, “find a mortgage”, and “san mateo county loans for new home buyers.”

One other related point is worth noting. In the olden days, even if a broad match had a 50%+ token match and did show up alongside long-tail searches, it had to compete in the same auction for top ranking.

Here’s what I mean: let’s say that you paid $5 CPC on Google for the keyword “mortgage rates” and you elected to be broad-matched across all long-tail keywords Google thinks are relevant to you. Let’s say that I bought the keyword “alabama low mortgage rates” and I was willing to pay $2 CPC for this keyword.

When a user did a search for “Alabama low mortgage rates,” Google looked at two factors to determine ranking: maximum CPC and click-through rate (CTR). The key point here, however, is that Google looked at CTR on a keyword-specific basis. So, if your broad-matched keyword had an overall CTR of 10%, but only a .5% CTR on the specific keyword “Alabama low mortgage rates,” Google would use the .5% CTR to determine ranking. So, if my $2 bid had a CTR of 10% (because I would likely have highly specific ad text, and I might be an Alabama-specific mortgage lender), my effective “cost per thousand” (eCPM = CTR x CPC x 1,000) would be $200 and yours would only be $25. So, if we went head-to-head against each other, I would clearly rank above you.

That is not the way things work today. As far as I can tell, Google has moved away from the keyword-specific auction model to more of an “overall CTR” auction model. In this new scenario, if your generic keyword has an overall CTR of 5% (but remember, only .5% on the specific keyword) and my specific keyword has a 10% CTR, the eCPMs would be $250 for you and $200 for me – you would show up first. The long and the short of it is that this new system decreases the relevance of the tail by enabling high-CPC generic keywords to outposition targeted keywords.
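The contrast between the two auction models comes down to which CTR gets plugged into the eCPM formula. A minimal sketch, using the numbers from the example above (eCPM computed as CTR x CPC x 1,000; the variable names are mine):

```python
def ecpm(ctr, cpc):
    """Effective revenue per thousand impressions: CTR x CPC x 1,000."""
    return ctr * cpc * 1000

# Your generic keyword ("mortgage rates"): $5 CPC, 5% overall CTR,
# but only 0.5% CTR on the specific long-tail query.
generic_cpc, generic_overall_ctr, generic_tail_ctr = 5.00, 0.05, 0.005

# My specific keyword ("alabama low mortgage rates"): $2 CPC, 10% CTR.
specific_cpc, specific_ctr = 2.00, 0.10

# Old model: rank on the keyword-specific CTR.
old_generic = ecpm(generic_tail_ctr, generic_cpc)   # ~$25  -> specific keyword wins
old_specific = ecpm(specific_ctr, specific_cpc)     # ~$200

# New model: the broad-matched keyword is ranked on its overall CTR.
new_generic = ecpm(generic_overall_ctr, generic_cpc)  # ~$250 -> generic keyword wins
new_specific = ecpm(specific_ctr, specific_cpc)       # ~$200

print("old model:", old_generic, "vs", old_specific)
print("new model:", new_generic, "vs", new_specific)
```

Same bids, same ads; only the choice of CTR changes, and it flips the auction winner from the targeted keyword to the generic one.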

By moving from “token matching” to “category matching,” and from a “single auction” to an “overall auction,” the search engines may serve slightly less specific ads to users, but they virtually eliminate market inefficiencies (read: no more low CPCs) for themselves.

Of course, the flip-side of that is that as user queries are consolidated into a few categories, the bulk of ad-generated traffic is consolidated into those advertisers who can pay top position for the most popular searches. Or, to put it another way, instead of entering into 10,000 auctions for 10,000 different keywords, advertisers are now entering into just a few auctions for those 10,000 keywords. If you want to get traffic, you have to win in one of these few auctions.

2. Competitors are Smarter: Even assuming that the search engines weren’t doing everything in their power to push generic keywords into long-tail results, the advantage of huge keyword lists has been reduced simply because competitors have caught on to the tactic. There are now dozens of companies (Trellian, WordTracker, GoogSpy, BadNeighborhood) that offer reams of keywords for pennies a day. Some of these folks even offer APIs so that you can send keywords directly from their databases to your search engine accounts.

On top of that, the search engines have realized that being open about keywords helps their bottom line. Both Yahoo and Google offer keyword suggestions (taken directly from either user queries or other advertisers’ keyword lists) at multiple points in the account set-up process.

Two years ago, perhaps 30% of any keyword market was relatively devoid of advertisers and presented great arbitrage opportunities. Today, my guess is that less than 10% of the keyword universe isn’t heavily saturated. Combine smarter advertisers, more advertisers, and better tools, and tail keywords are much less valuable.

3. Consumers are Smarter: Finally, let’s not forget about the consumer. Remember the growth of AskJeeves, the “natural language” search engine? Consumers loved the fact that you could type in a question like “Where can I find low mortgage rates?” and get relevant results. Of course, it turned out that this was basically a gimmick: the system simply excluded noise words like “where can I” and found results based on “find low mortgage rates.” Consumers caught on, the novelty wore off, and Ask has now abandoned the “natural language” concept altogether and is just another search engine.

Apparently, consumers are getting smarter on Google and Yahoo as well. A friend of mine who has pored over the millions of user queries inadvertently released by AOL tells me that “query length” (the number of tokens in a search) is getting shorter. This suggests to me that users recognize you can get what you want from a search engine with a basic search rather than a 20-word diatribe.

No doubt searchers will also gradually learn to eliminate noise words like “and” or “where,” and to stop typing redundant searches into the search engines (did you know, by the way, that one of the top searches on Google is . . . “Google”? This will decline over time . . . I hope).

In any event, as searchers become smarter, the volume of long-tail searches shrinks, reducing the need to pour huge effort into keyword research.


Combine search engines maximizing revenue by showing generic ad results, competitors who now understand the long tail, and consumers who have a much better understanding of how search engines work, and I think it’s safe to say that the Holy Grail of search engine marketing is no longer the keyword. No doubt keyword lists are important, but you won’t be able to retire in Aruba just because you concatenated three adjectives, two city names, and two suffixes.

This basically means two things: 1) that SEMers are going to have to get a lot smarter about the other elements of SEM – like ad text, landing pages, bid prices, analytics, and filtering and 2) that folks who survived on the long-tail and market inefficiencies are in trouble (especially when you add in Google’s Quality Score changes).

A few months back, I wrote about the similarities between SEM experts and eBay power sellers, the concept being that both eBay and SEM used to be easy for anyone to do but are now rapidly becoming the domain of specialized experts. The end of keywords is yet another example of this phenomenon for SEM.