When Penguins Attack
Published: May 23, 2012
Author: Joe Stanton
By Jonah Stein, guest author
Penguin is Google’s latest high-profile algorithm update aimed at “web spam.” Combined with last year’s Panda update, many site owners are angry at Google for cutting off their livelihood, taking away the traffic they “earned” and giving them an aversion to black-and-white animals. (A few are reporting nightmares about orcas, zebras and badgers.)
Before I jump into specific analysis about what Penguin is and how to try to address the problem, it is important to be absolutely certain Penguin, not something else, caused your traffic to drop.
Make sure Penguin is your problem
Start with your analytics, ONLY looking at organic traffic referred by Google. If your traffic took a hit starting on April 24th, you’re likely affected by Penguin. If the traffic started dropping on April 19th, you were likely hit by the latest Panda update. If your traffic got hit on some other day, perhaps you were caught up in one of the 50 or so OTHER changes Google announced for April. If your problems started before then and you still haven’t figured out what is going on, why haven’t you found a reliable SEO or joined SEOBook’s community for peer-reviewed SEO advice and discussion?
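If you export daily Google organic sessions from your analytics package, pinpointing the drop date is a quick scripting exercise. Here is a minimal sketch; the sample numbers are hypothetical, and a real audit would feed in your own export:

```python
from datetime import date

def largest_drop(daily):
    """Return (day, pct_change) for the steepest day-over-day decline.

    `daily` is a list of (date, sessions) tuples sorted by date.
    """
    worst_day, worst_pct = None, 0.0
    for (d0, s0), (d1, s1) in zip(daily, daily[1:]):
        if s0 == 0:
            continue  # avoid division by zero on dead days
        pct = (s1 - s0) / s0
        if pct < worst_pct:
            worst_day, worst_pct = d1, pct
    return worst_day, worst_pct

# Hypothetical Google-organic sessions around the update dates.
sample = [
    (date(2012, 4, 22), 1000),
    (date(2012, 4, 23), 980),
    (date(2012, 4, 24), 410),   # steep drop => likely Penguin
    (date(2012, 4, 25), 395),
]
day, pct = largest_drop(sample)
print(day, round(pct, 2))  # 2012-04-24 -0.58
```

If the steepest decline lands on April 24th, Penguin is the prime suspect; April 19th points at Panda; anything else, keep digging.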
OK, so your site took a dive on April 24th, and you are absolutely certain that Penguin is to blame. Not so fast. As is generally the case with high-profile announcements of ranking changes, Google released multiple changes at the same time, which makes it very difficult to reverse-engineer what is Penguin and what is some other unnamed nightmare that is stealing your traffic and affecting your livelihood. Rolling out multiple changes at once also allows Google to declare that "Penguin affected 3.1% of queries," even if the pain experienced by site owners is far greater than that.
Evidence suggests that at the same time Google announced a crackdown on over-optimization, they also turned up the sensitivity on the over-optimized anchor text filter and changed to broad match.
The idea of over-optimized internal anchor text has been around for a long time, and if you have ever done any SERP profiling, you would already know that Google has been filtering pages that exceed a certain anchor text threshold since at least November of 2010. On April 24, Google turned down the threshold and started counting broad/phrase match instead of only filtering for exact match.
Why am I so sure that traffic drops on 4/24 are not “just Penguin,” the (rightfully) skeptical SEO might ask?
1. Some sites and even keywords were pushed down less than 10 places in the SERP, while others dropped 100 or 200 spots. Google chooses very specific (and dramatic) measures like +50 or +100 for penalties, whereas normal algorithmic filters have unpredictable results but push down sites from 3-10 positions.
2. A discussion on SEORoundTable.com demonstrated that if you add "-aasdfg" (or -anything) to a query, you get results roughly the same as pre-Penguin for KWs that have dropped more than 100 spots. It appears occasionally to affect queries for KWs that dropped only a couple of spots, but not consistently.
3. In an experiment conducted on a test site that dropped 3-7 spots, we were able to restore its rankings simply by increasing the anchor text diversity of links to the homepage. Note that these new links were from relatively spammy sources and acquired quickly.
While this is certainly not a sustainable long-term strategy, it is clear evidence that some sites hit for anchor text over-optimization on April 24 were not affected ONLY by Penguin, because Matt Cutts says Penguin runs as a cron job: the penalty is recalculated periodically, and affected sites recover ONLY after the calculation is run. If a site is able to recover rankings and traffic in between Penguin updates, it is safe to assume the cause was not Penguin.
4. Cutts went out of his way to suggest that some sites may not be penalized but actually seeing the results of Google discounting large groups of blog networks, directories, and other low-quality link sources. Matt doesn’t always tell the whole story, but there is likely some truth to what he says. He is saying Penguin is not the only change they made on the 24th.
5. Danny Sullivan reported from his interview with Matt Cutts that the 700,000 messages to publishers that Google sent out earlier this year were not about bad link networks. Nor were they all suddenly sent on the same day. Rather, many sites have had both manual and algorithmic penalties attached to them over time that were never revealed.
So what is Penguin?
In Google’s black and white worldview, Penguin penalizes sites that have incorporated manipulative linking practices designed to boost rankings in Google. In the nuanced view of a business owner, Google is penalizing sites that did what they had to in order to compete in an ecosystem dominated by a single search engine with an algorithm that rewarded site owners for specific marketing activities – building links. Specifically, building links with anchor text that includes the keywords and phrases for which the site wanted to rank.
Penguin is all about links; if you’ve ever done one of the following, you may have been penalized – and/or should be worried about Penguin 2.0:
– Begged, borrowed, and otherwise hustled to get people to link to your site.
– Actively participated in forums with a signature file that contained links to your sites.
– Hired a link builder.
– Bought links.
– Participated in a blog network or other linking scheme.
– Run a link exchange program.
– Registered your site in hundreds of directories.
– Participated in article marketing, particularly article submission services and article “spinning” systems.
You need to understand the differences between a natural link profile and one that has been built. You also need to recognize that Google doesn’t apply these rules the same way for all sites and that older, established, trusted brand sites can generally engage in all of the below with significantly less risk.
1. Use of keywords in anchor text: Naturally occurring links rarely, if ever, have keywords in the anchor text. If you look at the profile of sites that have never engaged in link building, you will see that the anchors are mostly the site URL, image links, and brand and brand variations, along with “click here,” “more information,” and a variety of random words and phrases. If you look at the anchor text from a link-building campaign, you will see it dominated by target keywords.
2. Use of site-wide links: Backlinks that appear in the boilerplate of a site (header, footer, left/right rails) or otherwise show up on every page stand out like a magnesium flare on a moonless night. Although high-quality niche editorial sites like SEOBook.com, SearchEngineLand.com, and PPCAssociates.com will inevitably earn blog roll entries, these are the exception, not the rule, and they generally abide by these conditions:
– They are rarely organically given to ecommerce sites, insurance, finance, online degrees, etc.
– They do not include the highest value keyword for the target page as the anchor text.
– You will very, very rarely see a site-wide link organically earned that has been placed in the header, footer, or other boilerplate locations.
3. Inclusion in directories, article sites, and blog networks: These are essentially links that would never have been built if it were not for the (real or perceived) SEO benefit. If you look at the link profiles of companies who have never hired link builders, you see these are almost entirely absent.
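A rough self-check is to bucket the anchors in your backlink export and see what share are exact or broad/phrase matches for a money keyword versus natural-looking anchors (brand, URL, "click here"). The sketch below is a simplified classifier under that assumption; the anchor list and keyword are hypothetical, and a real audit would start from an export out of your backlink tool of choice:

```python
def anchor_profile(anchors, keyword):
    """Classify anchors as exact match, partial (broad/phrase) match,
    or other (brand, URL, 'click here', etc.), returning each share."""
    kw = keyword.lower()
    kw_terms = set(kw.split())
    counts = {"exact": 0, "partial": 0, "other": 0}
    for anchor in anchors:
        a = anchor.lower().strip()
        if a == kw:
            counts["exact"] += 1
        elif kw_terms & set(a.split()):  # shares any keyword term
            counts["partial"] += 1
        else:
            counts["other"] += 1
    total = len(anchors)
    return {k: v / total for k, v in counts.items()}

# Hypothetical anchors for a site targeting "cheap car insurance".
anchors = [
    "cheap car insurance", "cheap car insurance",     # exact match
    "car insurance quotes", "cheap insurance deals",  # broad/phrase match
    "example.com", "Example Inc", "click here",       # natural-looking
]
profile = anchor_profile(anchors, "cheap car insurance")
print(profile)
```

In this sample, exact plus partial matches make up over half the profile – exactly the kind of distribution that a campaign produces and a natural profile almost never does.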
Penguin Feels Like Revenge
Site owners are now getting hit by what seems like a Catch-22; for years Google publicly preached their (evolving) guidelines while rewarding sites that invested in building links. Some of these link-building efforts are now coming back to haunt those sites that haven’t made the transition into a brand name.
Successful SEO inevitably requires some link development. Veteran SEO consultants and site owners knew that “editorially obtained links” and “high-quality content” were the “right” thing to do, but we also know how hard it is to obtain these links – especially as Google’s campaign against paid links spread fear and uncertainty among publishers.
We saw that a combination of shoddy/spamming link sources produced results in competitive niches. We also saw that Google kept moving the goalposts, so we kept evolving tactics. Hit counters worked until Google decided they were spam. Footer links to web designers, hosting companies, and SEO agencies worked until they got discounted. Badges worked until the Guardian exposé on how to get online advertising for free. Paid links, sponsored WordPress themes, guest books, forums, press releases, article marketing…every tactic would work for a while until Google discovered it was effective and tried to quash it.
With Penguin, sites are now being penalized for the cumulative debris of years of evolving link-building strategies using tactics that were previously successful and commonly considered necessary to survive.
How to club a Penguin
The “easy” answer is to clean up your link profile: try to get as many of these unnatural links removed as possible while earning lots of high-quality editorial links. It’s the same advice we have been giving for a decade. If you are tired of nightmares about Google enforcers named after cute black-and-white animals, the real answer is to evaluate whether your business can survive for a while without this SEO traffic, then work with your team and your agency to develop a sustainable strategy that focuses on content marketing, audience, and reach instead of a cooked-books recipe for gaming search engines.
Jonah Stein has 15 years of online marketing experience and is the founder of ItsTheROI, a San Francisco search engine marketing company that specializes in ROI-driven SEO strategy. Jonah has spoken at numerous industry conferences, including Search Engine Strategies, Search Marketing Expo (SMX), SMX Advanced, SIIA On Demand, the Kelsey Group’s Ultimate Search Workshop, and LT Pact. He also developed the Virtual Blight panels for the Web 2.0 Summit and the Web 2.0 Expo. He has written for ContextWeb, Search Engine Land, and SEO Book.