Partial match penalty recovery – the promised land!!

by G on October 25, 2013

Hope And Aspirations

For one client I have been following their re-inclusion in the Google search index after they received a partial match manual penalty. I feel this is one of the vaguest and most difficult penalties to recover from, because so little clue is provided as to where the problem might lie. It took six months and upwards of thirty reconsideration requests to see the manual penalty revoked; once it was, the impact on the website’s rankings was almost immediate.

History and Approach

The website in question had a long history of organic search positions and, during the course of its lifetime, had moved through a number of different search engine optimization (SEO) suppliers, each implementing the industry standards of the day [read: lots of link filth in the profile]. After establishing that traffic drop-offs coincided with a key Penguin refresh date, and with discrepancies between the number of URLs submitted in the sitemap and those indexed by Google, it was time to get to work.

According to industry wisdom, the solution was to upload a disavow file – declaring “I think these links are crap, so please [Google] don’t give them any consideration when attempting to work out whether my website is authoritative in its niche”.

However, my client can’t survive on a “wait and see” approach and so, after consulting with them, we agreed to submit a reconsideration request to see what Google had to say. It’s worth mentioning that during the course of this whole process, Google made a change to Google Webmaster Tools to start providing webmasters with additional information about any penalties applied to their website. Thus, during the link audit, I learned that a partial match penalty had been applied to the website for the quality of the links pointing to it.

Tools you won’t need!

There are lots of tools to help with back-link analysis. Some of the old timers on the market include Majestic SEO, which is widely used across the industry for its back-link index, and by mutating its core functionality you can have some real fun with it. Frankly though, you might do better to buy yourself a virtual server, hire a coder, and start building your own database of the web. The same goes for services such as Link-Detox – a service that popped up almost in parallel with the disavow file. As we’re talking about back-link tools, my own personal preference is SEO Spyglass (Free Trial), simply because it’s a flat fee for the software and then a small fee for updates. I hope to get on to sharing some tool tech in a future post, but at the moment there is a lot of news in SEO, so I’ll keep writing about that.

Having been through this entire process, I think the only tools you need to audit your links are those that Google themselves have made available via your Webmaster Tools account. Matt Cutts recently said that they’d be giving webmasters more links to analyze, so don’t waste your money on expensive monthly-recurring-fee tools unless you are a web agency that can offset the high monthly premiums against other clients and the additional services these providers offer. Now… back to auditing.

Your Google Kwality [sic] reviewer is unlikely to have access to the tools I’ve mentioned or, for that matter, the know-how to use them. They will most likely be given review guidelines based on the data that both the webmaster and the reviewer have available in the Webmaster Tools account – otherwise it would not be fair.

Stage 1: Who’s sending the crap?

The first stage was to look at the worst offenders. In WMT, Google currently shows the domains that are linking most heavily to your website. I noticed three domains that between them were bringing some 20,000 links into the website. When I went and looked at those pages, I found what I call

“The Adsense Poo”

A webmaster had scraped Google search results using medium/low-volume long-tail keywords, capturing all the first pages of results. These websites, due to the nature of the search queries, were showing snippet text that was pretty random (remember: random = unique in the eyes of a spider). They had then created pages that were simply an Adsense block at the top, followed by a list of links below it*. There were literally thousands of them. I wrote emails to the webmasters concerned, who were naturally unwilling to explain the function of these pages, but did, to their credit, remove the majority of them. Now, had Google told me which websites were contributing to the partial match penalty, the client would have been able to issue a claim against the webmaster for loss of sales. But sadly, this information is still not available in WMT, so the client could only be grateful that the webmaster removed the links.
*It’s worth mentioning that at Pubcon in Vegas this year, Matt Cutts hinted that ad-heavy pages – perhaps like these – would be something Google was going to investigate.
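Spotting these heavy-hitting domains doesn’t need a paid tool. As a rough illustration only, here is a minimal sketch (function name and all domain names are hypothetical) of how a WMT links export, treated as a plain list of URLs, could be tallied by domain to surface the heaviest linkers:

```python
# Rough sketch: tally linking domains from a WMT "latest links" export,
# assumed here to be a simple list of URLs (all domains hypothetical).
from collections import Counter
from urllib.parse import urlparse

def top_linking_domains(link_urls, n=10):
    """Return (domain, link_count) pairs, heaviest linkers first."""
    counts = Counter(urlparse(url).netloc.lower() for url in link_urls)
    return counts.most_common(n)

links = [
    "http://scraper-one.example/serp-copy-1",
    "http://scraper-one.example/serp-copy-2",
    "http://scraper-two.example/links-page",
]
print(top_linking_domains(links))
# e.g. [('scraper-one.example', 2), ('scraper-two.example', 1)]
```

Anything sending thousands of links from a single domain floats straight to the top of a list like this.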

Each time actions such as the one above were taken, a further reconsideration request was sent via WMT updating on the situation.

The first 4 or 5 reconsideration requests were all returned with the same generic answer – “You still got problems there son”

Stage 2: Disavow

I started to analyze domains and create a disavow file. I never used anything other than the domain-level disavow format:

domain:domain.com

The first disavow file had a few hundred domains to remove, and it went through 5 iterations over 6 months. After each submission, a new reconsideration request was made. The replies changed from the standard “problems there son” to specific citations of problem URLs. These links went back up to five years. Nothing was safe. As the site had such a long history of web exposure, it was entirely possible that capturing every stray URL was going to be impossible.
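For illustration only, a disavow file mixing the domain-level syntax with individual URLs might look like this (every domain and URL here is hypothetical):

```text
# Disavow file - lines starting with # are comments
# Scraper domains identified in the link audit
domain:scraper-one.example
domain:scraper-two.example
# Individual problem URLs can also be listed directly
http://spammy-directory.example/links/page-17.html
```

The domain: form is the safer bet for sites like these, because it catches every page on the offending domain rather than just the URLs you happened to find.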

You get to a point where you have to take a step back and say, however brutal it might be: What options do I have?

1) completely bin the web-domain and start a new one from scratch

2) get on your knees and pray and wait for a clear day.

If you have the luxury of being able to do a re-brand, now is the time to do so, because a large segment of the industry has penalties in place – and a few of these businesses don’t even understand why they are no longer on the first page!

And this is not because their web-suppliers were overly naughty, but because Google’s link policy went retroactive on the industry and caught suppliers with their pants down.

Stage 3: Hadoken Disavow

In the end, our final proposal was to disavow literally everything that Google had on its books for the web-domain. I watched 98% of the links the client had amassed over nearly a decade get wiped off the slate. To this, Google said: “You are now good enough to come back to the fold”.
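If you do reach this nuclear option, producing the blanket file is mechanical. A hypothetical sketch (function name and domains are my own invention, not from any Google tool) that turns a full list of linking domains into a disavow file, minus any domains you want to keep:

```python
# Rough sketch: build a blanket "disavow everything" file from a full list
# of linking domains, minus a whitelist of domains you trust.
# Function and domain names are hypothetical, not from any Google tool.
def build_disavow(linking_domains, keep=()):
    keep = {d.lower() for d in keep}
    lines = ["# Blanket disavow built from the WMT linking-domains export"]
    lines += sorted("domain:" + d.lower()
                    for d in set(linking_domains) if d.lower() not in keep)
    return "\n".join(lines) + "\n"

domains = ["scraper-one.example", "Scraper-Two.example", "good-partner.example"]
print(build_disavow(domains, keep=["good-partner.example"]))
```

The whitelist matters: even in a scorched-earth disavow you would normally spare the handful of genuinely earned links you can vouch for.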

What’s funny is this: if a website has its links wiped out, it will lose its rankings for the keywords it was doing well on in the past. However, if you have a large website of more than 100K pages, it’s still worth it. The trade-off is between the long-tail traffic you will start to receive again and the big delivering keywords that your web-marketing company will now need to work at again to get back on the radar. And realistically, most of the primary keywords are so cluttered up with Adwords that you’d get only a dribble of traffic even in organic position #1. It will be worth it in the end.

There is also an added bonus: while performing this intensive audit you may, as I did for this client, find a whole heap of problems that, once fixed, will make the site even better for the end user and for search engines at the same time.

Stage 4: Results

  • The manual penalty has been removed.
  • Indexing of the website’s pages has increased.




Author Information
Glyn S. H. has been in online marketing since 1999 and has developed campaigns for leading luxury brands including Nestlé and Interflora. He works primarily in the Travel and Tourism sector, helping hotels beat down OTA paychecks. He has a web-marketing company, a Masters in Professional Communication, speaks fluent Italian, and is married with two kids. He also has a good sense of humour – essential for survival in web-marketing. He is not employed by Google. To contact via email: glyn@ (this domain).

