What you’re about to read is excerpted from an older, longer premium article. The excerpts provided below omit details and context provided by even older articles from which they were taken. Be careful not to infer meaning beyond what you see here.
I received the following question (reformatted for this article): “Michael, you’re convinced no one knows how to disavow properly. What would you disavow?”
Long-time subscribers to the newsletter may recall that I’ve shared my criteria for identifying spammy links in the past. Obviously, to answer this question I’ll have to go down that list again. But first let me summarize what can happen with link spam:
1. The search engine never sees it, so no effect
2. The search engine indexes it without sufficient PageRank-like value, so no effect
3. The search engine indexes and accepts the spam, so the links pass value
4. The search engine rejects the links and never indexes them, so no effect
5. The search engine indexes the spam but doesn’t trust it, so the links DO NOT pass value
6. The search engine identifies a pattern and penalizes the destination
7. The search engine identifies the spam as such and assigns a NEGATIVE value
Item 3 is important. Spam sometimes helps. It still works in 2019 (Update: Yes, in 2023, too). Spam may always work.
To the best of my knowledge, Google has never explicitly stated that links can pass negative value but they’ve come close several times. I first openly speculated about “Negative PageRank” in 2011 after the Panda algorithm was released. [When I asked Matt Cutts about this, he pointedly said nothing – leaving us to speculate.]
With the Penguin 1.0 algorithm they went after “Home Page Backlinks”, specifically the blog networks that were publishing entire blog posts on the home page. That was a very easy pattern to identify. Google came up with that idea after the spam team manually tracked down and delisted thousands of paid HPBL networks in March and April 2012.
Penguins 2.0 and 3.0 went deeper into sites and apparently refined or expanded the patterns they were looking for. The fact that Google was relying on patterns suggests they implemented machine learning at some point in the process, probably with Penguin 1.0.
Up to this point, to recover from a Penguin downgrade you had to get rid of the links. In early 2012 I asked Matt Cutts why he wouldn’t give us a Disavow tool. He said he wasn’t sure it would be used correctly, or if there was even a real need for one. That was when I began talking about “toxic links” in earnest. A Toxic Link (by my definition) is one that the search engine has identified as bad and is using to punish the Website that was seeking to benefit from it.
Penguins 1.0-3.0 appear to have assigned Negative PageRank-like value to the links they identified. Hence, Penguin-marked links were Toxic Links. Unfortunately, people in the SEO community took “toxic links” to mean something else (or it evolved into something else). I rarely speak of “toxic links” any more because there really isn’t a need to.
Penguin 4.0 introduced a continuous evaluation of links (essentially a new document classifier that is run on every Web page as or soon after it is crawled). Penguin 4.0 also reversed the polarity. Instead of using the identified spam links to punish Websites they are simply ignored, dropped from the link graph.
Some people believed that Google may have “grandfathered” the old toxic links from Penguins 1.0-3.0 into the link graph. It could be that Google merely needed time to re-crawl the Web and apply Penguin 4.X to those links, and until that happened their negative values continued to suppress Website rankings. Perhaps the Grandfather Effect was incidental rather than intentional.
So the Disavowals continued after Penguin 4.0 rolled out, but the justification for them has declined. Over the past 2 years (Update: add another 3 years for 2023), as people have continued to disavow links, they have become more likely to complain about negative effects from Google updates. In other words, when Google changes something, many (but by no means all) of the people who continued disavowing links find themselves losing traffic.
Without the good value-passing links that were disavowed, Websites become more volatile and unstable in the search index. It’s inevitable that they’ll experience more fluctuations in search visibility and rankings.
What Would I Disavow?
Notice that John Mueller advises people to preemptively disavow links they believe would lead to a manual action. He knows what would lead to a manual action. Anyone who is buying and selling links should also know what should lead to a manual action.
Everything else is acceptable.
But what about PBN (Private Blog Network) and GBN (Guest Blog Network) links? Won’t they lead to manual actions? In my opinion that depends on the PBN/GBN. If you’ve been operating a PBN for the last 5 years and have never been delisted or received a manual action notice, you have no reason to disavow your own links. They are still spam. Building private blog networks for search-influencing links is a link scheme, and that violates Google’s search engine guidelines.
But just because a link violates guidelines doesn’t mean the algorithms will see it as a violation (they’re not perfect). And even if they do see it as a violation, they may decide it isn’t severe enough to warrant a manual action.
At worst your unpenalized PBN (and GBN) links are passing no value (neither negative nor positive). At best they are passing positive value.
I see no reason to disavow in a situation like that.
On the other hand, if the destination site(s) received a manual action (an actual penalty), then I would go ahead and disavow the PBN/GBN links (or add “nofollow” attributes to them). You would need to do something before filing a reconsideration request.
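For reference, Google’s disavow file is a plain UTF-8 text file uploaded through Search Console: one entry per line, either a full URL or a `domain:` directive, with `#` comments ignored. A minimal sketch (all domains and URLs below are placeholders, not real PBN sites):

```text
# PBN links identified after the manual action (placeholder domains)
domain:example-pbn-1.com
domain:example-pbn-2.net
# Disavow a single spammy page rather than a whole domain
https://example-guest-blog.org/sponsored-post/
```

A `domain:` entry covers every link from that host, while a bare URL entry covers only that page, so domain-level entries are the usual choice for network-wide cleanup.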
I would keep the sites online if they were sending non-search referrals. You’re not breaking any laws by spamming the search engines and if you neutralize the links Google doesn’t care what happens between your Websites. All they care about is the set of links they have identified as violating their guidelines.
Characteristics of Spammy Links
In Volume 6, Issue 33 (September 1, 2017) we published an article titled “Free Web Hosting for Links”. Here is the concluding section of that article:
In Volume 6, Bonus Issue 4 (December 29, 2017) we published an article titled “The Risks of Using Nofollow Links”. Here is the concluding section of that article:
Yes, people sometimes ask if they should disavow “nofollow” links. It’s a harmless time-wasting exercise.
In Volume 5, Issue 38 (October 14, 2016) we published an article titled “SEO Micro Case Study No. 132 (Marketing Land Podcast)”. We dissected the key takeaways from the podcast, which was an interview with Googler Gary Illyes. Here are excerpts from our article:
In Volume 3, Bonus Issue 4 (October 31, 2014) we published an article titled “SEO QUESTION NO. 177 (What is the Difference between Interferometry and Correlation Analysis?)”. Part of the article dealt with trustworthy vs. untrustworthy links:
In Volume 4, Issue 11 (March 20, 2015) we published an article titled “SEO QUESTION NO. 215 (How to Decide if Something is Natural)”. Here is an excerpt from the article:
And another excerpt:
In Volume 6, Bonus Issue 2 (June 29, 2017) we published an article titled “Was There A June 25 Google Update?” Here is an excerpt from the article:
Coming back to the original question, “What would [Michael Martinez] disavow?” I would disavow anything that didn’t look natural. But natural links look really weird. All those crawling sites out there that publish meta information about your sites are creating natural links. All those aggregators that take your RSS feeds and publish excerpts of your articles are creating natural links. All those sitewide links that people you’ve never heard of put into their blogrolls are natural links. All those links in forum discussions you were unaware of are natural links.
And yet these are most often the kinds of links people say they want to disavow.
Just because you feel a Website looks ugly and unprofessional in design doesn’t mean its links are spammy. Spammy sites may indeed be ugly and quickly built, but they are a step below even the usual ugly fluff people fear. Ugly is okay. There is no search engine guideline that says a Website cannot or should not be ugly.
Conclusion
If you’re looking at a couple hundred 1-page Weebly Websites with spun content, you would probably be safe to disavow that stuff. Google should be able to identify obvious link spam as obvious link spam. Then again, if it’s still helping and you ain’t been penalized after all these years, why rush to judgment? Most likely your traffic woes are due to something else.
And yet if you can find thousands of links that were clearly intended to manipulate search results for any reason, then don’t wait for the penalty.
If you’re in doubt about what kinds of links to disavow or distrust, then in my opinion you should do nothing. But if you’re absolutely convinced you know why the links exist and it’s not because someone decided to link to 3 million Websites, then give it a shot.
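If you do decide to act, the mechanics are simple. As a minimal sketch (the suspect-domain list here is purely illustrative, not a recommendation of what to disavow), you could generate a Google-format disavow file from domains you are convinced exist only to manipulate rankings:

```python
# Minimal sketch: write a Google-format disavow file from a list of
# suspect domains. The domain names below are placeholders.
suspect_domains = [
    "example-spun-site-1.weebly.com",
    "example-spun-site-2.weebly.com",
]

def build_disavow(domains):
    """Return disavow-file text: a comment line, then one sorted,
    de-duplicated 'domain:' entry per line."""
    lines = ["# Domains disavowed due to obvious manipulative link patterns"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    # Print the file contents; in practice you would save this as a
    # .txt file and upload it via Search Console's disavow tool.
    print(build_disavow(suspect_domains), end="")
```

De-duplicating and sorting keeps the file stable across regenerations, which makes it easy to diff against the version you last uploaded.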