Identifying bad backlinks has become easier over the past few years thanks to better tool sets, bigger link indexes, and increased knowledge, but for many in our industry it's still crudely implemented. The ideal scenario would be a professional poring over your link profile, combing through it link by link for concerns, but for many webmasters that's just too expensive (and, frankly, overkill).
I'm going to walk through a simple methodology using Excel (although you could do this just as easily with Google Sheets) that combines the power of Moz Link Explorer, Keyword Explorer Lists, and finally Link Lists into a comprehensive link audit.
There are several components involved in determining whether a link is "bad" and should potentially be removed. Ultimately, we want to be able to measure the riskiness of the link: how likely Google is to flag it as manipulative, and how much we depend on it for value. Let me address three common factors SEOs use to determine this score:
There are a handful of metrics in our industry that are readily available to help point out concerning backlinks. The two that come to mind most often are Moz Spam Score and Majestic Trust Flow (or, better yet, the difference between Citation Flow and Trust Flow). These two scores actually work quite differently. Moz's Spam Score predicts the likelihood a domain is banned or penalized based on certain site features. Majestic Trust Flow determines the trustworthiness of a domain or page based on the quality of links pointing to it. While calculated quite differently, the goal is to help webmasters identify which sites are trustworthy and which are not. However, while these are a good starting point, they aren't sufficient on their own to give you a clear picture of whether a link is good or bad.
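As a minimal sketch of how these metrics might be combined in a spreadsheet-style pass, the snippet below flags domains with a high Spam Score or a low Trust Flow relative to Citation Flow. The column names, thresholds, and data are all illustrative assumptions, not actual Moz or Majestic export formats:

```python
# Sketch: flag risky domains from an exported metrics table.
# Column names ("domain", "spam_score", "trust_flow", "citation_flow")
# and both thresholds are assumptions for illustration only.

def flag_risky(rows, spam_threshold=7, min_tf_cf_ratio=0.5):
    """Return domains whose Spam Score is high, or whose Trust Flow
    is low relative to Citation Flow (a common heuristic for sites
    with many links but little trust)."""
    risky = []
    for row in rows:
        spam = int(row["spam_score"])
        tf = float(row["trust_flow"])
        cf = float(row["citation_flow"])
        ratio = tf / cf if cf else 0.0
        if spam >= spam_threshold or ratio < min_tf_cf_ratio:
            risky.append(row["domain"])
    return risky

rows = [
    {"domain": "spammy.example", "spam_score": "11",
     "trust_flow": "4", "citation_flow": "30"},
    {"domain": "trusted.example", "spam_score": "1",
     "trust_flow": "40", "citation_flow": "45"},
]
print(flag_risky(rows))  # ['spammy.example']
```

In a real audit you'd tune the thresholds against hand-reviewed examples from your own profile rather than accepting defaults like these.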
Anchor text manipulation:
One of the first things an SEO learns is that using valuable anchor text can help increase your rankings. The very next thing they learn is that using valuable anchor text can bring on a penalty. The reason for this is pretty clear: webmasters rarely give you valuable anchor text out of the goodness of their hearts, so over-optimization sticks out like a sore thumb. So, how do we measure anchor text manipulation? Eyeballing anchor text one link at a time is intuitive enough, but there's a better way to do it in an automated, at-scale fashion that will allow us to better judge links.
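One simple at-scale approach is to measure what share of your backlinks use exact-match commercial anchors, since natural profiles are dominated by brand, URL, and generic anchors. This is a hedged sketch, not the method from any particular tool; the anchor list and keyword set are made up for illustration:

```python
# Sketch: estimate anchor text over-optimization at scale.
# The anchors and "money keyword" set below are illustrative
# assumptions, not output from Link Explorer or any API.

def anchor_manipulation_score(anchors, money_keywords):
    """Share of links whose anchor exactly matches a commercial
    keyword. Natural profiles skew toward brand, URL, and generic
    anchors ("click here"), so a high share is a red flag."""
    if not anchors:
        return 0.0
    keywords = {k.lower() for k in money_keywords}
    exact = sum(1 for a in anchors if a.strip().lower() in keywords)
    return exact / len(anchors)

anchors = ["Acme Widgets", "https://acme.example", "click here",
           "buy cheap widgets", "buy cheap widgets", "buy cheap widgets"]
score = anchor_manipulation_score(anchors, {"buy cheap widgets"})
print(round(score, 2))  # 0.5
```

Half of the anchors in this toy profile are an exact-match money phrase, which would be wildly unnatural in a real profile of any size.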
Finally, low-authority links — especially when you would expect higher authority based on the domain — are concerning. A good link should come from an internally well-linked page on a site. If the difference between the Domain Authority and Page Authority is very high, it can be a concern. It isn't a strong signal, but it is one worth looking at. This is especially obvious in certain types of spam, like paginated comment spam or forum profile spam.
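The DA-vs-PA check above can be automated with a simple gap rule. The field names and the gap threshold here are assumptions for the sketch, not Moz output, and as noted this is a weak signal best used alongside the others:

```python
# Sketch: flag links where Page Authority lags far behind Domain
# Authority, a pattern typical of paginated comment spam or forum
# profile spam. Field names and the 30-point gap are assumptions.

def flag_low_authority(links, max_gap=30):
    return [link["url"] for link in links
            if link["domain_authority"] - link["page_authority"] >= max_gap]

links = [
    {"url": "https://forum.example/profile/12345",
     "domain_authority": 70, "page_authority": 12},  # gap 58: flagged
    {"url": "https://blog.example/post",
     "domain_authority": 55, "page_authority": 40},  # gap 15: fine
]
print(flag_low_authority(links))  # ['https://forum.example/profile/12345']
```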