Unfortunately, Google has never set the record straight on what exactly link devaluation is (I dug for 20 minutes through site:google.com "link devaluation" and found nothing), so we can only form theories based on our own experiences. In this document those will be my own experiences, so please feel free to point out any differences of opinion.
Link devaluation – the process by which Google completely ignores a link's existence, authority, and presence.
Penalty – the process of punishing a domain in Google's organic rankings by moving it to later pages, normally a -10, -30, or -50 downward movement in the SERPs. Sometimes a site is penalized for just one keyword, other times a single page is penalized, but what we have mostly seen in the past few years are domain-wide penalties.
Regarding link devaluation, there is a lot of speculation about how the process happens. One theory is "manual devaluation": Google employees evaluate a domain and devalue its links based on a quality score. Note that your domain is not getting devalued – the link that was pointing to your domain is. This might cause a drop in rankings, but it is in no way a penalty.
The other theory is the algorithmic one, whereby Google's algorithm scans a domain against predefined quality guidelines and devalues a link accordingly.
In some cases (a theory based on personal experience), Google will trigger a penalty when a certain threshold of devalued links is met. For instance, if your site is constantly building links and those links keep getting devalued for low quality scores, that pattern could trigger a manual or algorithmic penalty.
When speaking of a devalued link, we are talking about that specific domain-to-domain hypertext reference. What is not clear is whether, when a link is devalued, the linking domain's entire outbound link portfolio is devalued or just that specific relationship. Again, Google has never released a whitepaper on this subject, so this is only theory.
The easiest way to see if you’ve received a manual penalty is you will have an “unnatural link warning” message in your Webmaster Tools inbox.
There is one quick way to distinguish a penalty from a devaluation: time. If you are seeing a gradual decline in your rankings across a variety of different keywords, you are most likely experiencing link devaluation. If you wake up one morning and all of your keywords have dropped 10 pages, it is probably a penalty.
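To make that time-based distinction concrete, here is a minimal sketch. The thresholds and function name are my own illustrative assumptions, not anything Google has published:

```python
def classify_drop(daily_positions):
    """Classify a ranking change from a list of average SERP positions,
    one entry per day, oldest first (position 1 = top of page one).

    Assumption: a ~10-page (100-spot) overnight drop looks like a
    penalty, while a slow slide looks like link devaluation."""
    if len(daily_positions) < 2:
        return "insufficient data"
    overnight = daily_positions[-1] - daily_positions[-2]
    if overnight >= 100:  # roughly 10 pages of 10 results in one day
        return "likely penalty"
    if daily_positions[-1] > daily_positions[0]:
        return "likely devaluation"  # gradual decline over the period
    return "no drop"
```

With made-up numbers, `classify_drop([5, 8, 14, 23])` reads as a devaluation-style slide, while `classify_drop([5, 6, 7, 110])` reads as a penalty.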
Client example – we've been in the proposal stage for several weeks now with a client whose business has lost organic traffic. The owner's and webmaster's main complaint was that traffic was down due to decreased rankings, but they weren't sure why or how. This client had an eCommerce site registered in 1996, and they'd used half a dozen different SEO companies over the past few years. From their link portfolio, we could see that these SEO companies built all kinds of shady links: spammed wiki sites, mass blog comments, and article directories. After looking into their analytics, current rankings, and Webmaster Tools, we came to the conclusion that the recent drop in traffic was in fact due to the site dropping in organic rankings, but the site did not incur a penalty. The site dropped in rankings because the links pointing to it were devalued; the juice that once flowed to the site and empowered it to rank is no longer flowing. All signs show that the site is still in Google's good graces, it just needs new link love.
Anyway, if you want a good idea of whether or not you have devalued links, navigate over to GWT and check your links. The links that you see in Webmaster Tools are the links that Google has "counted," or valued. If you see this number going down over time, then you are probably in the midst of your links being devalued.
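As a rough illustration, suppose you record the GWT link total once a week. A sketch like this flags a downward trend; the majority-of-weeks rule is my own assumption about what counts as "clearly declining":

```python
def links_being_devalued(weekly_counts):
    """weekly_counts: link totals recorded from Webmaster Tools each
    week, oldest first. Returns True when the trend is clearly
    downward: the latest total is below the first and most
    week-over-week changes are declines."""
    if len(weekly_counts) < 3:
        return False  # not enough history to call a trend
    declines = sum(b < a for a, b in zip(weekly_counts, weekly_counts[1:]))
    return weekly_counts[-1] < weekly_counts[0] and declines > len(weekly_counts) // 2
```

For example, `links_being_devalued([500, 480, 470, 450])` returns True, while a count that bounces around but ends higher returns False.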
Note: devaluation does not always happen at a steady rate; one link might get devalued in a year, then 500 might get devalued two weeks in a row. It depends on your link velocity and Google's index.
Updates, or as some people like to call them "penalties," such as Penguin and Panda were not one-off events; they were what I like to call "patches" made to the algorithm that are refreshed from time to time (Google has referred to these updates as "algorithm changes" in the past). That being said, when an update is made to the algorithm it will most likely apply itself again and again over time. We saw a good example of this when Penguin was released and then "refreshed" several times throughout the following months.
Some people might say, "My rankings dropped recently, so I must have had a penalty."
Not necessarily. If a round of links was devalued over a short period of time, it might look like a penalty. What most likely happened is that, now that those links "no longer count," your site has lost authority (this can be measured with a number of tools, such as the page and domain authority metrics in SEOmoz's suite), and you were then surpassed in the rankings by sites with more authority/value than yours.
This part is really simple: to keep your site free from link devaluations and out of the way of updates, all you need to do is follow Google's guidelines. They really do not ask much; I believe it is eight pages printed out in total. Rand Fishkin stated in a Google+ post from last year:
“You can’t just remove the bad, you’ve got to also replace them with good ones. Stop thinking of penguin as a “penalty” and more of a change in what counts and what doesn’t.”
I actually have that post bookmarked in my toolbar because of how powerful that is.
You need to continue to build good properties/pages in order to gain better link opportunities. A few really obvious points: don't ever automate backlinking, don't create a ton of links within a short period of time, don't create links on pages that have nothing to do with your site just because you can, and don't create links on "open" frameworks just because "they left it open." Stay away from sitewide links. I've tested this on a number of sites and have found that sitewide links can end badly quicker than non-sitewide ones, but that is another story.
June 2013 Update – Google has released some new information worth checking out all about penalties and backlinks.
If you are comparing your link portfolio in a tool like Ahrefs against the portfolio in Google Webmaster Tools and you notice there are 500 links in GWT and 1,000 in Ahrefs, don't take the difference and start freaking out. Also, do not disavow those links; Google most likely devalued them for you already.
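If you do want to see exactly which links fall into that gap, a simple set difference does it. This is a sketch: `ahrefs_links` and `gwt_links` stand in for linking-domain lists you would export from each tool yourself.

```python
def uncounted_links(ahrefs_links, gwt_links):
    """Return linking domains that a third-party index (e.g. Ahrefs)
    reports but Webmaster Tools does not, i.e. links Google may have
    already devalued. Comparison is case-insensitive."""
    gwt = {d.lower() for d in gwt_links}
    return sorted(d for d in {a.lower() for a in ahrefs_links} if d not in gwt)
```

The result is a candidate list to investigate, not a disavow list; as noted above, these are links Google has likely already stopped counting.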
Google devalues links as a normal day-to-day part of its algorithm. We are now seeing the search algorithm do this on a far more continual basis, since it is not feasible for Google to constantly roll out new updates and hand out manual penalties and devaluations. This also makes fresh updates easier and less dramatic for webmasters when they do roll out.
What this means is that we will see fewer shocking and devastating, or as Matt Cutts put it, "jarring and jolting," penalties than we have in the past. That is not to say Google will not develop a new "algorithm change" that massively affects rankings. We are simply saying that, regarding links, penalties or drops in rankings will happen on a constant, continual basis rather than all at once.
Update: Google has released a few more specific updates, including the update that targeted payday loan related websites.
It's not difficult to see what spammers are up to lately. A lot of them are up to the same tricks: mass blog commenting, forum linking, article posting (gasp), etc. Google will continue to treat these links the same. More sophisticated scripts are also being released, designed to build links on niche frameworks so that they appear to be legitimate links.
Prediction: Google will (and probably already does) comparatively use data from disavowed-link submissions. Let's say Google receives disavow requests naming the same domain from 100 totally unrelated sites. That disavow data is sent to a server where it is compared, crunched, and merged/appended into a file. My prediction is that each domain will then be scored by how many times webmasters have requested it be disavowed, and then sent for algorithmic processing to see whether it meets strict quality guidelines, based on a number of webmasters' admissions that the domain is related to webspam.
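A sketch of how that aggregation might work. This is entirely hypothetical: the threshold, the one-vote-per-webmaster rule, and the function names are my assumptions about a system Google has never described:

```python
from collections import Counter

def flag_disavowed_domains(submissions, threshold=100):
    """submissions: one list of disavowed domains per webmaster.
    Each webmaster counts once per domain, so a single site listing
    the same domain repeatedly in its file doesn't inflate the score.
    Domains named by at least `threshold` distinct webmasters are
    flagged for further algorithmic quality review."""
    votes = Counter()
    for domains in submissions:
        votes.update(set(domains))  # dedupe within one submission
    return {domain for domain, n in votes.items() if n >= threshold}
```

The one-vote-per-submitter design choice matters here: without it, a single hostile webmaster could push a competitor's domain over the threshold on their own.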
I was going to cover some of the relationships between negative SEO and devaluations as well as the disavow tool and devaluations but we will save that for another day.
Thanks everyone for stopping by, please leave comments below.