Hi Everyone – As a firm, we’ve always looked to companies like SEOmoz for direction. From their top-of-the-line software, extensive blog, and branded “Whiteboard Fridays” to the fact that they have actually created an SEO standard in the community – “Hey, what’s your MOZ score?” (referring to SEOmoz’s software rating system for SEO) – SEOmoz is a leader in this industry. We’ve even added the word SEOmoz to our spell-check dictionary. :)
Rand Fishkin is the CEO of SEOmoz. He has written several books relating to SEO, has received several prestigious awards, and has completely enmeshed himself in the world of SEO and internet marketing.
We were very pleased when we found out that he had agreed to do an interview with us. We reached out to him last week, a few days after Google’s Penguin update, when we were all still scratching our heads at some of the things that had happened. If you are in the industry, I know you are going to appreciate this:
Patrick: Great to meet you – it’s great being in an industry where a small company like ours can connect with “the biggest” company out there. We are already seeing companies offering blackhat solutions to this update, so I think it’s important that we continue to share healthy information about SEO, and link building in particular, and that is why we’ve asked you to do this interview.
Rand: Well, to be fair, we’re only 60 people and considerably smaller than the tech giants in most other fields, but I agree that the ease with which connections can be formed in the web marketing and startup fields is quite awesome. For a long time, I felt that information about SEO in particular, and inbound marketing more broadly, hasn’t been as accessible as it should be. That’s actually the reason I started the SEOmoz blog all those years ago, and it’s a big reason why I think it’s important to keep sharing information, data, tests and case studies.
So, thanks for having me!
Patrick: Getting right into it, fundamentally what do you think was Google’s goal with the April 24 algorithm update – i.e. do you think they were targeting affiliate marketers, do you think they wanted to improve quality search results, etc?
Rand: I generally believe Google when they say Penguin was about removing the value of manipulative links and penalizing some sites that had engaged in very obvious gray/black hat linking tactics. I don’t think there was a specific target around improving the quality of search results, though – which makes this one of the more interesting updates from a philosophical point of view. Google’s webspam team has long said that the ends don’t justify the means, i.e. just because you’ve got a great site or great content doesn’t mean they won’t penalize you if you’re engaging in manipulative practices.
Penguin’s one of the first major updates we’ve seen in a long time that isn’t directly tied to improving search, but rather, to enforcing the rules around link acquisition. I’m personally glad to see it, but I can understand why so many in the SEO world are frustrated. Google’s been relatively “hands-off” on bad links for years now, and it’s taken many by surprise. Naturally, there’s a tendency to resort to accusations of a conspiratorial or “doing evil” nature, but Google webspam’s merely doing what they’ve always said they would (and not in a particularly large-scale way; I’d be surprised if there weren’t more aggressive updates of this nature coming soon).
Patrick: We’ve heard that a lot of “branded” sites fared very well in this past update. Why do you think this is?
Rand: Brands have all the signals search engines want to better measure and interpret and they’re what Google wishes they could show more of. They don’t generally like affiliate sites, thin content sites, or anything that’s merely taking advantage of their ability to outrank brands because they’re more nimble or more able to execute on the intricacies of SEO. Google wants to rank the things that make their results look the most legitimate and feel the most intuitive and well-executed to their users. Brand familiarity is a huge part of that. Users trust brands. They don’t trust generics. They generally don’t trust keyword-match-domain names. We can complain about this all we want in the SEO field, but it’s not Google that needs to be swayed – it’s the consumers who search. Put relevant results from brands consumers know and trust in a search engine, and you get happier searchers who’ll search more often, click more results and make more purchases.
Brands also, of course, inherit a lot of the individual ranking elements Google likes: branded search volume, type-in and direct traffic, lots of editorially given white hat links, user & usage data signals, etc. As Google gets better at catching and filtering out manipulative signals and focuses more energy on the ranking inputs that better reflect the real world, brands win. Like it or hate it, that’s the world we live in, and in my view, it’s far better to do everything in your power to become a brand than to fight against the current.
Patrick: In relation to the Penguin update, what advice would you give to a very low-competition site (500 exact-match local monthly searches or less) that has a city before or after its primary keyword? It’s tough for a site like this to build big authority links, especially with a small budget – what are some of the best ways to get quality links?
Rand: Content marketing has been a clear winner for a lot of small brands – blogs, research, linkbait, videos, particularly the kind that get very popular on social networking sites. I’d also strongly suggest a lot of participation in communities, both online and off. Companies that get active in their local region through meetups, sponsoring/hosting events, building relationships, etc. form a lot of natural links and bonds that generate authentic recommendations. Other suggestions for specific keyword links would be: embeddable content, online relationship building with every site/blogger/social media stalwart in your town, and inclusion in any local lists possible (including old-media lists, which often come with a link on the web).
Patrick: Do you believe that there is a sweet spot for having the correct amount of anchor text variation? It’s hard to build a “natural” link profile when we are manually adding links, what is the best way to mimic this?
Rand: I strongly suspect we’re looking at correlation and not causation here. A lot of anchor text keyword matching is a strong sign that there’s been some manipulative link acquisition, but I’m not convinced that by itself, it yields a penalty or devaluation (though it’s very possible it means someone at Google webspam will take a close look at you).
I’d also say that anytime you’re trying to “mimic” natural behavior, you’re probably in trouble (or will be in it soon). Natural link acquisition, even when done manually, should feel 100% organic and white hat. When it starts to feel like you’re manipulating things to “look natural” you’re in dangerous territory and need to re-think your tactics.
Patrick: Regarding the sites that got hit with a -100 or worse in Google, do you think that this is considered a penalty or backwards movement?
Rand: -100 is likely a penalty. Same with -30. If you fell 14 spots or 22 spots or even 64 spots, but still rank for obvious exact matches of your brand/title keywords on your pages, you’re probably not penalized, you’ve just had a lot of your backlinks lose their ability to contribute to rankings.
Patrick: We don’t like to “out” niches here, but since April 24 we’ve been seeing a lot of “under-optimized” sites appearing in the top slots of the search engines – i.e. sites that have zero backlinks and no evidence of on-page optimization. What do you think this is all about?
Rand: My strong guess is that when Google killed the value of a lot of manipulative links, plenty of others were left still working. Thus, you’re seeing the second level of spam rising to the top using link tactics Google didn’t catch (but may in the future). The fact that the third-party link indices haven’t seen or can’t see what they’re doing doesn’t mean these sites don’t have a lot of link equity from spammy sources (or redirected domains, or hacks, or the like). We’ve seen a lot of the most aggressive spam block Mozscape/OpenSiteExplorer.
Patrick: Matt Cutts didn’t say anything about the role that social media/social signals played in his blogspot post. We know that “social = good,” but are there any lessons learned (regarding social) from this past update?
Rand: Social’s great for ranking for logged-in users (who are making up a larger and larger part of the pie) thanks to Google+ and SPYW – particularly Google’s incredibly powerful but rarely discussed social connections page (October 2015 edit: this link now 404’s – https://www.google.com/s2/u/0/search/social). Social also produces a ton of great white hat links that are “earned” through the attention that follows continuous success on social channels. As an example, here are 90 pages linking to a URL I tweeted that Google has indexed: https://www.google.
Patrick: The biggest problem that we’ve had with our SEO clients in this update is feeling the wrath of Google as a result of links that were built prior to our contract with them, be it link packages they bought on Fiverr or links from blackhat SEO companies. One site in particular has hundreds of international forum profile links with the same anchor text, and has now dropped 10–20 pages in the SERPs. What advice would you give to help dig them out of this hole?
Rand: Unfortunately, this is a case where the manual work of getting rid of the links, and showing Google that good-faith effort, is required. I’d suggest checking out Dr. Pete’s two excellent posts on this topic: https://www.seomoz.org/
Patrick: Do you think the update is completely rolled out? Do you think we will see version 2 etc like we did with Panda?
Rand: I very much hope there’s more to come with this update. I think Google was likely aggressive in some areas and not aggressive enough in others with Penguin. I don’t know if we’ll see another update soon, but I suspect there will be iterations, or at least, new releases that continue to focus on removing value from manipulative links.
Patrick: Lastly, a personal question if you don’t mind. You have the reputation of being a very “by the book” type of person when it comes to SEO. Is this the way you were brought up, and is this the way you are in real life?
Rand: Hard to say… In some ways, yes, in other ways, no. I definitely deviated from what my family wanted for me by not graduating from college. I went deeply into personal debt to fund a business that was going nowhere. I started a company with my Mom. We pivoted from consulting to product. None of those are conventional or widely accepted as good moves. In fact, I’d be hard pressed to argue that anyone should follow my crazy path over the last decade.
I’d also probably argue against being a “by the book” operator in the SEO field. The only grain of truth there is that I don’t believe it’s wise for businesses who want to build something lasting on the web to violate Google’s webspam guidelines. I’m hugely supportive of disruption, of creativity, of doing what others don’t or won’t. “By the book” SEO sounds to me like the approach taken by what some in the community used to call the “pointy white hats” of merely doing keyword research, creating accessible pages and letting the “marketing work” happen on its own. That’s definitely not me.
Patrick: Again, thank you so much for your time and most valuable input. Your company is a leader in the industry and companies like mine continue to look at yours for answers during these times.
Rand: Thanks so much, Patrick! I hope my answers have been helpful, and I look forward to more discussion around these topics in the field.
Please forward all comments to [email protected] or leave them below.