2 Forgotten SEO Skills: Prediction & Anticipation
SEO is tough. Or at least, you think SEO has become significantly tougher lately because your old-school tactics no longer work. After the launch of Penguin 2.1 in particular, almost all sensible SEOs stopped using links built with automated processes and tools on the first tier of their serious websites.
And… this is a bit disappointing for people loyal to the black hat variant of SEO. With Matt Cutts determined to “break the spirits” of search spammers, this plainly looks like a bad time for black hat SEOs, who should be really careful with their strategies from now on.
Before you start to think… “ahh! here comes another crappy anti-blackhat post…”, let me declare that this isn’t going to be your average anti-blackhat post. I won’t even go on to tell you about the disadvantages of the dark side of SEO; someone from Matt’s team can explain that to you much better than I can.
What I’ll be doing instead is analyzing the practical effectiveness of black hat SEO depending on the goals involved, and discussing the advantages of using a prediction-based approach in your overall SEO campaign.
Black Hat SEO – It’s Still Effective
Yes. It’s still effective. I’m not sure about hidden text (punch me in the face if that still works in 2013), but when it comes to link building, various black hat automation tools still work – in the right hands.
One thing Google has certainly made far less effective is noobs using black hat link building tools. In my opinion, the level of sophistication that Google’s current spam-filtering algorithms demand of your techniques is almost impossible to attain unless you have a real passion for tools, automation and technical things in general.
First, allow me to talk about a few areas where black hat SEO is still working like a charm:
- YouTube videos – working as well as ever, with spammed-to-death YouTube videos ranking on the first page of Google for some fairly competitive terms.
- Any sub-site on a high authority root domain, but not on a sub-domain. For example, Rebelmouse pages. Build any page, get an address like rebelmouse.com/page and then freely use black hat link building tools to make your way to the top of search results, with the domain authority and trust of the root domain becoming your friend.
- Various types of parasite pages – like pages on sites like Squidoo, Storify etc. They’re, as of now, more penalty-proof than an average .com domain.
And that’s just 1% of the game.
That said, effectiveness alone shouldn’t dictate your approach; what makes sense depends on your situation. The truth is, a local drug store that just went online with hopes of increasing sales can probably spend its money more efficiently by simply targeting organic or paid search traffic, instead of focusing on split-testing and advanced CRO (Conversion Rate Optimization) topics.
Similarly, a relatively larger brand relying entirely on the internet for its revenue shouldn’t put all of its eggs in one basket by focusing entirely on organic search, a particular referral source, or social traffic. Traffic is good, but only up to a certain extent, and it’s useless if your visitors don’t convert. That’s where CRO comes into play, and it certainly should be a part of any decently large brand’s overall inbound marketing strategy.
Now, let’s talk about a different issue. Suppose I’m part of an inter-student search ranking competition (how good would it be if that existed in real life) organized by my college. We each have a particular keyword to rank for, and it’s totally unique, so external competition doesn’t come into play. Every student is given $100 to rank their webpage higher than everyone else’s. But the competition spans only two weeks. In that case, how would you spend the $100?
Would you think of content marketing? Or perhaps a strong social media campaign for your webpage? Truth be told, while you’re busy being proud of your intelligence, thinking day and night about a unique content marketing campaign, some other student would just order some private blog network posts, a few social bookmarks, and a few social shares to make it look natural, all for half the money, and spend the rest at the nearest KFC. Still, after the two weeks, he’d be unbeatable.
Now, I don’t know if that hypothetical example was lame, but my point is this: execution should depend entirely on the goal(s), budget and various constraints involved.
A Prediction-Based Approach to Your SEO Efforts
When you take a prediction-based approach to your SEO campaigns, you’ll automatically be safe from the wrath of the big G in the future. The predictions mainly rest on how Google would like to see things. If you were in Google’s position, would you be able to resist pushing the glowing ‘ask Matt Cutts to take care of spam’ button?
A few months ago, I wrote a post on my personal blog titled: “Anyone With a Working Brain Can Predict the Future of SEO”. I actually meant it. But the theory rested on a big assumption – that Google will be able to implement whatever it wishes to. What was the biggest problem with Google Search in 2009? Its results were far too dominated by spam produced by black hat link building. Was Google able to solve it? Living in the last few days of 2013, I have to admit it has been greatly successful at that.
Converting these predictions into actionable tips for yourself can get a bit tricky at times, but some things are easier to predict than others, and those are also easier to make use of.
Take high quality content. Google, as a company totally dependent on real people with real interests searching for information online, is committed to providing users with what they actually want. So, if users are tired of low quality spun crap, it means Google will eventually focus on driving people to create better quality content, and on ranking it better in the SERPs.
So, if tomorrow 90% of the world’s population goes crazy and wants nothing but funny cat pictures, I can’t see any reason why Google won’t advise more and more webmasters to post cat pictures on their sites.
Well, you get the point.
I think the future lies in the actual usability and usefulness of websites. A great way to find out what your target audience wants is to conduct a survey using something like Qualaroo or SurveyMonkey, or simply to ask people individually.
If most prefer more video content, it’d be wise to brush up your YouTube marketing skills. Similarly, if they want more visual content on your site, by all means get some graphics designed by hiring someone on oDesk or Elance, or run a graphic design contest using something like 48Hours Logo and Dexigner.
A good thing about this prediction-based approach is that it almost always suggests doing something that indirectly benefits the end user. I mean, I’m sure (and these people are too) that website loading speed is a greatly over-hyped ranking factor. Whether it mattered or not, as soon as people heard it might have an impact on rankings, they went crazy about load times, often fussing over differences of a few hundred milliseconds.
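If you do want to sanity-check your own load times before obsessing over them, one quick way is to time a full page fetch with curl’s write-out variables. The URL below is just a placeholder; swap in your own site:

```shell
# Rough page-load timing with curl: DNS lookup, TCP connect, and total
# transfer time, all in seconds. https://example.com is a placeholder.
curl -o /dev/null -s \
  -w 'DNS: %{time_namelookup}s  Connect: %{time_connect}s  Total: %{time_total}s\n' \
  https://example.com
```

Keep in mind this only measures the raw HTML transfer, not rendering; a difference of a few hundred milliseconds here is exactly the kind of number people tend to over-optimize.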
In the end…
It all comes down to your actual purpose. If you analyze your particular situation and find the white hat variant more beneficial, by all means go the white hat route and help develop the web into what end users (and hence Google) want it to be. If you think black hat methods will be more useful for your purpose, by all means do whatever you can to improve your website’s organic search rankings; there’s no one to actually stop you.