What kind of step are you willing to take for better rankings and more organic traffic?
For many years now, there has been an ongoing debate in the search engine optimization (SEO) world about whether “black-hat” or “gray-hat” tactics — that is, techniques that attempt to achieve quicker results by flouting the search engines’ guidelines — are acceptable.
While many commentators take a moralistic tone around this issue, I prefer to look at it in terms of risk. If you are willing to risk a Google penalty for the possible payoff of quicker or better rankings, then go for it! Just don’t be surprised when Google gets wise to what you’re doing and your traffic takes a nosedive. It doesn’t matter whether that happens months or years later; expect to pay the piper.
Personally, as someone who works with a lot of large corporations with much at stake, I steer well clear of black-hat and gray-hat techniques.
For anyone working on a domain they don’t want to go down in flames, there’s simply no way to justify gambling with a site’s authority and reputation in such a reckless manner. In the SEO world, there are plenty of people willing to take the risk. Many SEOs I know make the point that what is considered gray-hat and black-hat may be subjective, depending on the industry you are operating in.
While many SEO practitioners have years of experience in this field, Google’s algorithms get smarter by the day, and it’s becoming harder and harder for even the best SEOs to outsmart Google. That’s especially true for gray- or black-hat newbies. It’s fair to assume that many of Google’s algorithms learn and evolve autonomously. In such a world, gray-hat SEO techniques have become far riskier and simply not worth the effort.
So what are the risks and benefits (if any) of employing gray-hat and black-hat techniques in this day and age? For the record, Bing and Google are very, very clear on what goes against their policies. Here is a rundown of the most common gray/black-hat tactics and my insights on each.
Private blog networks
Private blog networks (PBNs) arose as a shortcut to building authority. The premise is simple: buy a bunch of expired domains with good domain authority and create a link back to your site.
Bingo! Instant backlinks and rankings, right?
Not really. These days, building a PBN takes a lot of effort and some sneaky tricks to avoid detection. For example, you’re going to want to look into each domain’s history and ensure it’s squeaky-clean: no history of penalties and no string of previous owners. Sites that have been bought and sold many times are a red flag to Google. You’ll also want to make sure your sites are hosted by different companies and have different internet protocol (IP) addresses.
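To see why the shared-hosting footprint matters, here’s a toy sketch (the domains and addresses are hypothetical and hard-coded; a real check would resolve live DNS records) of how trivially a network of sites sitting on the same IP can be spotted:

```python
from collections import defaultdict

# Hypothetical resolved A-records for a small network of sites.
# In practice you would resolve these with a DNS lookup, not hard-code them.
domain_ips = {
    "blog-one.example": "203.0.113.10",
    "blog-two.example": "203.0.113.10",   # same host as blog-one: a footprint
    "blog-three.example": "198.51.100.7",
}

def shared_ip_footprints(domain_ips: dict) -> dict:
    """Group domains by IP and return only IPs hosting more than one domain."""
    by_ip = defaultdict(list)
    for domain, ip in domain_ips.items():
        by_ip[ip].append(domain)
    return {ip: doms for ip, doms in by_ip.items() if len(doms) > 1}

print(shared_ip_footprints(domain_ips))
```

If a one-line script can surface this pattern, it is safe to assume a search engine can, too.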
Pros: Using PBNs means you have full control over your link building and can save time and money on link outreach.
Cons: If just one of the sites in your network gets hit with a penalty, it can quickly be passed on to any site you’re linking to. You can torch your entire network with one slip-up.
My Take: I wouldn’t touch PBNs with a 10-foot pole. Period.
Spun, scraped or keyword-stuffed content
Creating good content is time-consuming and expensive, so it’s no surprise that people look for shortcuts.
While most duplicate content issues probably result more from technical misconfigurations than from intentional deception, there are still plenty of people out there who think they can game the system by scraping other people’s content or “spinning” articles to create dozens of variations on the same piece, oftentimes injecting extra keywords for good measure (i.e., keyword stuffing).
For those of us who actually prefer informative, readable content, it’s a blessing that these spammy content tricks don’t work anymore.
Google’s algorithms now have a much more sophisticated grasp on grammar and natural language. As such, Google can spot these tricks a mile away, so expect to get slapped with a penalty if you try them. My advice: Just write something people will want to read.
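For illustration, keyword stuffing is trivial to quantify. This sketch (the sample text is invented, and the density metric is a toy of my own choosing, not anything Google has published) computes how much of a page is one repeated keyword:

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

page = ("Cheap widgets! Buy cheap widgets today, because cheap "
        "widgets are the best cheap widgets around.")
print(f"density of 'cheap': {keyword_density(page, 'cheap'):.0%}")
```

A density this lopsided reads as spam to a human instantly; modern language models need far subtler signals than this to flag it.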
Pros: Quickly and easily create new content. At scale!
Cons: Destroy credibility with your users and search engines as soon as you publish.
My Take: Do you seriously have to ask?
Negative SEO
It’s a cutthroat world out there, and sometimes people sabotage their competitors’ websites.
This is referred to as “negative SEO” and generally involves pointing spammy links at a site, buying links on behalf of the competitor, scraping their content and duplicating it across multiple sites, trying to crash the site by driving too much bot traffic to it (i.e., a DDoS attack), or even hacking into a site to insert malware or modify the content.
Not only are some of these tactics, like hacking, illegal, but Google is also getting better at detecting and ignoring things like spam links. In short, negative SEO is a huge risk, and there’s no guarantee it will even work.
One particularly nefarious negative SEO tactic is to pose as the competitor and launch a link removal campaign. That’s right, some of the link removal requests you receive as a webmaster are from forged senders.
Pros: None that are worth having.
Cons: You could wind up banned from the search results or in jail.
My Take: JUST DON’T. You will sleep better at night.
Buying links
Soliciting links from other websites to improve your ranking or boost traffic to your web pages is a vital part of SEO. It’s also time-consuming, frustrating and boring. But if you think buying links is the answer, think again. It’s against Google and Bing’s guidelines, so if either engine catches you, you can wind up with a penalty and have your rankings wiped out.
In addition, buying links is quite expensive. One Ahrefs study found that the average cost of buying a link is between $350 and $600. If you spent that money on a legit content marketing campaign, there’s no reason why you couldn’t achieve multiple links of higher quality for the same price.
Pros: Easier than traditional link building.
Cons: You’ll achieve a better outcome using white-hat link-building techniques, and you’ll probably spend less in the long run.
My Take: Buying links leaves an obvious footprint, and you’ll regret it when you have to launch a link-removal campaign to undo everything you built and then file a slew of reconsideration requests.
Cloaking
Cloaking refers to the practice of creating one type of content to display to a search engine spider (say, a page full of keyword-rich copy) while showing another type of content to the user (for example, an image-heavy page with sales copy).
As you might have guessed, sites that cloak pages are generally trying to hide something from search engine spiders while attempting to manipulate their Google rankings.
Cloaking is a high-risk technique that will get you penalized or even blacklisted by most search engines. On top of that, it’s easy for Google to catch you, simply by crawling your site from unpredictable IPs and with varying user agents. Search engines analyze a number of elements on your site in order to determine what you should rank for, not just the content.
If you’re cloaking your content, it won’t take long for Google to figure it out.
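As a sketch of why cloaking is so easy to catch, imagine a crawler fetching the same URL with two different user agents and comparing what comes back. Everything below is hypothetical: `fetch()` stands in for a cloaking server, and the 0.5 similarity threshold is my own invention, not a real engine’s heuristic.

```python
def fetch(url: str, user_agent: str) -> str:
    # Hypothetical cloaking server: serves keyword-stuffed text to bots
    # and a normal sales page to everyone else.
    if "Googlebot" in user_agent:
        return "cheap widgets cheap widgets cheap widgets"
    return "<img src='hero.jpg'> Buy our widgets today!"

def looks_cloaked(url: str) -> bool:
    """Flag the page if bot and browser fetches differ substantially."""
    bot_view = fetch(url, "Googlebot/2.1")
    user_view = fetch(url, "Mozilla/5.0")
    # Crude word-set similarity; real engines compare rendered pages.
    bot_words, user_words = set(bot_view.split()), set(user_view.split())
    overlap = len(bot_words & user_words) / max(len(bot_words | user_words), 1)
    return overlap < 0.5

print(looks_cloaked("https://example.com/"))  # the toy server cloaks, so: True
```

Two fetches and a diff. That is roughly the entire cost of detection, which is why this tactic has such a short shelf life.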
Cons: Your site will be penalized or even blacklisted by Google.
My Take: Danger, Will Robinson!
Using gray- and black-hat techniques requires a lot of sneaking around to obscure your online footprint and avoid being penalized. This seems fairly basic, but you’d be surprised at the people who disagree.
Some SEOs feel it’s up to you to decide if the additional effort and risk is worth it. If yes, they feel your goal should be to make your site’s “story” believable to the search engines and use gray-hat techniques to kick-start your SEO, then transition to more conventional white-hat techniques to gradually eliminate the risk over time.
My position on all this? Sticking to white-hat techniques from the beginning is actually less effort and involves less risk, with higher returns in the long run. As a bonus, you won’t be losing sleep at night worrying about the day that Google will finally come knocking on your door.
Fly well, my friends!
For webmasters affected by a manual action, it’s important to understand why a particular penalty was applied, what the consequences are and how to adequately address the key issues. Check out our “Google Manual Actions: Frequently asked questions and their answers” for help and insights.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.
The author
Stephan Spencer is the creator of the 3-day immersive SEO seminar Traffic Control; an author of the O’Reilly books The Art of SEO, Google Power Search and Social eCommerce; founder of the SEO agency Netconcepts (acquired in 2010); inventor of the SEO proxy technology GravityStream; and the host of two podcasts, The Optimized Geek and Marketing Speak.