Business best practices can change quickly, but perhaps none quite as fast as Search Engine Optimization (SEO). The purpose of SEO is to influence the rankings of a search engine, like Google or Bing, so that your web pages come up higher in search results for queries relevant to your business. This can be extremely beneficial, as it brings more traffic to your website, boosting your conversions and, ultimately, bringing in more revenue.
As with any practice that can generate income for your startup, there will always be those who try to exploit it. As a result, Google and other search engines are constantly improving and adjusting their algorithms to close any exploits and to serve up the best quality results pages possible.
These changes have left many websites stranded in no-man’s land after using questionable tactics that the algorithm no longer tolerates. The tactics listed below might have worked once, but they will no longer serve your startup today.
Links from social bookmarking
Social bookmarking was once highly recommended because it got results, plain and simple. But Google and other search engines know these links don’t mean anything, because anybody can submit their website to these lists, and they have adjusted their algorithms to nullify the benefit.
While some might argue that there is no harm in a tactic like this, many would disagree and see such links as spam, and you never want to associate your website with spam. Steer as far away from social bookmarking as possible, and from almost any backlink that can be gained through a simple submission.
Search engine submissions
Search engine submissions are another bad SEO practice because they’re a complete waste of time and money. Submission services rely on misdirecting newcomers, telling them that they’ll never appear in Google or Bing unless they pay to have their site submitted to the search engines.
As with many of these out-of-date practices, there is some truth to the idea that Google may have once needed your help to identify new sites. However, that time is gone, and it has been for at least a decade.
Google finds new sites very quickly, especially when other reputable websites link to yours. Rather than wasting money on these services, you’d see better results from hiring a quality SEO agency or social media team.
On-page keyword ratios/densities
For years, many SEO gurus spread the myth of a precise on-page keyword density long after Google stopped using it as a ranking signal. A keyword is a phrase that people might search for in Google and for which you would like to rank. A great example of this would be the phrase “mechanic NYC” for a local car mechanic in New York City.
Keyword density refers to the number of times the keyword is used in the article, expressed as a percentage of the total number of words on the page. Google and other search engines once used keyword density as an indication of how relevant a page was to the search phrase, so in primitive iterations of the Google algorithm, you could rank higher by using keywords more frequently.
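To make the arithmetic concrete, here is a minimal sketch of the old calculation in Python (the function name and sample figures are ours, purely for illustration):

```python
def keyword_density(keyword: str, text: str) -> float:
    """Occurrences of a keyword phrase as a percentage of total words.

    A naive substring count, kept deliberately simple to illustrate the
    old metric (it would also match "mechanics" when given "mechanic").
    """
    words = text.lower().split()
    occurrences = text.lower().count(keyword.lower())
    return 100 * occurrences / len(words)

# A 500-word page that mentions "mechanic NYC" 10 times scores
# 100 * 10 / 500 = 2 percent.
```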
However, it’s been common knowledge for years now that Google and Bing use a method called TF*IDF, rather than keyword density, to calculate the importance of terms on a page.
TF*IDF stands for Term Frequency * Inverse Document Frequency, and it produces a composite weight for each term in each document. It is far more effective than a raw density figure because it recognizes that a term appearing in fewer documents carries more meaning each time it is used, while a word like “and” has practically no value because it appears everywhere.
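To see the difference in action, here is a toy TF*IDF calculation in Python. Google’s production formula is far more sophisticated and not public, so treat this strictly as a sketch of the idea (the corpus and function below are ours, for illustration only):

```python
import math
from collections import Counter

def tf_idf(term: str, document: str, corpus: list[str]) -> float:
    """Toy TF*IDF: a term's frequency within one document, discounted
    by how common that term is across the whole collection.
    Assumes the term appears in at least one document."""
    words = document.lower().split()
    tf = Counter(words)[term.lower()] / len(words)  # term frequency
    docs_with_term = sum(1 for doc in corpus
                         if term.lower() in doc.lower().split())
    idf = math.log(len(corpus) / docs_with_term)    # inverse document frequency
    return tf * idf

corpus = [
    "the best mechanic in NYC and the fastest service",
    "the weather in NYC and other cities",
    "the history of the automobile and the people behind it",
]
# "mechanic" appears in only one of three documents, so it earns real
# weight; "and" appears in every document, so its weight collapses to zero.
print(tf_idf("mechanic", corpus[0], corpus))  # ~0.12
print(tf_idf("and", corpus[0], corpus))       # 0.0
```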
We know that Google has been using TF*IDF for years, so if you’re worried about the number of times you use a keyword, then you should refer to TF*IDF, not keyword density or ratios.
Are you a little bit confused?
Don’t worry. Google moved to TF*IDF and other more complicated equations precisely because it doesn’t want people to manipulate its algorithm. Google and other search engines want you to write your articles in a natural manner, so many would argue that the best way to optimize for keyword density and TF*IDF is to not worry about either at all.
Meta keywords
On a webpage, you can include a meta keywords tag, such as <meta name="keywords" content="mechanic NYC, car repair">, listing the keywords you think the page should rank for. This is incredibly easy to manipulate and has therefore had zero influence on your rankings for over a decade.
If anything, it’s possible that Google or other search engines look at this as a sign of a low-quality website because only spammers try to use meta keywords in this day and age.
Press releases
Although there are some very valid reasons for issuing a press release, proper press releases are incredibly expensive and challenging to pull off. Most of the time, when you pay for a press release service, you’re paying to have the same poor quality content copied across spam websites around the web. Google pays no attention to those links and could easily count them against you if it thinks you’re actively trying to manipulate its rankings.
Over-optimized anchor text
When one website links to another, it generally uses anchor text, which is the underlined, clickable string of the link (in HTML, the text between <a href="..."> and </a>). Search engines use this anchor text to tell their algorithms what to expect on the page the link points to. A link anchored with the text “best car” should lead to a page about the best cars, which is why anchor text is used as a ranking signal for the page being linked to.
As you might imagine, this is relatively easy to manipulate, especially if you’re able to control the links that you’re getting through guest posts, social bookmarking and submissions.
To improve its algorithm, Google released an update that became known as Penguin. This update targeted a variety of tactics that spammers and black hat SEOs were using to rank low-quality pages, and one of the biggest changes was to how much weight anchor text carries in the algorithm.
Most notably, you can no longer use large amounts of exact match anchor text, which is when you link to your pages using the exact keywords you want to rank for. There was once a time when you could dial your exact match anchor text up to around 40 percent, but today, creeping over 10 percent on a large enough scale is likely to see your website slapped with an algorithmic penalty. Instead of risking a penalty, earn and build links with natural anchor text, and you’ll never have to worry about an update like Penguin again.
Web 2.0s
Last but not least, we need to talk about a tactic that is still being used by unscrupulous SEO agencies across the globe. Web 2.0s, meaning Wix, Weebly, Blogspot and any other website builder that hosts your site on a subdomain, have been used as low-quality private blog networks for years.
If in doubt, there’s a very simple rule that you can follow to find out if a link is valuable: if you built a link that you’re reasonably confident nobody will ever see and click, it’s probably either of no value or potentially harmful.
By focusing on proper marketing tactics and non-manipulative on-page SEO, you’ll save yourself from reading an article like this 10 years from now and laughing at the tactics you used.