‘Black-Hat’ SEO Techniques

The following techniques are known as ‘black-hat’ SEO and should be avoided on your web site. They can sometimes produce very quick improvements in search engine results, but over time they are known to cause sites to be banned and removed from search indexes altogether.

Hidden Text
An old technique for increasing the keywords on a page was to include long lists of keywords and key phrases as hidden text. This was sometimes achieved by placing the text far below the main content on the page, or by displaying the text in the same colour as the background, i.e. white on white or black on black. This technique goes against Google’s webmaster guidelines.
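As a purely illustrative sketch of the pattern to avoid (the keywords and styles here are made up), hidden keyword text typically looked something like this:

```html
<!-- Black-hat pattern: keyword text rendered white-on-white so visitors
     cannot see it, but crawlers still read it. Do NOT do this. -->
<body style="background-color: #ffffff;">
  <p style="color: #ffffff;">
    cheap flights cheap hotels cheap car hire discount holidays
  </p>
</body>
```

Modern crawlers compare the rendered colours of text and background, so this sort of trick is straightforward for search engines to detect.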

Cloaking or Doorway Pages
Don’t deceive your users or present different content to search engines than you display to visitors. Matt Cutts (a Google engineer) describes a classic case of this on his blog (http://www.mattcutts.com/blog/ramping-up-on-international-webspam/), in which BMW showed one set of text to search engine robots while normal web users were sent to other content via a JavaScript redirect. Do not use quick redirects so that search engine crawlers see certain content while a user’s browser is redirected to another page.
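A simplified sketch of this kind of JavaScript-redirect cloaking (a hypothetical reconstruction, not BMW’s actual code) looks like the following — crawlers of the time did not execute JavaScript, so only human visitors were redirected:

```html
<!-- Cloaking pattern to avoid: the crawler indexes this keyword-stuffed
     page, but a real browser executing JavaScript is sent elsewhere. -->
<html>
  <head>
    <script type="text/javascript">
      // Human visitors never see the keyword page below.
      window.location.replace("realpage.html");
    </script>
  </head>
  <body>
    <h1>used car used cars new car new cars car dealer</h1>
  </body>
</html>
```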

Link Schemes
Avoid link schemes that promise a massive increase in incoming links from bad neighbourhoods and other sites of dubious content. Participating in such schemes can result in your site being penalised.

Automated Search Engine Submission Software
Avoid using search engine submission software such as ‘WebPosition Gold’ or other similar products. As long as your site is linked to from other sites and is up and running with a valid robots.txt file, your pages and content will be indexed without the need for this software.
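For reference, a minimal valid robots.txt that allows all crawlers to index the whole site looks like this (it must be served from the site root, e.g. http://www.example.com/robots.txt — the domain here is illustrative):

```
# Allow every crawler to fetch every page:
# an empty Disallow line blocks nothing.
User-agent: *
Disallow:
```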

Duplicate Content
Avoid duplicating the same content on different pages. If Google detects large numbers of duplicated pages across different subdomains or main domains, you risk a ‘duplicate content penalty’, which can cause the site to lose rankings. Subdomains are treated as separate websites, and if duplicate content is found on them, both sites can suffer ranking problems until Google’s algorithms determine which site the content originated from.
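When the same content is legitimately reachable at more than one address, one safeguard search engines support is the rel="canonical" link element, which declares which URL should be treated as the original (the URL below is illustrative):

```html
<!-- Placed in the <head> of each duplicate page,
     pointing at the preferred version of the content. -->
<link rel="canonical" href="http://www.example.com/original-article.html" />
```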

Content Solely for Search Engines
Avoid publishing any content that exists solely for search engine spiders. Content that is too dense with keywords and unintelligible to a human can be detected by Google and other search engines and can result in site penalties. Always write content first for humans and second for crawlers and robots. As long as the pages read as natural English, there is no problem with including keywords.

Frames
Although not strictly a ‘black-hat’ technique, frames should be avoided because they cause serious problems for crawlers indexing sites. Crawlers can only index individual pages, so each page within a frameset is indexed separately; when a visitor clicks through to one of those pages from search engine results, it loads outside its frameset, stripped of the site’s navigation and surrounding context.
