There are literally thousands of factors that influence the natural listings, and it goes without saying that not all of them will have a positive effect on your results. I thought it would be useful to identify the mistakes that, as an SEO programmer, I see all too frequently, and that any site owner or developer should steer clear of.
Spamming – "spam" is a word used excessively when talking about modern internet usage; in search engine optimisation it means deceiving the spiders that read your code. Its most common form is the same text used repeatedly in anchors or bold. Some people believe that by cramming a keyword in 50 times, Google will pick up on its importance and rank them higher. This is true to an extent, but the number of occurrences needs to sit at an optimum level relative to how many words the page contains.
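To make that last point concrete: the relevant figure is not the raw count of a keyword but its density, i.e. occurrences divided by total words on the page. A minimal sketch of such a check (purely illustrative; this is not how any search engine actually scores a page) might look like:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return how often `keyword` appears as a fraction of all words in `text`."""
    # Lowercase and split on word characters so "SEO" and "seo," count the same
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A page where 2 of 5 words are the keyword has a density of 0.4 (40%) --
# far beyond anything that reads naturally.
print(keyword_density("SEO tips for SEO beginners", "seo"))
```

The point of the ratio is exactly the one made above: 50 occurrences is fine in a 5,000-word article and blatant stuffing in a 200-word one.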
Duplicate content – Text that already appears anywhere else on the World Wide Web is classed as duplicate content, and a site on which it is found is essentially stripped of any importance by the major engines. Although it is time-consuming and tedious, I cannot overstate the importance of writing fresh content about your main keywords across multiple pages of your site.
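One common way duplicate text is detected (an illustrative technique, not a claim about any particular engine's internals) is by comparing overlapping word "shingles" between two documents and computing their Jaccard similarity, which is 1.0 for identical text and 0.0 for completely distinct text:

```python
def shingles(text: str, n: int = 3) -> set:
    """Break text into overlapping n-word tuples (shingles)."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets: |A ∩ B| / |A ∪ B|."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Lightly rewording a paragraph leaves most shingles intact, which is why simply shuffling a few words around does not make copied content "fresh".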
Low quality and reciprocal linking – Linking is high on the agenda for search engine ranking: good-quality inbound links will take you to the top, while low-quality ones have a negative effect. Links from pages in bad neighbourhoods, pages irrelevant to your subject, or free-for-all directories are totally worthless; similarly, reciprocal links and link exchanges are no longer worthwhile.
Hidden text – This is absolutely criminal by web programming standards. As soon as Google discovers that a site has text in the same colour as its background, sirens will ring and the site will be banned from the listings. No matter how clever you think it is, don't do it!
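The simplest form of this trick is trivially machine-detectable, which is part of why it is so risky. As a naive illustration (real crawlers do far more than this, including evaluating CSS files and rendering the page), a check for inline styles where the text colour matches the background colour could be sketched as:

```python
import re

def has_same_colour_text(html: str) -> bool:
    """Flag inline styles where color and background-color are identical.

    Purely illustrative: only looks at style="..." attributes, ignores
    external stylesheets, colour-name aliases, and rendered appearance.
    """
    for style in re.findall(r'style="([^"]*)"', html):
        # Negative lookbehind stops 'color:' matching inside 'background-color:'
        fg = re.search(r"(?<!background-)color:\s*([^;\"]+)", style)
        bg = re.search(r"background-color:\s*([^;\"]+)", style)
        if fg and bg and fg.group(1).strip().lower() == bg.group(1).strip().lower():
            return True
    return False
```

If a four-line script can spot the pattern, assume the engines can too.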
We would love to hear anyone else's views on poor SEO tactics, or alternatively what has proved successful in your experience.