Thursday, July 23, 2009

SEO

Does anybody out there agree with us that the system that the web crawlers apparently use to decide on a given site's importance, and therefore its ranking, seems somewhat invalid?
Apart from the site itself and the way it's constructed, the most important thing, we're told by Google, is the number of backward links to the site.
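For anyone curious what "importance from backward links" actually looks like, here is a minimal, purely illustrative sketch of a PageRank-style calculation, the published idea behind Google's link-based scoring. It is not Google's real code, and the link graph and page names are invented for the example; the point is simply that each page's score is built up from the pages linking to it.

    # Illustrative only: a toy PageRank-style calculation on an invented link graph.
    # Each page's score depends on the scores of the pages that link to it,
    # not only on how many backward links it has.

    links = {
        "home": ["blog", "portfolio"],
        "blog": ["home"],
        "portfolio": ["home", "blog"],
        "spammy-directory": ["home"],   # a cheap, bought backward link
    }

    pages = list(links)
    damping = 0.85
    scores = {page: 1.0 / len(pages) for page in pages}

    for _ in range(50):  # repeat until the scores settle
        new_scores = {}
        for page in pages:
            # Add up the share of score passed on by every page linking here.
            incoming = sum(
                scores[src] / len(outgoing)
                for src, outgoing in links.items()
                if page in outgoing
            )
            new_scores[page] = (1 - damping) / len(pages) + damping * incoming
        scores = new_scores

    for page, score in sorted(scores.items(), key=lambda item: -item[1]):
        print(f"{page}: {score:.3f}")

Even in this toy version a link from a well-linked page passes on more score than one from an obscure directory page, but the underlying currency is still links, which is exactly what the link-building industry trades on.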
This means there are plenty of companies out there employing cheap labour to build links - sometimes spurious ones - to their sites simply to get the number up. From what I read online, many of them outsource the work to firms in emerging nations.
How does this contribute to the genuine validity and authority of a site?
Of course there are a number of other - genuine - contributing factors that move your site up the rankings, and these at least seem to make more sense. But to rate a site by the number of backward links seems somewhat unfair and old-fashioned.
And almost every company that approaches clients says it can get them onto the first page of the search engines - but how can this be? You only have to think about it for a moment or two to see that it's impossible for most normal sites. You can almost always get yourself to the top of a search list if you type in a very specific phrase that appears on your site, but if somebody searching for a general product or activity types that into a search engine, the chances of you being at the top of the list are extremely remote.
How do you feel about this situation? It would be interesting to at least know that we're not alone!
