On Mon, Mar 10, 2008 at 11:29 AM, Kenneth Gonsalves lawgon@au-kbc.org wrote:
> this is what good web programmers do - and most of them do it regardless of whether they are aware of search engines or even care about search engines.
It's not just about web programmers. There are a variety of factors that SEO guys look at:
1) Ensuring that keywords relevant to your content are present. This cannot always be taken for granted. For example, when I was at a certain outsourcing firm, the biggest target keywords for our site were Outsourcing, Offshoring, etc. The content of the site was good, but because those keywords were largely absent, traffic was lower than it could have been. Incorporating those keywords into the content made a big difference.

2) Javascript/DHTML usage makes a difference too. Spiders mostly don't follow javascript dropdown menus, combo box links, button links (submit ones), etc. You need to be aware of SEO to decide that there should be at least one plain text link to every content page on your site. Of course, you could argue the same point from an accessibility angle as well (think lynx).

3) Code to content ratio. The less markup there is relative to actual content, the better. Here a good web programmer will do well without much SEO help.

4) Links to your site make a very big difference. This is actually where SEO can make or break a site. Relevant inbound links (someone linking to your blog post, for example) raise your rank (so to speak) in a big way, while link spamming or link exchanges may lead to blacklisting.
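To make the code-to-content idea in point 3 concrete, here is a rough sketch in Python (my own illustration, not any tool SEO people actually use) that strips the markup from a page and compares the visible text to the total page size. The sample page and the whole approach are simplifications for illustration only.

```python
# Rough estimate of "code to content ratio": how much of a page's
# bytes are visible text versus markup and scripts.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects only the visible text, ignoring tags, scripts and styles."""
    def __init__(self):
        super().__init__()
        self.in_skip = 0   # nesting depth inside <script>/<style>
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.in_skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.in_skip:
            self.in_skip -= 1

    def handle_data(self, data):
        if not self.in_skip:
            self.chunks.append(data)

def content_ratio(html):
    """Fraction of the page that is visible text (0.0 to 1.0)."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / len(html) if html else 0.0

# Made-up sample page, heavy on markup relative to text.
page = ("<html><head><script>var x=1;</script></head>"
        "<body><p>Outsourcing and offshoring services.</p></body></html>")
print(round(content_ratio(page), 2))
```

A higher ratio means more of the page is actual content; the script body is deliberately excluded since it is never visible text.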
The rest of the optimizations are more statistical in nature: spiders have limited capacity, so pages should not be too long. Similarly, spiders seem to prefer shorter URLs, H1/bold tags, etc. Little of this is officially documented anywhere, but much of it makes sense from a natural-content point of view (although I still can't make sense of the short URLs bit).
Much of it is common sense really, but one cannot always say that a good web programmer will take care of all of it.
Siddhesh