On Mon, Mar 10, 2008 at 8:47 AM, Kenneth Gonsalves lawgon@au-kbc.org wrote:
Both are related. An increase in business is an absolute goal for most, and being near the top of Google's results _gets_ you more business, anecdotally speaking. It's one of the key concerns for most online businesses.
so which comes first? the chicken or the egg? the aim of a search engine is to rank the sites according to some criteria - I assume usefulness and popularity would be the main criteria. So if your site is useful and popular, it will rank high - and you will get more business (but only if you remain useful and popular).
If only it were so simple. This is already quite a bit OT, and going into details would be stretching it too far (though I have stretched a bit - see below).
But you can look for some guidelines from google itself: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=3576...
SEO has acquired a bad connotation due to unscrupulous elements, as can be seen from the tone of: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=3529...
But the first link (above) is what 'good' SEO-ers will do. As a very simple example, if you have a page with many advertisement/partner links (needed for business) alongside useful content of your own, then one SEO guideline would be to add the (non-standard, but search-engine-recognized) rel="nofollow" attribute to those anchor tags. Another is to provide better sitemaps. Otherwise, the outlinks on a page would drain your "rank" into those external ad/partner links, reducing the PageRank of your own pages.
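To make the "drain" intuition concrete, here is a toy sketch of how the rank a page passes on gets split across its followed outlinks, and how marking ad links rel="nofollow" keeps more of it for your own pages. This is my own simplified model for illustration, not Google's actual formula; all the link targets are made up.

```python
# Toy model: a page distributes the rank it passes on equally among
# the links a crawler actually follows. Links marked rel="nofollow"
# are skipped, so each remaining (internal) link gets a larger share.
# An illustrative simplification, NOT Google's real algorithm.

def rank_passed_per_link(page_rank, links, honor_nofollow=True):
    """Return {target: rank share} for the links on a page.

    `links` is a list of (target, is_nofollow) tuples.
    """
    if honor_nofollow:
        followed = [t for t, nofollow in links if not nofollow]
    else:
        followed = [t for t, _ in links]
    if not followed:
        return {}
    share = page_rank / len(followed)
    return {target: share for target in followed}

links = [
    ("/articles/intro", False),         # your own content
    ("/articles/guide", False),         # your own content
    ("http://ads.example/a", True),     # ad link, nofollow'd
    ("http://ads.example/b", True),     # ad link, nofollow'd
]

# With nofollow honored, the two internal links split all the rank:
print(rank_passed_per_link(1.0, links))                        # each gets 0.5
# Without nofollow, the ad links drain half of it away:
print(rank_passed_per_link(1.0, links, honor_nofollow=False))  # each gets 0.25
```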
Another example: search for "subversion" on google.com and see what additional information appears along with the first result - the various categories "found" on Subversion's website. These are the kinds of things "good" SEO should help you do; it helps both the business and the search engines. Search engines aren't _exactly_ as intelligent as we would like to believe! They put some logic in place to be "more useful" (as in this example), and then more people start making sure their pages are designed in a way Google can "extract" structure from...
Best wishes, jaju
On 10-Mar-08, at 9:19 AM, Ravindra Jaju wrote:
But you can look for some guidelines from google itself: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769
SEO has acquired a bad connotation due to unscrupulous elements, as can be seen from the tone of: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35291
But, the first link (above) is what 'good' SEO-ers will do.
this is what good web programmers do - and most of them do it regardless of whether they are aware of search engines or even care about search engines.
On Mon, Mar 10, 2008 at 11:29 AM, Kenneth Gonsalves lawgon@au-kbc.org wrote:
this is what good web programmers do - and most of them do it regardless of whether they are aware of search engines or even care about search engines.
It's not just about web programmers. There are a variety of factors that SEO guys look at:
1) Ensuring that keywords relevant to your content are present. This cannot always be taken for granted. For example, when I was at a certain outsourcing firm, the biggest target keywords for our site were Outsourcing, Offshoring, etc. The content of the site was good, but due to the lack of these keywords, traffic wasn't as high as it could have been. Incorporating those keywords into the content made a big difference.
2) Javascript/DHTML usage makes a difference too. Spiders mostly don't follow javascript dropdown menus, combo-box links, button (submit) links, etc. You need to be aware of SEO to decide that there should be at least one plain text link to every content page on your site. Of course, you may argue along the lines of accessibility as well on this one (lynx).
3) Code-to-content ratio. The less code there is relative to content, the better. Here, a good web programmer will do well without much SEO help.
4) Links to your site make a very big difference. This is actually where SEO can make or break a site. Relevant partner links (someone linking to your blog post, for example) increase your rank (so to speak) in a big way. Link spamming or link exchanges may lead to blacklisting.
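As a rough illustration of the code-to-content point, the ratio can be estimated by comparing the visible text of a page to its total HTML size. Here is a minimal sketch using Python's standard html.parser; this is my own back-of-the-envelope measure (and a made-up sample page), not any official metric search engines publish.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, ignoring <script>/<style> contents."""

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0  # nesting depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def content_ratio(html):
    """Fraction of the page that is visible text (0.0 to 1.0)."""
    if not html:
        return 0.0
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks)
    return len(text.strip()) / len(html)

# A heavily-marked-up page with little actual content scores low:
page = ("<html><head><script>var x=1;</script></head>"
        "<body><p>Useful content here.</p></body></html>")
print(round(content_ratio(page), 2))
```

A real check would fetch and score every page of a site the same way; a low score flags pages where markup and scripts dominate the actual content.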
The rest of the optimizations are statistical in nature: spiders have limited capacity, so pages should not be too long. Similarly, spiders like shorter URLs, H1/bold tags, etc. Although most of these points are not officially documented anywhere, much of it makes sense from a natural-content point of view (though I still can't make sense of the short-URLs bit).
Much of it is common sense really, but one cannot always say that a good web programmer will take care of all of it.
Siddhesh