Jay reviewed his work from several weeks ago,
when he used statistics as a guide to making ranking
improvements to one of his sites (http://www.residential-landscape-lighting-design.com/).
Within a very short period of time, his
modifications resulted in top first-page rankings for all of the
terms he focused on. He stressed that keeping a site
ranking high is an ongoing process that takes persistent effort and
attention: pages and performance have to be continually reviewed and
modified as search engine practices and standards change.
One example he cited is the changing view of
the value of keyword metatags. Search engines do not consider these
tags very important (Google does not even read them!), but they can
be useful for "low-end terms" and for ranking on
misspellings that you do not want your site visitors to see.
Misspellings are common on the Internet and can generate valuable
traffic, but putting them in your visible content does not enhance the
professionalism of your site.
Previous practice in the keyword tag was
to repeat important keywords frequently, but the engines now
take a different view of this practice and can actually penalize
your site for "stuffing" this tag excessively.
He spent some time reviewing a newly
discovered keyword density analyzer (http://developers.evrsoft.com/seotool/)
and pointed out a new standard for Google of 2% (preferred ratio of
keyword phrases to total word count). This tool has a number of
features that simplify the process of finding the best keyword
phrases for your site.
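As a rough illustration of what a density analyzer computes, the sketch below calculates the ratio of words belonging to a keyword phrase to the total word count of a page. This is a minimal, assumed interpretation of the 2% figure (phrase occurrences counted with a sliding window, each occurrence contributing the phrase's word count), not the method used by the Evrsoft tool; the sample text is invented for the example.

```python
import re

def keyword_density(text, phrase):
    """Percentage of total words accounted for by occurrences of a phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = re.findall(r"[a-z0-9']+", phrase.lower())
    if not words or not phrase_words:
        return 0.0
    n = len(phrase_words)
    # Slide a window of the phrase's length across the page's word list
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    # Each hit contributes n words toward the keyword total
    return 100.0 * hits * n / len(words)

# Hypothetical page copy for illustration only
page = ("Low voltage landscape lighting adds safety and beauty. "
        "Our landscape lighting designs suit any residential garden.")
print(round(keyword_density(page, "landscape lighting"), 1))  # → 25.0
```

A real page would be far longer, so a density near the 2% figure Jay mentioned would mean the phrase appears only a handful of times per few hundred words.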
Another change in the works is that Google
appears to be evaluating outbound links somewhat differently and may
enhance your site's rankings based on the quality of the sites
you link out to.
Jay's notes in Word | Jay's notes in RTF