__/ [ www.1-script.com ] on Monday 27 February 2006 03:13 \__
> Roy Schestowitz wrote:
>> Google will no longer view our sites as textual fragments, but rather
>> the pages and interpret them in a richer context (including JS and
> I doubt the part about rendering very much. It's the content they are
> after, not the presentation. Also, this article reads as if it was written
> a couple of years back. Googlebot is at version 2.1, not 2.0.
> Also, Google's Deepbot has been signing as Mozilla-compatible for a couple
> of years already, maybe as far back as 2003. Freshbot still signs as
> Googlebot, not Mozilla-compatible. Makes sense: it only looks for links
> and nothing else.
These are all interesting observations, as I don't have much confidence in
the article at hand. As regards Googlebot versions, I suppose it's possible
that a certain proportion of Google's hardware still runs
older-yet-fully-compatible software. I guess you know better, though. I
never look at the raw logs and, if I ever do, I don't descend to lower
levels of detail.
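For anyone who does dig into the raw logs, telling the two crawlers apart
comes down to the user-agent string. A minimal sketch follows; the sample
signatures are the ones commonly reported for Googlebot, so treat them as
an assumption to verify against your own access logs rather than gospel:

```python
# Classify a crawler hit by its user-agent string.
# Assumption: Deepbot signs as "Mozilla/5.0 (compatible; Googlebot/...)"
# while Freshbot signs as plain "Googlebot/..." -- check your own logs.

def classify_agent(user_agent):
    ua = user_agent.lower()
    if "googlebot" in ua:
        # Mozilla-compatible prefix => Deepbot; bare Googlebot => Freshbot
        return "deepbot" if ua.startswith("mozilla") else "freshbot"
    return "other"

if __name__ == "__main__":
    samples = [
        "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
        "Googlebot/2.1 (+http://www.google.com/bot.html)",
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
    ]
    for ua in samples:
        print(classify_agent(ua), "<-", ua)
```

Pipe the user-agent field out of a combined-format log (e.g. with awk) and
feed it through something like the above to get a rough Deepbot/Freshbot
split.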
I don't believe that Google cares much about presentation either. If they
did, on the other hand, there would be much to gain -- for all ends, i.e.
the surfers and the crawlers, and sometimes even the Webmasters.