
Re: (News) Googlebot Stops Lynxing, Starts Using Mozilla

"Roy Schestowitz" <newsgroups@xxxxxxxxxxxxxxx> wrote in message 

> I am flattered that you consider me "programming-savvy", but I am not
> proficient in the domain of browser rendering engines. I know a fair bit
> about KHTML and Gecko, yet I have never used Lynx, only seen screenshots
> of it.
> If Google decided to render pages graphically, which requires some extra
> computational labour, they could initially choose to do it with just a
> subsample of popular pages. Alternatively, they could employ Lynx first
> and invoke Mozilla /only/ if shallow analysis of the Lynx output shows
> the page has changed.
> So what could they do with the product (assuming that John Bokma's
> statement is inaccurate and things work as the blog suggests)? With tools
> equivalent to http://khtml2png.sourceforge.net/ , they could make use of
> pattern recognition and image analysis programs (also see
> http://browsershots.org/). Interpretation of images happens to be my field
> of research, actually. All sorts of tests could then be run, each leading
> to a reward or penalty. Running a series of such tests gives a figure of
> merit, which may depend on the surfer in question (using parameters like
> browser, O/S, screen size, known accessibility issues, etc.). Tests I can
> think of are:
> * Does the page get rendered gracefully in all browser rendering engines?
> * Does the page redirect using JavaScript (like richo.de and bmw.de)?
> * How 'pretty' is the page? Cosmetics are a matter of taste and a 'fluffy'
> notion, so I doubt this would be of any fair or valuable use.
> * How heavy is the page (including images), and does the size justify the
> visual gains and enhancements? Is there a 'nice' combination of colours
> (distinguishing pricey design work from amateurish DIY)?
> * Is anything pornographic possibly contained in the page (use a *rough*
> scoring mechanism)?
> * What screen sizes are properly supported? Should the search engines
> deliver different results depending on the perceived support for screen
> sizes and other prerequisites like JavaScript? Could PDA users get
> different results pages altogether?
> * Arguments similar to the above, but with reference to the visually
> impaired or the astigmatic. This can currently be done by looking at
> colour contrast (in the CSS/source) and font sizes. What about a special
> option for the colour-blind, e.g. "give me no pages with yellow and green
> on the same page"?
> * Okay, enough for now... *smile*
> To Webmasters, this also means that more bandwidth will be consumed by the
> crawler in question, if it truly exists or ever *will* exist.
> With kind regards,
> Roy
> -- 
> Roy S. Schestowitz      |    Y |-(1^2)|^(1/2)+1 K
> http://Schestowitz.com  |    SuSE Linux     |     PGP-Key: 0x74572E8E
>  6:25pm  up 6 days  6:44,  8 users,  load average: 0.36, 0.89, 0.91
>      http://iuron.com - next generation of search paradigms
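
Roy's two-stage idea (a cheap Lynx-style text pass, with a full Mozilla
render only when the page has changed) might be sketched roughly as below.
This is purely illustrative: the function name and the digest-comparison
scheme are my own, not anything Google has documented.

```python
import hashlib

def needs_full_render(previous_digest, text_dump):
    """Compare a cheap text-only dump of a page (what a Lynx-style pass
    would produce) against the digest stored from the last crawl.
    Returns (changed, digest); a costly Mozilla/Gecko render is only
    scheduled when the shallow dump has changed."""
    digest = hashlib.sha1(text_dump.encode("utf-8")).hexdigest()
    return digest != previous_digest, digest

# First crawl: no stored digest, so a full render is scheduled.
changed, d1 = needs_full_render(None, "Welcome to example.com")
# Second crawl, identical text: the expensive render is skipped.
changed2, d2 = needs_full_render(d1, "Welcome to example.com")
```

The appeal of this split is that the text pass costs almost nothing per
page, so the graphical renderer only ever runs on the changed subsample.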

Wow, what a great answer.

I get it, I think. The search engine could deliver results based on the 
visitor's browsing preferences and physical and hardware limitations. Site 
owners and developers would be required to adhere to stricter design 
standards in order to deliver web content that satisfies a broader range of 
browser criteria. Hence the need for the crawler to analyze the .js and 
.css in order to create a "snapshot". This "snapshot" could be used to 
determine compatibility with a searcher's limitations, both physical and 
electronic. The search engine could then deliver results from sites it 
perceives as acceptable: both relevant to the search query and satisfying 
the searcher's additional requirements.
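
The colour-contrast test Roy mentions is concrete enough to sketch. WCAG
2.0 defines a contrast ratio between two colours; a crawler that has
extracted foreground and background colours from the CSS could score them
like this (a minimal sketch; the hex-parsing helper is my own invention,
not part of any crawler):

```python
def _rel_luminance(hex_colour):
    """Relative luminance of an sRGB colour, per the WCAG 2.0 formula."""
    r, g, b = (int(hex_colour.lstrip("#")[i:i + 2], 16) / 255.0
               for i in (0, 2, 4))
    def lin(c):
        # Linearise the gamma-encoded channel value.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b)

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two hex colours, from 1:1 to 21:1."""
    l1, l2 = sorted((_rel_luminance(fg), _rel_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible contrast, 21:1;
# WCAG AA asks for at least 4.5:1 for normal-sized text.
print(round(contrast_ratio("#000000", "#ffffff"), 1))  # → 21.0
```

A crawler running this over every foreground/background pair in a
stylesheet could penalise pages that fall below the 4.5:1 threshold, which
is exactly the kind of reward-or-penalty figure of merit Roy describes.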


Fred canadian_web@xxxxxxxxxxx
Ethical SEO Tips, Tools and Resources
