

__/ [ John Bokma ] on Thursday 25 May 2006 17:59 \__

> Roy Schestowitz <newsgroups@xxxxxxxxxxxxxxx> wrote:
>> __/ [ John Bokma ] on Thursday 25 May 2006 04:38 \__
> [..]
>>> They have an email address for the API, so I suggest to try that one.
>> I bet that many people use it for commercial purposes. Google cannot
>> truly check, but they do their best at reducing the load on their
>> server and the extent of SERP subversion.
> I have mailed them once or twice and did get a reply :-) It took quite
> some time though.

I am surprised. They are quite distant and isolated from the 'little people'.
You can't blame them, as they receive a heavy workload and people rarely quote
context when replying. *smile*

>>> Yes, true, different data center. I guess they even have an API
>>> specific dc.
>> I would not be surprised if it was slower, as well.
> Speed doesn't matter, the overhead of the request is probably the
> biggest problem :-).
> However, "recently" it's possible to get 502 Gateway error. So I have my
> new scripts retry a few times with a few seconds time out between each
> retry.

502 (Bad Gateway)? I never knew it was even defined.
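The retry-with-pause approach John describes can be sketched as follows. This is a minimal illustration, not his actual script; the `GatewayError` exception and `request` callable are hypothetical stand-ins for however the API call is actually made.

```python
import time

class GatewayError(Exception):
    """Stand-in for a 502 Bad Gateway response (hypothetical)."""

def fetch_with_retry(request, retries=3, delay=5):
    """Call request() up to `retries` times, sleeping `delay`
    seconds after each GatewayError before trying again."""
    for attempt in range(retries):
        try:
            return request()
        except GatewayError:
            if attempt == retries - 1:
                raise          # out of retries: surface the error
            time.sleep(delay)  # brief pause before the next attempt
```

The pause matters: a gateway error usually means the backend is momentarily overloaded, so hammering it again immediately only makes things worse.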

>>>> 4) Just what's the deal with the Google API and legitimate SEO
>>>> software anyway?? Thanks!
>> Reducing loads. Serve searchers first, only *then* automated querying
>> software.
> Nah, wrong I think. Two reasons: first is it provides them data on how
> it is used, by whom, and what requests are popular. I think this data
> is worth a lot.

I think that standard queries can be more meaningful. They are used for Google
Trends, too.

> Second of all, a normal request uses up more bandwidth (guess, but I am
> quite sure about that).

Yes, but the normal user does not repeat things mechanically.

>> I wonder if Apache has a similar mechanism for querying
>> requests from bot when actual people request pages. Could be handy...
> Why? What's the point in delivering a page "slower" to Googlebot?

Humans are impatient, crawlers need not be impatient. Think about overloaded
shared servers.

> Moreover, delivering the page means Apache is occupied with that
> connection, and does exactly the opposite of what you have in mind.
> Giving the bot as fast as possible a reply is better.

Well, you can shuffle or re-prioritise the queue, putting crawlers higher on
the stack so that human visitors are served first.

> There is a compression mod that helps with this (the HTML page is sent
> compressed).

True, but it is an optimisation that applies to most traffic anyway. You can't
really use it as a counter-argument, in my opinion.
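The compression mod John mentions is presumably mod_deflate on Apache 2.x (mod_gzip on the 1.3 series); I am guessing at his setup, but a typical directive looks like:

```apache
# Compress textual responses before sending them out
AddOutputFilterByType DEFLATE text/html text/plain text/css
```

Binary formats such as images are usually left out, since they are already compressed.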

> If you're really interested in giving bots the fastest answer possible,
> you might consider "cloaking", i.e. strip all that's not needed for the
> bot from your pages. And no, I doubt any SE is going to punish you for
> that.

...Until too many people do it for SEO purposes, or some other form of misuse.
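For what it's worth, the "cloaking" John suggests amounts to branching on the User-Agent header. A minimal sketch, with made-up page contents and bot names, and not an endorsement:

```python
# Illustrative pages; real cloaking would strip scripts, styles,
# and navigation from the bot version.
FULL_PAGE = "<html><!-- styles, scripts, navigation --></html>"
BARE_PAGE = "<html><!-- just the text content --></html>"

def serve(user_agent):
    """Return a stripped page for known crawlers and the full
    page for everyone else -- the essence of cloaking."""
    bots = ("Googlebot", "msnbot", "Slurp")
    if any(bot in user_agent for bot in bots):
        return BARE_PAGE
    return FULL_PAGE
```

Whether search engines tolerate this in practice is exactly the point of contention above.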

Best wishes,


Roy S. Schestowitz      |    "These characters were randomly picked"
http://Schestowitz.com  | Free as in Free Beer ¦  PGP-Key: 0x74572E8E
  6:15pm  up 28 days  0:47,  8 users,  load average: 0.16, 0.30, 0.44
      http://iuron.com - semantic engine to gather information
