__/ [ Roy Schestowitz ] on Friday 05 May 2006 11:23 \__
> __/ [ Phil Payne ] on Friday 05 May 2006 11:17 \__
>> At around 09:00 UTC this morning.
>> First it tried to get 0T717Q3K81F45P9CHK78.htm - obviously a 404
>> functionality test.
>> Then it proceeded to download the site. Yes, all of it. We'll see
>> what turns up in the SERPs.
> How many pages in total? Googlebot never appears to do 404 tests. Neither
> do MSNBot, Yahoo/Inktomi Slurp, or other notable spiders (although Yahoo
> used to be buggy enough to request the wrong files from the wrong sites).
> What I am trying to suggest is that somebody may have forged the
> user-agent. It's very simple to do, and it gives a cloak of stealth to
> someone wishing to rip off your site entirely, possibly using a grabber,
> e.g.
> wget -r --user-agent="Googlebot whatever..." your_site_URL
> Best wishes,
PS: Get the IP address(es) from the logs and run a reverse DNS lookup: was it
truly Google? The last thing you want is someone mirroring your site or using
it as a starting point.
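A minimal sketch of that check (the IP and the `crawl-*.googlebot.com` sample
names are placeholders, and the commented `dig` calls are one way to do the
lookups; Google's documented verification is reverse lookup followed by a
forward-confirm of the returned name):

```shell
# verify_suffix: does a reverse-DNS name end in an official Google crawler domain?
verify_suffix() {
  case "$1" in
    *.googlebot.com|*.google.com) return 0 ;;
    *) return 1 ;;
  esac
}

# Typical workflow against a live log entry (IP is a placeholder):
#   name=$(dig +short -x 66.249.66.1)        # e.g. crawl-66-249-66-1.googlebot.com.
#   verify_suffix "${name%.}" || echo forged
#   dig +short "${name%.}"                   # forward-confirm: must return the same IP

verify_suffix "crawl-66-249-66-1.googlebot.com" && echo "looks like Googlebot"
verify_suffix "fake.example.com" || echo "forged"
```

A name that merely contains "googlebot" is not enough; the forward lookup
closing the loop back to the logged IP is what defeats a forged user-agent.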