
Re: Google Sitemaps - anyone getting delays?

  • Subject: Re: Google Sitemaps - anyone getting delays?
  • From: Roy Schestowitz <newsgroups@xxxxxxxxxxxxxxx>
  • Date: Wed, 06 Sep 2006 12:38:36 +0100
  • Newsgroups: alt.internet.search-engines
  • Organization: schestowitz.com / ISBE, Manchester University / ITS / Netscape / MCC
  • References: <1157465134.296116.88970@m73g2000cwd.googlegroups.com> <12frpgd49iacrb3@corp.supernews.com>
  • Reply-to: newsgroups@xxxxxxxxxxxxxxx
  • User-agent: KNode/0.7.2
__/ [ z ] on Tuesday 05 September 2006 22:03 \__

> Paul Silver wrote:
> 
>> Hi, one of my clients has set up a Google Sitemap (list of links
>> version rather than XML) and submitted it in the second week of August.
>> Google isn't showing all the pages that are in the Sitemap yet, but is
>> showing some old pages which have been removed by the site owners.
>> 
>> As far as I can tell, the maps file is to spec (it's difficult to get a
>> text file of links wrong) and they've submitted it correctly. Their
>> site is relatively new, about 6-8 months old, and has just over 200
>> links to it according to Yahoo, with few or no reciprocals and no links
>> out to dodgy sites.
>> 
>> So... is anyone else getting this sort of delay when submitting site
>> updates through Google Sitemaps? Or any advice about getting Google to
>> update quickly?
> 
> My take on Google sitemaps is that they were not created for the benefit of
> webmasters.  In my view, the sitemaps program appears to have been created
> primarily for Google's long-term debugging of Googlebot(s).  Everything
> else is a smokescreen, although the Sitemap control panel can provide
> useful information that is not necessarily dependent on having a sitemap at
> all.


Aye. I never fancied Google Sitemaps (for a variety of reasons, in fact:
http://tinyurl.com/htre4 ). Ordinary page-by-page crawling has always given
me good and fast coverage. XML may be handy for prioritising pages, but
page depth achieves much the same thing implicitly. Moreover, the XML
sitemap looks like a matter of convenience to Google. Rather than crawl a
site to study its structure (and most probably derive a crawling schedule
from it), Google wants webmasters to do that work off-line, i.e. on the
filesystem, which saves Google bandwidth. Not only does this lock
competitors out (unless they catch up with the non-standard 'extensions'),
it also delegates work to others and thereby reduces Google's workload.
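
For what it's worth, the two flavours look roughly like this (the URLs are
made up and the schema version is from memory, so check Google's own docs).
The plain-text version is simply one URL per line:

  http://www.example.com/
  http://www.example.com/about.html
  http://www.example.com/articles/some-page.html

The XML version wraps each URL in a <url> element and lets you attach hints
like <changefreq> and <priority>, which Google treats as hints and nothing
more:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
    <url>
      <loc>http://www.example.com/</loc>
      <changefreq>daily</changefreq>
      <priority>1.0</priority>
    </url>
    <url>
      <loc>http://www.example.com/articles/some-page.html</loc>
      <changefreq>monthly</changefreq>
      <priority>0.5</priority>
    </url>
  </urlset>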


> Don't expect Google to follow your instructions in the sitemap -- use
> standard SEO to make sure Google can spider your pages, and to make sure
> that Google thinks those pages are important (i.e., IBLs -- in this case
> maybe a couple of deeplinks).  Then wait...
> 
> If the site is not "important" (new web site and/or not a lot of IBLs) then
> it might take more time.
> 
> Google will remove the old pages eventually if the site is sending
> correct "404 Not Found" headers.  You can use the livehttpheaders Firefox
> extension to check.  If the site is sending 404 or 301 headers for pages
> that have been removed, then just wait...
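
Good point about the headers. For anyone without that extension, a few
lines of Python do much the same job -- a rough sketch against a made-up
URL; http.client does not follow redirects, so a 301 is reported as a 301
rather than as the page it points to:

  import http.client
  from urllib.parse import urlparse

  def check_status(url):
      # Send a HEAD request and report the raw status the server returns.
      parts = urlparse(url)
      conn = http.client.HTTPConnection(parts.netloc)
      conn.request("HEAD", parts.path or "/")
      resp = conn.getresponse()
      return resp.status, resp.reason, resp.getheader("Location")

  print(check_status("http://www.example.com/removed-page.html"))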


I can't recall seeing you in AISE before, but you have some good insights to
share. *smile*

Best wishes,

Roy

-- 
foo bar!
http://Schestowitz.com  | Free as in Free Beer ¦  PGP-Key: 0x74572E8E
Load average (/proc/loadavg): 0.42 0.54 0.53 2/133 4927
      http://iuron.com - semantic search engine project initiative
