__/ [Samir] on Wednesday 08 February 2006 10:59 \__
> They are different sites on different servers. The only logical group
> for these sites are that they come from a particular geographical
> location... otherwise the content, ownership, etc is unique to each
You could automate a batch of search engine queries to analyse some of the
above /en masse/ and reduce the fetched data to concise reports. Use a
grabber. For validity checks and the like you might have to rely on other
services such as the W3C validation service (or commercial alternatives).
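For illustration, a minimal sketch of the "grab and reduce" idea in Python. Everything here is a hypothetical example (the report fields, the sample URL); a real run would feed in pages fetched with something like urllib or wget:

```python
# Sketch: reduce a batch of fetched pages to a concise per-site report.
# The fields reported (title, link count) are illustrative choices only.
import re

def summarize_page(url, html):
    """Boil one fetched page down to a one-line report entry."""
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    title = m.group(1).strip() if m else "(no title)"
    links = len(re.findall(r"<a\s", html, re.I))
    return {"url": url, "title": title, "links": links}

def report(pages):
    """pages: iterable of (url, html) pairs, e.g. grabbed with urllib."""
    return [summarize_page(url, html) for url, html in pages]

if __name__ == "__main__":
    # Hypothetical sample in place of a live fetch.
    sample = [("http://example.com",
               "<html><title>Example</title><a href='/'>home</a></html>")]
    for entry in report(sample):
        print(entry["url"], "-", entry["title"], "-", entry["links"], "links")
```

Validity checks would be a separate pass, handing each URL to an external service rather than parsing locally.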
Roy S. Schestowitz | "This sig seemed like a good idea at the time..."
http://Schestowitz.com | SuSE Linux | PGP-Key: 0x74572E8E
2:15pm up 22 days 9:31, 12 users, load average: 0.52, 0.44, 0.34
http://iuron.com - help build a non-profit search engine