
Re: Unique subdomains running off common codebase not crawled by Google

  • Subject: Re: Unique subdomains running off common codebase not crawled by Google
  • From: Roy Schestowitz <newsgroups@xxxxxxxxxxxxxxx>
  • Date: Mon, 12 Jun 2006 09:55:46 +0100
  • Newsgroups: alt.internet.search-engines
  • Organization: schestowitz.com / MCC / Manchester University
  • References: <1150099071.323978.168010@y43g2000cwc.googlegroups.com>
  • Reply-to: newsgroups@xxxxxxxxxxxxxxx
  • User-agent: KNode/0.7.2
__/ [ admin@xxxxxxxxxxx ] on Monday 12 June 2006 08:57 \__

> I use a content management system (CMS) installed on my root domain. It
> serves up pages to my root domain and a subdomain. The content on each
> site is unique, apart from the shared user base. I intend to add
> additional subdomains in the near future. These would also have unique
> content and share the user base.
> 
> In order for the subdomains to use the same codebase, a symlink is
> used, i.e. sub1.domain.com is symbolically linked to domain.com. The
> subdomain shows the content as intended and "people" have no trouble
> accessing it.
> 
> The problem is that when Googlebot requests http://sub1.domain.com, an
> HTTP status code 302 (Moved Temporarily) is returned. Googlebot stops
> dead at the redirect and won't parse the index page. Googlebot has
> however downloaded robots.txt and my sitemap from the root of
> sub1.domain.com.
> 
> I really don't want to install and maintain separate instances of the
> CMS for each subdomain.
> 
> Is there a way to share the codebase with my subdomains without using a
> symlink
> AND
> still serve up unique content to each site
> AND
> enable Google to crawl the site?
> 
> Any ideas will be very much appreciated.

What CMS are you using? In the case of WordPress, for example, there are at
least two separate distributions (Lyceum and WordPress MU, derived from
Donnacha (xeer)'s work), which are built for exactly that type of
circumstance. They can also share a common database across the subdomains.
Short of changing the server setup, I can't think of a way past the 302
status reported to search engines, so (partial) duplication of files might
be necessary. Changing the source code (if available in non-binary form) is
the other possibility.
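
That said, if the server happens to be Apache (an assumption, as you don't
say what you run), you could drop the symlink altogether and point every
subdomain at the same DocumentRoot with a ServerAlias. A minimal sketch,
with hypothetical paths, assuming name-based virtual hosting is enabled:

  <VirtualHost *:80>
      # One vhost answers for the root domain and all subdomains,
      # so no symlink is needed on disk. The wildcard covers
      # sub1.domain.com and any future subdomains.
      ServerName domain.com
      ServerAlias *.domain.com
      # Hypothetical location of the single shared CMS codebase.
      DocumentRoot /var/www/domain.com
  </VirtualHost>

The CMS then only needs to inspect the Host header to decide which site's
content to serve (which is what Lyceum and WordPress MU do anyway), and
Googlebot should get a straight 200 instead of the 302.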

Best wishes,

Roy

-- 
Roy S. Schestowitz  
http://Schestowitz.com  |  GNU is Not UNIX  ¦     PGP-Key: 0x74572E8E
  9:50am  up 45 days 15:23,  10 users,  load average: 0.51, 0.89, 0.93
      http://iuron.com - proposing a non-profit search engine
