
Re: PHPSESSID , googlebot and robot.txt

__/ [ madsgormlarsen@xxxxxxxxx ] on Saturday 01 April 2006 18:09 \__

>> If one session ID will suffice, have a look at Google Sitemaps and use XML
>> to reduce the number of URL's you want indexed.
> I already use a sitemap, but I guess Google still follows links not in
> the sitemap.
>> There are symbols
>> which are treated as irrelevant arguments and dropped when appended to the
>> URLs.
> Perhaps I could add an ID that would make Google ignore the session ID.
> Here is an example:
> www.winches.dk/servicekits.php?PHPSESSID=ace8751e1a1910b6fc3471d601909264
> Do any of you know whether this influences PageRank?
> Thanks a lot for the help.
> Mads

Hi Mads,

I'll answer in bulleted form if you don't mind:

* Page index blocking can be achieved by:

  - robots.txt exclusions
  - metadata, as seen in: http://www.i18nguy.com/markup/metatags.html
    (it might be hard to change meta tag generation, i.e. per-page
    decisions in dynamic pages)
  - diversion through sitemaps, which you suggest won't work.
  - session IDs that comply with the guidelines to Webmasters (not just

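For the robots.txt route: Googlebot (though not every crawler) understands
wildcard patterns, so all session-ID URLs can be excluded with one rule. A
minimal sketch, assuming the session ID always appears as a PHPSESSID query
argument:

```
User-agent: Googlebot
Disallow: /*PHPSESSID=
```

Bear in mind the wildcard is a Google extension, not part of the original
robots.txt standard, so other robots may ignore it.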
* Impact of having many pages indexed:

  - Users reach uninteresting pages.
  - SE algorithms apply an automated penalty for a sharp rise in the
    number of pages, which reduces rankings and affects referrals.
  - PageRank (and its equivalents) is spread among many pages, so each
    page loses value.
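On the dilution point: every distinct PHPSESSID makes an otherwise identical
page look like a new URL, so links (and PageRank) scatter across duplicates.
A minimal Python sketch of the canonicalisation a search engine effectively
has to do (the helper name is mine, not part of any API):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_session_id(url, param="PHPSESSID"):
    """Drop the session-ID query argument so duplicate URLs
    collapse to one canonical form."""
    parts = urlsplit(url)
    # Keep every query pair except the session ID.
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

print(strip_session_id(
    "http://www.winches.dk/servicekits.php"
    "?PHPSESSID=ace8751e1a1910b6fc3471d601909264"))
# → http://www.winches.dk/servicekits.php
```

The cleaner long-term fix, if your visitors accept cookies, is to stop PHP
appending the ID to URLs at all, e.g. session.use_only_cookies = 1 and
session.use_trans_sid = 0 in php.ini.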

Best wishes and good luck,


Roy S. Schestowitz      |    "I regularly SSH to God's brain and reboot"
http://Schestowitz.com  |    SuSE Linux     ¦     PGP-Key: 0x74572E8E
  6:15pm  up 24 days  7:58,  10 users,  load average: 1.91, 0.95, 0.73
      http://iuron.com - help build a non-profit search engine
