From kake@earth.li Wed Jun 25 15:08:07 2008
From: Kake L Pugh
To: openguides-dev@lists.openguides.org
Subject: [OGDev] Robot deterrence
Date: Wed, 25 Jun 2008 15:07:59 +0100
Message-ID: <20080625140759.GA478@the.earth.li>
MIME-Version: 1.0
Content-Type: multipart/mixed; boundary="===============5527850264052663154=="

--===============5527850264052663154==
Content-Type: text/plain; charset="utf-8"
Content-Transfer-Encoding: quoted-printable

There are a number of OpenGuides page types that web spiders don't really
need to index, and we have code to stop them doing it, e.g.

  http://dev.openguides.org/changeset/573
  http://dev.openguides.org/changeset/1132

However, it doesn't seem to be working.  See for instance:

  http://london.randomness.org.uk/wiki.cgi?action=list_all_versions;id=Locale%20IG9

which, if you view the source, does indeed have a robots "noindex" <meta>
tag in the <head>.  But from the Apache logs:

66.249.67.153 - - [25/Jun/2008:14:59:00 +0100] "GET /wiki.cgi?action=list_all_versions;id=Locale%20IG9 HTTP/1.1" 200 3151 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

Am I missing something obvious?

Kake

--===============5527850264052663154==--
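
One way to confirm what a page actually serves to crawlers is to check its HTML for a robots "noindex" directive. A minimal sketch using Python's standard library follows (the class and function names are hypothetical, invented for illustration). Worth noting: a robots <meta> tag only suppresses *indexing*, not *fetching*, so a crawler must still download the page in order to see the tag at all, and those GET requests will still appear in the Apache access logs.

```python
# Check whether a page's HTML declares "noindex" to robots.
# A robots <meta> tag stops indexing, not crawling: the spider has to
# fetch the page to read the tag, so log entries are still expected.
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.directives.append((d.get("content") or "").lower())


def is_noindex(html):
    """Return True if the HTML carries a robots "noindex" directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in c for c in parser.directives)
```

For example, feeding it the source of the list_all_versions page above should return True if the tag is being served:

```python
is_noindex('<head><meta name="robots" content="noindex,nofollow"></head>')
# -> True
```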