spiders and docs (was: [Seaside] REST and Seaside)

Cees de Groot cg at cdegroot.com
Tue Apr 12 15:49:01 CEST 2005


On Tue, 12 Apr 2005 00:48:21 +0200, Avi Bryant <avi.bryant at gmail.com>  
wrote:

> Ok, so clearly this is a problem we need to deal with (I've never seen
> it because all the apps I've deployed have login pages :).  Does
> anyone have any suggestions?

Seaside should probably catch requests for /?.*/robots.txt and return a  
file prohibiting robots from accessing any URL. That would stop 99.99% of  
the bots before they can do any damage.
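For reference, the blanket file such a handler would return is just the  
two-line deny-all form of the robots exclusion protocol:

    User-agent: *
    Disallow: /

Well-behaved crawlers fetch /robots.txt before anything else, so they  
would turn away without ever requesting a session URL.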

Then invent some mechanism (Janus-like, maybe - it is quite a generic  
thing) for selectively opening up parts of Seaside apps to bots. Or use  
'my' HV+Seaside suggestion, forbidding bots from entering the Seaside part.
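As a sketch of that second setup - assuming HV serves the static,  
crawlable pages at the root and the Seaside dispatcher sits at the  
conventional /seaside path - the robots.txt would open everything except  
the Seaside subtree:

    User-agent: *
    Disallow: /seaside/

Bots could then index the HV pages freely but never wander into the  
session-generating Seaside URLs.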

