[Seaside] Re: Static sites and spider handling

cg at cdegroot.com
Tue Aug 26 23:57:08 CEST 2003


Colin Putney  <seaside at lists.squeakfoundation.org> said:
>[...]. Both the dummy sites and 
>the real site would detect bots and dynamically generate really 
>optimized versions of their pages - high keyword densities, urls with 
>keywords in them etc.
>
This is of course decidedly different from my proposal, which is to feed a
'bot-friendly' version of a site (rather than a 'bot-deceiving' one).
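
To make the 'bot friendly' idea concrete: the usual mechanism is to key on
the User-Agent header and route known crawlers to a static, easily indexable
rendering of the same content. Here is a minimal sketch, in Python/WSGI
terms rather than Seaside's Smalltalk, with purely illustrative crawler
signatures and function names (none of this is from any actual site):

    # Illustrative crawler User-Agent substrings; a real list would be
    # longer and kept up to date.
    BOT_SIGNATURES = ("googlebot", "slurp", "msnbot")

    def is_bot(environ):
        """Return True if the User-Agent header matches a known crawler."""
        agent = environ.get("HTTP_USER_AGENT", "").lower()
        return any(sig in agent for sig in BOT_SIGNATURES)

    def bot_friendly(dynamic_app, static_app):
        """WSGI middleware: crawlers get the static rendering,
        humans get the dynamic (session-based) application."""
        def middleware(environ, start_response):
            target = static_app if is_bot(environ) else dynamic_app
            return target(environ, start_response)
        return middleware

The point is that both branches serve the same content; only the form
differs. That is what separates this from the cloaking scheme quoted above.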

>We were advised not to do this by a search engine consultant, as she 
>had a few clients that were just about blacklisted by the search 
>engines. We ended up getting by on the strength of the domain name, the 
>fact that the site had been around forever, and all the genuinely 
>useful information we put on the site.
>
Yup. That (and Submitwolf, a search engine submission package) has
brought my private homepage a steady 500-700 visitors per day. 

-- 
Cees de Groot               http://www.cdegroot.com     <cg at cdegroot.com>
GnuPG 1024D/E0989E8B 0016 F679 F38D 5946 4ECD  1986 F303 937F E098 9E8B
Cogito ergo evigilo