[Seaside] Static sites and spider handling

Avi Bryant avi at beta4.com
Tue Aug 26 14:40:05 CEST 2003


On Tue, 26 Aug 2003, Colin Putney wrote:

> We were advised not to do this by a search engine consultant, as she
> had a few clients that were just about blacklisted by the search
> engines. We ended up getting by on the strength of the domain name, the
> fact that the site had been around forever, and all the genuinely
> useful information we put on the site.

I did a little research just now; my impression is that this kind of
blacklisting is not done automatically, but based on human observation.
Bear in mind that what we're talking about does not present the search
engine with any actual content that differs from what a human would
see - we're simply tweaking the *structure* of the site to make it
more robot-friendly.  Unlike most "cloaked" sites, for example, we
wouldn't want to disable the Google cache using this approach (it
might be a little odd that the cached page had no links on it, but the
content and layout would all be normal).

I would be surprised if this ended up genuinely annoying Google, unless
they're trying to make some kind of political point about cloaking.  If
a human did end up auditing the site, it ought to be pretty obvious that
the aim is to help search engines, not to trick them.
