<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1">
<title></title>
</head>
<body text="#000000" bgcolor="#ffffff">
Cees de Groot wrote:<br>
<blockquote type="cite"
cite="mid1072425650.2006.517.camel@home.home.cdegroot.com">
<pre wrap="">On Thu, 2003-12-25 at 20:52, Nevin Pratt wrote:
</pre>
<blockquote type="cite">
<pre wrap="">Could an RSS feed potentially be a solution for Seaside's unfriendliness towards spiders?
</pre>
</blockquote>
<pre wrap=""><!---->Janus is my shot at a solution. It seems to work fine; my Seaside site
has shot up in traffic and search engine presence over the last months
(from basically nothing to currently 160 visitors/day).
</pre>
</blockquote>
<br>
I still can't help but wonder if a simple solution isn't just staring
us in the face:<br>
<br>
Why not use Janus's bot-detection code, but then simply never let the
session expire for the session(s) where a bot was detected? That way we
could forget static page caches and all these other assorted
"dual-faced" schemes.<br>
<br>
I've seen the Google-bot follow the strange Seaside links, and even
index them. What it doesn't seem to like, though, is that on its next
visit, the old links no longer work. So, if the sessions that the
bots used never expired, wouldn't that solve the problem?<br>
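<br>
For what it's worth, here is a rough sketch of the idea in Python rather than Seaside/Smalltalk. The User-Agent patterns and the Session class are placeholders of my own, not Janus's actual detection code or Seaside's session machinery; the point is only to show sessions whose expiry check is skipped when the creator looked like a bot:<br>
<br>

```python
import re
import time

# Placeholder User-Agent patterns; Janus's real bot list may differ.
BOT_PATTERN = re.compile(r"googlebot|slurp|crawler|spider", re.IGNORECASE)

def is_bot(user_agent):
    """Crude bot detection by User-Agent, standing in for Janus's check."""
    return bool(BOT_PATTERN.search(user_agent or ""))

class Session:
    """Hypothetical session object with a per-session expiry policy."""
    def __init__(self, user_agent, timeout=600.0):
        self.last_access = time.time()
        self.timeout = timeout
        self.is_bot = is_bot(user_agent)

    def touch(self):
        self.last_access = time.time()

    def expired(self, now=None):
        # Bot sessions never expire, so the crawler's old links keep working.
        if self.is_bot:
            return False
        if now is None:
            now = time.time()
        return now - self.last_access > self.timeout

# A bot session survives a long idle period; a human session does not.
bot = Session("Mozilla/5.0 (compatible; Googlebot/2.1)")
human = Session("Mozilla/5.0 (Windows NT 10.0)")
later = time.time() + 3600
print(bot.expired(later), human.expired(later))  # False True
```

<br>
Whether keeping bot sessions alive indefinitely is affordable in practice would depend on how much state each Seaside session holds per request.<br>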
<br>
Nevin<br>
<br>
<pre cols="72" class="moz-signature">--
Nevin Pratt
Bountiful Baby
<a class="moz-txt-link-freetext" href="http://www.bountifulbaby.com">http://www.bountifulbaby.com</a>
(801) 992-3137
</pre>
</body>
</html>