<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<meta content="text/html;charset=ISO-8859-1" http-equiv="Content-Type">
<title></title>
</head>
<body bgcolor="#ffffff" text="#000000">
<br>
<blockquote cite="midad69ab6905040914081befc120@mail.gmail.com"
type="cite">
<pre wrap="">
Ok, 1500 concurrent sessions is a very big number; that's pretty
consistent with your image sizes, so I agree it probably doesn't have
anything to do with GLORP etc. But even with a 100-minute timeout
that sounds awfully high; do you really think you get, say, 10000
visitors per day? Or is something else going on? Either the peaks
are very, very heavy, or there's something wrong with the expiry.
</pre>
</blockquote>
<br>
I don't know. I know that most of the day's orders come in between 8:00
AM and 1:00 PM, so they do tend to cluster around a particular time
span.<br>
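As a quick sanity check on those numbers, here is a back-of-the-envelope sketch using Little's law (concurrent sessions equal arrival rate times session lifetime). The 1500 sessions and 100-minute timeout are from the thread; the 5-hour peak window is an assumption drawn from the 8:00 AM to 1:00 PM clustering:

```python
# Back-of-the-envelope session math via Little's law:
# concurrent sessions ~ arrival rate * session lifetime.
# 1500 sessions and the 100-minute timeout are from the thread;
# the 5-hour peak window is an assumption (8 AM - 1 PM clustering).

concurrent_sessions = 1500
timeout_minutes = 100

# Implied steady-state creation rate during the busy window:
sessions_per_minute = concurrent_sessions / timeout_minutes  # 15 per minute
sessions_per_hour = sessions_per_minute * 60                 # 900 per hour

peak_window_hours = 5
visitors_in_window = sessions_per_hour * peak_window_hours   # 4500 in the window

print(sessions_per_minute, sessions_per_hour, visitors_in_window)
```

On those assumptions, sustaining 1500 concurrent sessions would mean roughly 4500 new sessions during the morning window alone, which makes a five-figure daily visitor count plausible only if the peaks really are that heavy.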
<blockquote cite="midad69ab6905040914081befc120@mail.gmail.com"
type="cite">
<pre wrap="">
One instrumentation that would help here would be to simply log every
time a session is created. That should barely affect performance and
would be a good indication of what's going on.
</pre>
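The per-session-creation logging suggested above could be sketched like this (a minimal illustration, not Seaside's actual API; `log_session_created` and `creations_in_window` are hypothetical names, and a real app would append to a log file rather than an in-memory list):

```python
import time

SESSION_LOG = []  # stand-in for an append-only log file


def log_session_created(session_id, now=None):
    """Record one timestamped line per new session. Counting entries
    per minute afterwards makes a traffic spike (e.g. a DoS) obvious."""
    ts = now if now is not None else time.time()
    entry = (ts, session_id)
    SESSION_LOG.append(entry)
    return entry


def creations_in_window(log, start, end):
    """How many sessions were created in the half-open window [start, end)."""
    return sum(1 for ts, _ in log if start <= ts < end)
```

Because it is one append per session creation, the overhead is negligible, and the resulting timestamps answer exactly the question above: how many sessions are being created, and when.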
<blockquote type="cite">
<pre wrap=""> I might have gotten hit with a DOS attack or something on Thursday and
Friday, I don't know. The traffic numbers certainly would seem to support
that possibility.
</pre>
</blockquote>
<pre wrap=""><!---->
Yes. I wonder what strategies we can use to detect and cope with
that. One I can think of is to link the expiry time to how much the
application has been used: if all you do is request the homepage, your
session will expire very quickly, but if you look around a little more
you're given more time. That seem reasonable?
</pre>
</blockquote>
<br>
I think that's a great idea.<br>
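The usage-linked expiry idea could look something like this (a minimal sketch; the thresholds and the `timeout_minutes` helper are illustrative, not anything from the thread):

```python
# Adaptive session expiry: a session that has only fetched the homepage
# dies quickly; one that keeps being used earns more time.
# All three constants below are illustrative assumptions.

BASE_TIMEOUT_MIN = 5      # brand-new session, single request
MAX_TIMEOUT_MIN = 100     # ceiling for a well-used session
EXTRA_PER_REQUEST = 10    # extra minutes granted per additional request


def timeout_minutes(request_count):
    """Expiry window for a session that has served request_count requests."""
    extra = max(request_count - 1, 0) * EXTRA_PER_REQUEST
    return min(BASE_TIMEOUT_MIN + extra, MAX_TIMEOUT_MIN)
```

The nice property is that a homepage-only flood (the DoS scenario above) only ever holds sessions for the short base timeout, while real shoppers quickly climb toward the full window.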
<br>
Nevin<br>
<br>
</body>
</html>