[Seaside] Apache frontend for Squeak, mod_scgi ?

Colin Putney cputney at wiresong.ca
Fri Sep 26 15:44:52 CEST 2003


On Friday, September 26, 2003, at 01:23 PM, Jimmie Houchin wrote:

> Avi Bryant wrote:
>
>> Now, the last format (that mod_proxy uses) is straight HTTP, which 
>> means
>> that Comanche will understand it.  The others need a different kind of
>> server which understands the mod_lisp format or the mod_scgi format
>> instead of the HTTP format.  There's no particular reason to believe 
>> that
>> these servers will be any faster than Comanche, although the 
>> particular
>> implementations may be.  I don't think request parsing is much of a
>> bottleneck anyway, it seems a funny thing to optimize.
>> *All* of these will be slower than connecting to Comanche directly.  
>> The
>> point of putting Squeak behind apache is not to somehow leverage 
>> apache's
>> performance, but to integrate better with other apache features - 
>> like,
>> for example, serving static content.
>
> This is the part I don't understand. I am no expert on any of this.
> All I know is that:
>
> Apache helloworld25k.html = 800 rps
> Comanche helloworld25k.html = 90 rps
> Apache, mod_python helloworld25k.py = 390 rps
> 	(script opening the 25k file and serving it to Apache)
> Medusa (python web server) helloworld25k.html = 25 rps
> This is all from memory and not necessarily totally accurate.

Ah, but these are all tests of performance at serving static pages. Of 
course Apache will be king here, it's straight C-code, highly tuned for 
doing exactly that. As soon as you introduce dynamic content, it's a 
whole different ball of wax.

> I don't intend for Apache to serve any static files. I'll use Tux for 
> static files and images. My only desire for Apache is to improve 
> dynamic requests. If Squeak/Comanche could equal or better 
> Apache/mod_python in performance on dynamic pages, I would leave 
> Apache alone.

In this case, as Avi mentioned, putting Apache in front of Squeak will 
only slow the whole process down. Assuming, of course, that you only 
have one Squeak server. If you've got a cluster of Squeak servers and 
you're using Apache for load balancing, you could probably get good 
performance under high load, although the time it takes to process a 
given request wouldn't improve.
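As a rough sketch of what that load-balancing setup might look like (assuming Apache with mod_rewrite and mod_proxy; the ports, paths, and map filename here are all invented for the example), you could spread requests across a pool of Squeak images with a random rewrite map:

```apache
# httpd.conf: pick a backend at random from the map for each request
RewriteEngine On
RewriteMap    squeak  rnd:/etc/apache/squeak-pool.map
RewriteRule   ^/seaside/(.*)$  http://${squeak:pool}/seaside/$1  [P,L]

# /etc/apache/squeak-pool.map -- one entry, alternatives separated by "|":
# pool  127.0.0.1:9090|127.0.0.1:9091|127.0.0.1:9092
```

Each request still takes just as long inside a single image; the win is only in aggregate throughput under load.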

> If I am misunderstanding my experience or doing something wrong, I 
> don't know. I am open to that. I don't think Apache/mod_python is 
> doing any caching. The script opens, reads and closes the file. Python 
> is persistent, long-lived, but the script should still execute the 
> open, read, close every time. Just a modest attempt at a simple 
> dynamic response.

I think the reason your mod_python test is so much faster than Comanche 
or Medusa is that it spends most of its time in C code. I'll bet 
the python code to open and output a file is only a couple of lines, am 
I right?
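Something like this, I'd guess. In mod_python it would be a request handler writing through the request object; here it's recast as a plain function (serve_file is an invented name) so the sketch stands on its own. Note that open and read drop straight into C, which is where the speed comes from:

```python
def serve_file(path):
    """Read a static file and return (headers, body).

    A couple of lines of Python; the actual I/O happens in C inside
    open() and read(), so very little time is spent in the interpreter.
    """
    with open(path, "rb") as f:
        body = f.read()
    headers = {
        "Content-Type": "text/html",
        "Content-Length": str(len(body)),
    }
    return headers, body
```

Calling serve_file("helloworld25k.html") and writing the headers and body back to the client is essentially the whole script.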

The difference between this and what you'll get with Quixote is 
probably quite significant. With Quixote, you'll incur the overhead of 
interprocess communication between Apache and your python process. With 
mod_python, you don't get this overhead, because python is running 
inside your Apache process.

I would expect Apache + mod_scgi + python server to be slower than 
straight Comanche.
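For reference, the mod_scgi wire format Avi mentioned is simple: per the SCGI spec, each request arrives as a netstring containing NUL-separated header key/value pairs (CONTENT_LENGTH first), followed by the raw body. A minimal parsing sketch (parse_scgi is an invented name; this is not mod_scgi's actual server code):

```python
def parse_scgi(data):
    """Split one SCGI request into (headers, body).

    The request is "<len>:<headers>," -- a netstring whose payload is
    NUL-separated key/value pairs -- followed by the raw request body.
    """
    length, _, rest = data.partition(b":")
    n = int(length)
    header_blob, comma, rest = rest[:n], rest[n:n + 1], rest[n + 1:]
    assert comma == b","                     # netstring terminator
    parts = header_blob.split(b"\x00")[:-1]  # trailing NUL leaves an empty tail
    headers = dict(zip(parts[0::2], parts[1::2]))
    body = rest[:int(headers[b"CONTENT_LENGTH"])]
    return headers, body
```

So a server behind mod_scgi parses this instead of HTTP; as Avi said, there's no particular reason that's faster than parsing HTTP, it's just a different format.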

I suspect that you're attempting to optimize prematurely. Ultimately 
the performance of your site will probably depend more on how you 
generate your dynamic content than on what language or HTTP server you 
use. If your database has 10s of millions of records, your app will 
probably spend more time waiting for data than parsing requests.

My best advice would be to worry more about writing your app than 
tuning it for right now. Once you get it up and running you can measure 
its performance, profile it, and find the most effective ways to get 
the performance you need. And that's the goal, right? "As fast as 
necessary," not "as fast as possible." It may be that you can get away 
with straight Comanche to start, and then ramp up performance as your 
user base grows.

If it were me, I'd write the app in Seaside/Squeak, knowing that to 
dramatically increase performance, I have many options:
	- create a cluster of Squeak servers, and put a load balancer in front 
of it
	- write a VM plugin to optimize critical sections of code
	- port to VisualWorks and get a 10x speed increase from JIT compilation

Colin


