I am curious how, in practice, a Swiki performs. In particular, how practical are these things for communities of several hundred people?
The current Swikis haven't been optimized much at all, but as Mark Guzdial posted earlier, they are handling his ~100 student sophomore class without trouble.
I can quantify this a little more. We did some quick tests on a Mac (Bolot, Je77, do you know what kind of Mac it is?), and found two things that are bottlenecks right now:
1. Reading a Swiki's source text, i.e. SwikiPage>>text, takes over 100 milliseconds by itself. This could obviously be improved: contentsOfEntireFile on the same file took only 15-20 milliseconds.
2. "Swikifying" a page is currently dominated by the check for whether a particular line break falls between < and >. The check is needed, but it could be made more efficient. With the check, the time taken is about 200 milliseconds; without it, the time is down in the 10-20 millisecond range, if I remember right. It was definitely a LOT less.
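To illustrate why that check dominates: if each line break triggers a rescan of the preceding text, the cost is roughly quadratic, whereas tracking the tag state in a single pass is linear. Here is a minimal Python sketch of the two approaches (the real code is Squeak Smalltalk; the function names and the `<br>` substitution are hypothetical stand-ins for whatever the Swiki renderer actually emits):

```python
def swikify_naive(text):
    # For every line break, rescan everything before it to decide whether
    # the break falls between '<' and '>' (i.e. inside an HTML tag).
    # That rescan costs O(n) per break, so the whole pass is ~O(n^2).
    out = []
    for i, ch in enumerate(text):
        if ch == '\n':
            inside = False
            for c in text[:i]:      # the expensive rescan
                if c == '<':
                    inside = True
                elif c == '>':
                    inside = False
            out.append('\n' if inside else '<br>\n')
        else:
            out.append(ch)
    return ''.join(out)

def swikify_single_pass(text):
    # Same output, but the tag state is carried along in one O(n) pass.
    out = []
    inside = False
    for ch in text:
        if ch == '<':
            inside = True
        elif ch == '>':
            inside = False
        if ch == '\n' and not inside:
            out.append('<br>\n')
        else:
            out.append(ch)
    return ''.join(out)
```

Both functions leave line breaks inside a tag untouched and convert the rest, so the single-pass version is a drop-in replacement in this sketch.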
So in summary, on this modest Macintosh:
1. Currently, it can sustain about 2 Swiki page requests per second, which is reasonable for a group of a few hundred people.
2. It can be easily improved to at least 20 regular requests per second (50 ms per hit).
Lex
On Mon, 3 May 1999, Lex Spoon wrote:
> > I am curious how, in practice, a Swiki performs. In particular, how practical are these things for communities of several hundred people?
> The current Swikis haven't been optimized much at all, but as Mark Guzdial posted earlier, they are handling his ~100 student sophomore class without trouble.
I've used Swikis for 25-40 person classes for, gosh, over a year. It does OK. I have my own swikifying routine which avoids some of the speed hit Lex pointed out.
> I can quantify this a little more. We did some quick tests on a Mac (Bolot, Je77, do you know what kind of Mac it is?), and found two things that are bottlenecks right now:
> - Reading a Swiki's source text, i.e. SwikiPage>>text, takes over 100 milliseconds by itself. This could obviously be improved: contentsOfEntireFile on the same file took only 15-20 milliseconds.
I'm going to change that tomorrow! Whoa. This must absolutely *kill* on searches. And it certainly explains the interminable restore times.
> - "Swikifying" a page is currently dominated by the check for whether a particular line break falls between < and >. The check is needed, but it could be made more efficient. With the check, the time taken is about 200 milliseconds; without it, the time is down in the 10-20 millisecond range, if I remember right. It was definitely a LOT less.
Wow. I didn't realize the check was *that* painful. And here I thought it was my clever parser that made mine so much quicker. <wince/> Good to know though.
However, there seem to be other places where Swikis can be a tad problematic: if someone else is accessing a page while PWS is busy processing (say, doing a search), I get a "server unavailable". This is with Squeak 2.1. If this has been fixed (via threading?) in later versions I'd be happy, except for the weird slowdowns I experience :(
One thing I think would be nice is if the text and the "metadata" of a SwikiPage were stored in separate files. Right now, you only save things like name, edit time, etc. when you edit the page. Thus, if your server crashes you lose all metadata since the last edit or image save.
On the other hand, I'm eager to see some sort of PWS connection to MinneStore. Now *that* will be interesting.
Cheers, Bijan
On Mon, 3 May 1999, Lex Spoon wrote:
> > I am curious how, in practice, a Swiki performs. In particular, how practical are these things for communities of several hundred people?
> The current Swikis haven't been optimized much at all, but as Mark Guzdial posted earlier, they are handling his ~100 student sophomore class without trouble.
One solution which I have been using is the excellent CachedSwikiAction, which writes copies of the rendered HTML files to disk at the time they are edited and allows them to be served by Apache. This lets me run a Swiki on hardware which would otherwise be a bit on the slow side (some old Sun IPXs), and also allows more than one machine to serve pages in parallel. (I still haven't worked out how to get more than one machine to *edit* Swiki pages in parallel :)
> One thing I think would be nice is if the text and the "metadata" of a SwikiPage were stored in separate files. Right now, you only save things like name, edit time, etc. when you edit the page. Thus, if your server crashes you lose all metadata since the last edit or image save.
> On the other hand, I'm eager to see some sort of PWS connection to MinneStore. Now *that* will be interesting.
> Cheers, Bijan
I would have to agree that the current method of storing Swiki pages is not ideal - it makes it hard to do things like merging pages into new Swikis, renaming Swikis, etc. I had been toying with ideas such as storing in XML/HTML, but linking PWS into MinneStore is a wonderful idea! What I would really like to see is a PWS where URLs refer to objects (on or off the image) which know how to serve themselves...
cheers,
Russell
---------------------------------------- Russell Allen
russell.allen@firebirdmedia.com
----------------------------------------
squeak-dev@lists.squeakfoundation.org