[Seaside] GOODS best practice data storage

Sebastián Sastre ssastre at seaswork.com.ar
Thu May 20 21:28:50 CEST 2004


> Committing 13000 objects at once is definitely going to be slow.  If 
> you have a use case where you need to be committing thousands of 
> objects on a regular basis, then I don't know what to tell 
> you (except 
> to make sure that you're using a recent version of my GOODS client; 
> this was sped up a lot a few versions ago, but still not enough that 
> you'd want to be doing it during interactive use).  I bet we 
> can speed 
> it up by 2 or 3x through profiling, but I doubt we can do much more 
> than that.

Avi, I really don't want to commit 13000 new objects every time. Right
now I'm in the middle of the migration process; I can migrate them once,
in one large batch, and then just use them.
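
What I have in mind for that one-off migration is roughly this (just a
sketch, not tested; it assumes the same KKDatabase protocol I use in the
snippets below):

	items := RDBMSDatabase allItems.
	db := KKDatabase onHost: 'voyager' port: 6101.
	db root: Dictionary new.
	db root at: #items put: Dictionary new.
	"Add everything first, then commit a single time at the end."
	items do: [:each |
		(db root at: #items) at: each identifierCodeString put: each].
	db commit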

I think the way the objects are stored is the key.

At first I was experimenting with this:

	items := RDBMSDatabase allItems.
	db := KKDatabase onHost: 'voyager' port: 6101.
	db root: Dictionary new; commit.
	db root at: #items put: OrderedCollection new.
	1 to: items size do: [:i |
		item := items at: i.
		(db root at: #items) add: item.
		db commit].

Then I switched to this:
	
	items := RDBMSDatabase allItems.
	db := KKDatabase onHost: 'voyager' port: 6101.
	db root: Dictionary new; commit.
	db root at: #items put: Dictionary new.
	db commit.
	1 to: items size do: [:i |
		item := items at: i.
		(db root at: #items) at: item identifierCodeString put: item.
		db commit].

In both cases each transaction takes longer than the previous one, so the
whole process keeps slowing down as the collection grows.
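
If a single commit of all 13000 objects turns out to be too big, I guess I
could commit in batches instead of per object, something like this (again
only a sketch with the same assumptions; 500 is an arbitrary batch size):

	items := RDBMSDatabase allItems.
	db := KKDatabase onHost: 'voyager' port: 6101.
	db root: Dictionary new; commit.
	db root at: #items put: Dictionary new.
	db commit.
	1 to: items size do: [:i |
		item := items at: i.
		(db root at: #items) at: item identifierCodeString put: item.
		"Commit every 500 items instead of on every iteration."
		i \\ 500 = 0 ifTrue: [db commit]].
	"Commit whatever is left from the last partial batch."
	db commit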

I'm using the Dolphin port based on your Squeak GOODS client, the version
I downloaded in March 2004. It's available at:
http://gatekeeper.dynalias.org:8888/GoodsST/uploads/4/GOODS_Client.6.zip


> > 	1)  How can I make a better use of GOODS? and
> >
> > 	2) How can I tune it or index it?
> 
> These are hard questions to answer in the abstract.  What are your 
> actual needs?  What are you trying to accomplish?

My needs are pretty normal, I think. I have a catalogue with nearly 50K
objects; 13000 of them are more heavily loaded (composed of other objects:
reservations, stock availability, equivalences, fractions, offers, etc.).

The main application consults the catalogue and performs transactions: it
buys, sells, manages requests, etc.

Right now I'm using an RDBMS with a framework that maps Smalltalk classes,
and I'm planning to migrate to an OODBMS. But the results so far are not
what I was expecting.

Please, if you see a more efficient way of using this client, let me know.

Thank you,

Sebastián Sastre
ssastre at seaswork.com.ar
www.seaswork.com.ar


> 
> Avi
> 



