[Seaside] Zinc - Twitter - Streaming API - Long lived connections

radoslav hodnicak rh at 4096.sk
Fri Oct 21 11:57:14 UTC 2011

On Fri, Oct 21, 2011 at 1:43 PM, Davorin Rusevljan
<davorin.rusevljan at gmail.com> wrote:
> On Fri, Oct 21, 2011 at 1:07 PM, radoslav hodnicak <rh at 4096.sk> wrote:
>> On Fri, Oct 21, 2011 at 12:14 PM, Igor Stasenko <siguctua at gmail.com>
>> wrote:
>> > when i worked on it, there was no Zinc yet, so i had to implement a
>> > somewhat simplified HTTP protocol (enough for use with CouchDB)
>> > on my own, over a plain socket connection.
>> Does the json parser support streaming? I have some big couchdb
>> results to work with and so far they fit into the memory, but they
>> might not always do that.
> How would you interface to such a json parser? I mean, how would it partition
> the result into smaller chunks that it can return to you? It would probably fire
> an event for each subnode parsed, but I doubt current parsers are equipped with
> that.

couchdb results have a defined structure: one of the attributes is
"rows", which contains the actual data, so it would stream the rows
one by one.
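the row-by-row idea could be sketched like this (Python rather than
Smalltalk, just to illustrate; the in-memory string stands in for the
HTTP response body, and a real client would refill a buffer from the
socket as it parses):

```python
import json

def stream_rows(text):
    # Yield each element of the top-level "rows" array one at a time,
    # instead of materializing the whole result set as a single object.
    # `text` stands in for the response body; a real client would refill
    # a buffer from the socket as it parses (omitted here for brevity).
    decoder = json.JSONDecoder()
    start = text.index('"rows"')          # assumes no earlier "rows" key
    start = text.index('[', start) + 1    # position just inside the array
    while True:
        # skip whitespace and the commas between array elements
        while start < len(text) and text[start] in ' \t\r\n,':
            start += 1
        if start >= len(text) or text[start] == ']':
            return
        row, start = decoder.raw_decode(text, start)
        yield row

response = '{"total_rows": 2, "offset": 0, "rows": [{"id": "a"}, {"id": "b"}]}'
print([row["id"] for row in stream_rows(response)])
```

each row is decoded independently with raw_decode, so only one row
object lives in memory at a time (plus whatever of the body has been
buffered).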

> Maybe refined map-reduce functions on the couchdb server could help avoid
> such large jsons?

sure, there are various ways to limit the data or split it into
chunks on the couchdb side. but sometimes it would be easier to just
say "hey give me everything you've got and i'll chew through it". I
run into these situations quite often when I need to update all
documents (or a subset of documents) for whatever reason, and you can
only do that one by one in couchdb.
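for the "chunks on the couchdb side" option, the usual recipe is to
page through a view with startkey/limit. a rough Python sketch (the
`fetch` callable is a placeholder for whatever HTTP client you use; it
takes the query params and returns the parsed JSON body):

```python
import json

def iter_all_docs(fetch, page_size=100):
    # Page through a CouchDB view using the startkey/limit recipe:
    # request page_size + 1 rows, hand back the first page_size, and
    # use the extra row's key as the next request's startkey.
    # `fetch` is a stand-in for an HTTP client call against the view.
    startkey = None
    while True:
        params = {"limit": page_size + 1}
        if startkey is not None:
            params["startkey"] = json.dumps(startkey)
        rows = fetch(params)["rows"]
        for row in rows[:page_size]:
            yield row
        if len(rows) <= page_size:
            return
        startkey = rows[page_size]["key"]
```

each yielded row can then be updated and written back individually, so
no single request ever has to return (or hold) the whole database.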
