[Seaside-dev] Encoding problems with binary vs ascii

Lukas Renggli renggli at gmail.com
Tue Jun 16 05:52:16 UTC 2009


> The encoder is initialized with a non-binary write stream, then it's told to
> become binary. You can't do that - the encoder has no way of knowing what's
> inside its inner stream, nor should it. If you intend to put bytes into the
> stream, start it with a ByteArray.
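
(For reference, that means creating the stream over a ByteArray from
the start, in plain Smalltalk:

  stream := WriteStream on: ByteArray new.
  stream nextPutAll: #[72 105].  "raw bytes, no character encoding involved"
)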

The most important thing to consider is to avoid copying streams
around. The current implementation is built so that Seaside can
stream directly onto the socket; there is no copying or conversion
in the server adapter.
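
As a sketch of that no-copy path (all names here are illustrative,
not the actual Seaside classes):

  "hypothetical adapter loop: wrap the socket stream once and let
   Seaside write the response directly onto it, no buffering"
  socketStream := SocketStream on: clientSocket.
  encoder := EncoderStream on: socketStream encoding: self codecName.
  requestHandler handleRequest: request respondingOn: encoder.
  socketStream flush; close.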

Since the target stream is not created by Seaside, you have no
control over its initialization. You also cannot know upfront what
the client wants to do with the request. #binary disables any
encoding and transfers raw data; #ascii is the default and applies
the encoding configured in the server adapter.
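
To make the distinction concrete, a minimal sketch with an assumed
EncoderStream in front of the socket stream (the class and its
instantiation are placeholders, not the real Seaside API):

  "#ascii, the default: writes go through the configured codec"
  encoder := EncoderStream on: socketStream encoding: 'utf-8'.
  encoder nextPutAll: '<p>héllo</p>'.  "characters encoded to UTF-8"

  "#binary: the codec is bypassed, bytes reach the socket untouched"
  encoder binary.
  encoder nextPutAll: fileContents asByteArray.

The point is that #binary only changes the encoder's own behaviour;
the inner socket stream is never replaced or copied.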

Lukas

-- 
Lukas Renggli
http://www.lukas-renggli.ch

