I much prefer 1.0 to 0.999...9. Thanks Nicolas.
Here is a better argument from David M. Gay in "Correctly Rounded Binary-Decimal and Decimal-Binary Conversions".
Why would anyone care how long a computer takes to print floats? If it's talking to other machines, they can speak in binary: the format is more standard than the printed ones. If humans are listening in, any machine is fast enough.
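For what it's worth, the shortest-round-trip printing that Gay's correctly rounded conversions make possible is easy to see in Python (a sketch in Python rather than Squeak; the behaviour shown is Python 3's repr, not Squeak's printString):

```python
# Python 3's repr emits the shortest decimal string that converts back
# to exactly the same float, so 1.0 prints as "1.0", never "0.999...9".
x = 1.0
print(repr(x))                 # prints 1.0

# The round trip is exact: parsing the printed string recovers the float.
assert float(repr(x)) == x

# This also holds for values with no exact decimal representation:
y = 0.1
assert float(repr(y)) == y
```

The same guarantee means a program can print floats for humans and still hand the identical values to another machine by reparsing the text.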
Rodney
On Mon, 7 Nov 2011, Rodney Polkinghorne wrote:
> I much prefer 1.0 to 0.999...9. Thanks Nicolas.
>
> Here is a better argument from David M. Gay in "Correctly Rounded Binary-Decimal and Decimal-Binary Conversions".
>
> Why would anyone care how long a computer takes to print floats? If it's talking to other machines, they can speak in binary: the format is more standard than the printed ones. If humans are listening in, any machine is fast enough.
You can't always decide how your program will communicate with other programs. Producing a large number of human-readable documents (e.g. XML), or just a few with lots of floats in them, is also pretty common.
Levente
> Rodney
squeak-dev@lists.squeakfoundation.org