[Newbies] Re: Tim's Fix for LargeInteger>>atRandom
peace_the_dreamer at yahoo.com
Mon Aug 4 07:16:27 UTC 2008
Re: Tim's Fix for LargeInteger>>atRandom
was: [Newbies] Re: What's wrong with this statement?
Glad you are working on a fix.
I looked at your first pass.
It looks like it would work, but it may hurt performance.
As Randal mentioned, it uses up random numbers
six and a half times faster than necessary.
What happens if you try it with the oddity test I put in the report?
Does it work well enough to pass that test?
(The data looks ok: some even, some odd. Cool.)
Feel free to strengthen the test by adding more cases.
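I have not reproduced the exact test from the bug report here; a minimal parity check along those lines, sketched in Python for convenience (with `random.randint` standing in for `atRandom`, and the trial count chosen arbitrarily), might look like:

```python
import random

def parity_test(bound, trials=1000):
    """Check that random draws in [1, bound] include both odd and even
    values -- a biased generator that only returns even numbers fails."""
    draws = [random.randint(1, bound) for _ in range(trials)]
    odd = sum(d % 2 for d in draws)
    even = trials - odd
    # Both parities should appear; with 1000 draws, a fair generator
    # misses one parity entirely only with astronomically small odds.
    return odd > 0 and even > 0

print(parity_test(2 ** 128))  # True for an unbiased generator
```

The original bug showed up exactly here: the low bits of large random integers were not random, so a check this simple already catches it.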
I would leave the method in Integer as it was, but add one in LargePositiveInteger.
This would use the hierarchy of classes to make the distinction that is needed.
SmallIntegers and negative large integers would use the current method.
The method for large ints would look similar to the way you broke it down.
Do a small piece and then the remaining piece.
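To sketch that dispatch idea (in Python for convenience; the class names mirror Squeak's, but the `RADIX` cutoff and the rejection loop are my own assumptions, not part of the proposal):

```python
import random

class Integer:
    """Stand-in for the existing class: atRandom stays as it is today."""
    def __init__(self, value):
        self.value = value

    def at_random(self):
        return random.randint(1, self.value)  # current behaviour

class LargePositiveInteger(Integer):
    """Only this subclass overrides atRandom, so SmallIntegers and
    negative large integers keep the current method automatically."""
    RADIX = 2 ** 32  # hypothetical cutoff the generator covers natively

    def at_random(self):
        if self.value <= self.RADIX:
            return random.randint(1, self.value)
        while True:
            # Remaining piece: recurse for the high bits (0 .. value // RADIX).
            hi = LargePositiveInteger(self.value // self.RADIX + 1).at_random() - 1
            # Small piece: low bits drawn directly.
            lo = random.randrange(self.RADIX)
            candidate = hi * self.RADIX + lo + 1
            if candidate <= self.value:  # reject overshoot to stay unbiased
                return candidate
```

The point is only that the class hierarchy, not a type test inside one method, selects the behaviour.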
Instead of converting to byte arrays I would just use arithmetic for the recursion.
We can choose a large power of two for the radix because each random number
carries only about 50 bits of information (on the order of 2^50 distinct values).
LargeInt can probably divide efficiently by powers of two.
We don't have to worry about this on
the first (um, second) pass. (You did a first pass :-).
Once you have a pass that works we could test the next one against it
by using a separate generator and noting a beginning seed.
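For example (a Python sketch of the idea; `at_random_v1` is a placeholder implementation, and the seed value is arbitrary): give each run its own seeded generator, so any discrepancy between two versions can be replayed exactly.

```python
import random

def draws(impl, bound, seed, n=5):
    """Run an atRandom-style implementation against its own seeded
    generator so a failing case can be replayed exactly."""
    rng = random.Random(seed)
    return [impl(bound, rng) for _ in range(n)]

def at_random_v1(bound, rng):
    # Placeholder for the first working pass.
    return rng.randint(1, bound)

# Same seed, same sequence: a regression in a later pass shows up as a
# reproducible difference rather than a one-off fluke.
print(draws(at_random_v1, 2 ** 128, seed=42) ==
      draws(at_random_v1, 2 ** 128, seed=42))  # True
```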
(waves hands in the air.) I think I am getting ahead of myself here.
Outline of the new method:
radix := <a truly randomizable power of 2>
self > radix ifFalse: [ ^ aGenerator nextInt: self ] .
loBits := self \\ radix .
hiBits := self // radix .
^ hiBits atRandom * radix + loBits atRandom .
(This is gedanken code to sketch things out;
you may have to vary it to get it to work.)
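Translating that outline into runnable form needs two small adjustments: the low part should be zero-based, and a combined value that overshoots the bound should be rejected and redrawn. A Python transliteration under those assumptions (`RADIX` chosen arbitrarily; `rng.randint` stands in for `aGenerator nextInt:`):

```python
import random

RADIX = 2 ** 32  # stand-in for "a truly randomizable power of 2"

def at_random(n, rng=random):
    """Uniform draw from [1, n], splitting n at RADIX as in the outline."""
    if n <= RADIX:
        return rng.randint(1, n)               # aGenerator nextInt: self
    hi_bound = n // RADIX                      # hiBits in the outline
    while True:
        hi = at_random(hi_bound + 1, rng) - 1  # 0 .. hi_bound, recursing
        lo = rng.randrange(RADIX)              # zero-based low bits
        candidate = hi * RADIX + lo + 1
        if candidate <= n:                     # reject overshoot past n
            return candidate
```

A call like `at_random(2 ** 128)` stays in range and produces both parities, which is exactly what the oddity test checks.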
I would get something like the above working first.
Test it against at least the oddity test.
Then maybe play with performance, if you suspect that would be important.
If the number is negative, the Integer method will handle that as it already does.
If any of the numbers become SmallIntegers, that's covered by what gets done now.
For large ints the problem gets broken down to reasonable levels and dealt with.
Yours is curiosity and service, --Jerome Peace
>Somebody check me out here, 'cause I think I'm close but something
>doesn't seem quite right.
>Attached is a new Integer>>atRandom I'm playing with. I break the
>integer into a ByteArray and call #atRandom for every byte, using 255
>as the bounding value for all but the first byte. The bounding value
>for the first byte is itself minus 1. Concatenated together we get a
>byte array representing a random value less than the bounding value.
>I think. Maybe I'm doing this wrong. I feel like the first byte
>treatment is wrong but I don't know how to achieve the same
>result--without it the random value often exceeds the bounding value.
>I went this route as it avoids doing a bunch of bit shifts I'd
>probably get wrong anyway.
>10 timesRepeat: [Transcript show: ((2 raisedTo: 128) atRandom hex); cr]
>On Sat, Aug 2, 2008 at 11:00 AM, Cerebus <cerebus2 at gmail.com> wrote:
>> Bug submitted.
>> I may even attempt a fix. :)
>> -- Tim