Neural nets (was: Re[2]: Info on Smalltalk DSLs or Metaprogramming...)
Herbert König
herbertkoenig at gmx.net
Mon Sep 4 20:44:51 UTC 2006
Hello Rich,
RW> I don't really know anything about FANN. I was planning on just
RW> implementing my own (they're not that hard). I'm torn because FANN
No, they aren't, so I ended up writing my own too.
RW> Portability is probably more important than speed at this point, so I
RW> may just implement my own.
There is a Squeak implementation of a two-layer perceptron with
backpropagation learning by Luciano Notarfrancesco. I decided against
it because it was too slow for me, though its Smalltalk and OO style
is much better than mine.
I needed speed in computing the outputs, and FloatArray is fast at #*
and #sum.
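For illustration, computing one neuron's output that way looks
roughly like this (numbers made up, sigmoid activation assumed):

```smalltalk
"One neuron's activation via FloatArray: #* is elementwise
 multiplication and #sum adds the products, both backed by fast
 primitives. The numbers are made up for illustration."
| inputs weights net activation |
inputs := FloatArray withAll: #(0.5 0.2 0.9).
weights := FloatArray withAll: #(0.1 0.4 0.3).
net := (inputs * weights) sum.                 "dot product"
activation := 1.0 / (1.0 + net negated exp).   "sigmoid squashing"
```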
RW> The project I'm working on is actually an aLife simulation. The
I'll take a look when I'm online again.
RW> neural nets would only be a small portion (the brains of the agents).
RW> I like the idea of using them, since the weights can be learned
RW> genetically (over several generations) as well as modified within a
RW> single generation (using reinforcement learning).
Reinforcement learning is said to be slow. How many inputs and
neurons would such a brain have? And how many agents?
Training a single perceptron with 140 inputs, 64 hidden neurons and
16 output neurons for 500 epochs of 400 samples took over an hour on
a 1.8 GHz Pentium M.
Do you have any pointers on how to use genetic algorithms with neural
nets? Something practical, please -- I'm an EE, not a CS person :-)
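Is the idea roughly something like this? (Only my naive guess, all
names and numbers invented:)

```smalltalk
"My naive guess at genetic weight learning: mutate each agent's
 weight vector a little and keep the fitter brains. All names and
 numbers here are invented."
| random mutate parent child |
random := Random new.
mutate := [:weights |
    weights collect: [:w |
        random next < 0.1                        "mutate ~10% of weights"
            ifTrue: [w + ((random next - 0.5) * 0.2)]
            ifFalse: [w]]].
parent := FloatArray withAll: #(0.1 0.4 0.3).
child := mutate value: parent.
"then keep the child only if its agent outperforms the parent"
```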
Cheers,
Herbert mailto:herbertkoenig at gmx.net
More information about the Squeak-dev mailing list