Neural nets was: Re[2]: Info on Smalltalk DSLs or Metaprogramming...

Rich Warren rwmlist at gmail.com
Mon Sep 4 04:55:05 UTC 2006


Hi,

I don't really know anything about FANN. I was planning on just
implementing my own (they're not that hard). I'm torn because FANN
might save time, but it would also make the code less portable (I'd
need the FANN library installed on any machine that runs the code).

Portability is probably more important than speed at this point, so I  
may just implement my own.
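
Just to make "not that hard" concrete, here's the sort of thing I have
in mind -- a one-hidden-layer forward pass you can paste straight into
a Workspace. Purely a sketch with made-up weights and inputs, nothing
to do with FANN:

  | inputs weights biases sigmoid hidden |
  inputs  := #(0.5 0.2 0.9).                    "one input vector"
  weights := #((0.1 -0.3 0.7) (0.4 0.4 -0.2)).  "2 hidden units x 3 inputs"
  biases  := #(0.0 0.1).
  sigmoid := [:x | 1.0 / (1.0 + x negated exp)].
  hidden  := weights with: biases collect: [:row :b |
      sigmoid value: (row with: inputs collect: [:w :i | w * i]) sum + b].
  hidden   "print it -> activations of the hidden layer"

Stacking another layer on top is just the same expression again with
the hidden activations as the new inputs.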

The project I'm working on is actually an aLife simulation. The  
neural nets would only be a small portion (the brains of the agents).  
I like the idea of using them, since the weights can be learned  
genetically (over several generations) as well as modified within a  
single generation (using reinforcement learning).
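
The genetic part should be easy enough if each agent keeps all of its
weights in one flat array and treats that array as its genome --
roughly this kind of thing (again only a sketch; the 10% mutation rate
and step size are made up):

  | rand genome mutated |
  rand    := Random new.
  genome  := #(0.1 -0.3 0.7 0.4 0.4 -0.2 0.0 0.1).  "weights and biases, flattened"
  mutated := genome collect: [:w |
      rand next < 0.1                               "mutate ~10% of the genes"
          ifTrue: [w + ((rand next - 0.5) * 0.2)]   "small uniform perturbation"
          ifFalse: [w]].
  mutated   "child genome, ready to load back into a net"

Crossover is then just splicing two such arrays at a random cut point,
and the reinforcement-learning updates within a generation can nudge
the same array while the agent is alive.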

-Rich-

On Sep 2, 2006, at 9:41 PM, Herbert König wrote:

> Hello Rich,
>
>
> RW> Here's an example of what I'm talking about. I'm about to
> RW> implement a neural net. I'll be experimenting with many different
> RW> sizes and topologies. This is an excellent opportunity for a
> RW> DSL. I could
>
> I'm in the process of implementing neural networks too. I'd like to
> share thoughts (here or off list). That would also force me to write
> something down :-)
>
> I got stalled due to business demands, but I hope to find time to
> put something on SM some time. From this thread I gather that you
> should try the Squeak binding to the FANN library (Fast Artificial
> Neural Networks).
>
> I haven't tried it myself, but it sounds promising.
>
> The most interesting of my own experiments is a self-organising
> feature map to cluster the information for a subsequent net. Maybe
> because I wrote a graphical trainer, I'm easily impressed by
> graphics :-))
>
> Cheers,
>
> Herbert                            mailto:herbertkoenig at gmx.net
>
>



