AI project: "world state" seems to be required.

Alan Grimes squeak-dev at lists.squeakfoundation.org
Mon Sep 16 06:48:45 UTC 2002


Apologies again for the long turn-around... It can be expected that
this will be a continuing thing as I work on my projects off and on... 

> > My previous question about how to "start the clock" of world morph 
> > and cause it to function so that it can support normal squeak 
> > functionality (or some restricted version thereof) remains open.

> Why not just hook into the World state of the current project? 

I want the AI's playpen to be a subset of the user's range. The user
must have full access to the system, where the AI has access to some
truncated portion of it. It is also highly desirable to control the
extent to which the AI can observe the user's actions, so that the user
doesn't inadvertently give the AI some knowledge that would allow it to
hack the system. 

> The user is interacting with the whole World, not just a PasteUpMorph 
> in the World, so it makes sense for your AI's input/output to be 
> through the World (note: capitalization of "World" refers to the global
> variable with that name)

Yes, and as I just stated, there is also a requirement that the AI's
world be a subset of the full world. 

> The World already has a world state ticking, so you don't have to
> worry about the hassles involved with hooking up your World with
> mouse/keyboard input, etc.  Just use the one that is already there.

Hmm... I _WILL_ be implementing a virtual keyboard and mouse for the AI,
so I will need to worry about hooking the AI's interfaces up to the world
anyway; anything I need to do with the outside user's hardware
shouldn't create much of a complexity overhead at all...
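The virtual keyboard and mouse could be sketched as a HandMorph subclass
that draws events from a queue the AI fills, instead of polling real
hardware. This is only a sketch under assumptions: AIHandMorph,
pendingEvents, and queueEvent: are hypothetical names I've made up here;
HandMorph and handleEvent: are the real Morphic class and selector.

  HandMorph subclass: #AIHandMorph
      instanceVariableNames: 'pendingEvents'
      classVariableNames: ''
      poolDictionaries: ''
      category: 'AI-Playpen'

  AIHandMorph >> initialize
      super initialize.
      pendingEvents := OrderedCollection new.

  AIHandMorph >> queueEvent: aMorphicEvent
      "The AI calls this instead of the user moving real hardware."
      pendingEvents addLast: aMorphicEvent.

  AIHandMorph >> processEvents
      "Replace hardware polling with the AI's synthetic event queue."
      [pendingEvents isEmpty] whileFalse:
          [self handleEvent: pendingEvents removeFirst]

Since this hand lives only in the AI's world, the user's real hand and
hardware remain untouched.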

> Here's how I might approach the implementation:

> You could make a subclass of PasteUpMorph, AIPasteUpMorph, that could
> be used for the World of newly created Projects.  An item in the World
> menu could tell the AI to become active or inactive.  When the
> AIPasteUpMorph initializes itself, it does everything necessary to
> begin to sense input from the user, munch on it, and react to it.

The AI looks at whatever is being displayed in the AI-world window
exactly as a user would, by looking at pixels and connecting dots... 
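That pixel-level sensing could hang off the suggested subclass roughly
like this. Again a hedged sketch: AIPasteUpMorph, aiActive, and
observePixels: are hypothetical names, not existing Squeak API;
PasteUpMorph, step, Display, and Form>>copy: are real.

  PasteUpMorph subclass: #AIPasteUpMorph
      instanceVariableNames: 'aiActive'
      classVariableNames: ''
      poolDictionaries: ''
      category: 'AI-Playpen'

  AIPasteUpMorph >> initialize
      super initialize.
      aiActive := false.   "toggled from the World menu"

  AIPasteUpMorph >> toggleAI
      aiActive := aiActive not.

  AIPasteUpMorph >> step
      "Each cycle, hand the AI a Form holding the pixels of this
      world's screen rectangle -- exactly what a user would see."
      super step.
      aiActive ifTrue:
          [self observePixels: (Display copy: self boundsInWorld)]

The AI then works from that Form's bits alone, which keeps its view
confined to its own playpen window.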

> You might also have to subclass WorldState, and perhaps HandMorph.
> Take a look at the method WorldState>>doOneCycleNowFor:.  The line 'h
> processEvents' is where your AI should tap into the user input.  The
> AI's actions might occur between 'aWorld runStepMethods' and 'self
> displayWorldSafely: aWorld'

The AI doesn't process user inputs; the AI world processes the AI's
"inputs" -- though input and output become rather circular in this
context...
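Given that, the tap points named above might look like the following
override. A sketch only: AIWorldState, aiHand, injectAIEvents:, and
aiReactTo: are names I've invented for illustration; the surrounding
skeleton (handsDo:, processEvents, runStepMethods,
displayWorldSafely:) follows the structure of the real
WorldState>>doOneCycleNowFor:.

  AIWorldState >> doOneCycleNowFor: aWorld
      "Simplified cycle with hypothetical AI hooks."
      self handsDo: [:h |
          h == self aiHand
              ifTrue: [self injectAIEvents: h].  "synthetic keyboard/mouse"
          h processEvents].
      aWorld runStepMethods.
      self aiReactTo: aWorld.   "AI acts between stepping and display"
      self displayWorldSafely: aWorld

Here only the AI's own virtual hand receives injected events, so the
AI world ticks on the AI's "inputs" without ever touching the user's.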

-- 
Latency is your enemy.
Bandwidth is your friend.
http://users.rcn.com/alangrimes/


