November 2005
Game/AI: Great Expectation-Formations
"Forming expectations is a problem that is relatively easy to concretize: AIs have an internal knowledge model of the world. If the AI is able to provide an estimate for what that state is going to look like n ticks in the future, that’s an expectation – and naturally we’re going to want the AI to react not (or not just) to current world-state, but also to its expectation. React to its own expectation. I think that’s a neat idea, and architecturally it makes a lot of sense. Assuming we have a reactive behavior system of some sort already, we don’t, as a first pass, need to modify IT at all. All we need to do is provide future instead of current data. Great!"