14 November 2005 13:45
Game/AI: Great Expectation-Formations
"Forming expectations is a problem that is relatively easy to concretize: AIs have an internal knowledge model of the world. If the AI is able to provide an estimate for what that state is going to look like n ticks in the future, that’s an expectation – and naturally we’re going to want the AI to react not (or not just) to current world-state, but also to its expectation. React to its own expectation. I think that’s a neat idea, and architecturally it makes a lot of sense. Assuming we have a reactive behavior system of some sort already, we don’t, as a first pass, need to modify IT at all. All we need to do is provide future instead of current data. Great!"