Monday, February 27, 2012

Weekly Update: 2/27


After speaking with Igor Karpov, we've decided to start working on adding sensors to our bot. We'll start with rays - vectors cast from the character in specific directions that collide with any nearby objects (environment or other players) and return data on where they collided. A picture's worth a thousand words (click for fullscreen):



Those red lines are the rays. If you peek at the NetBeans console in the bottom left, you can see the hitLocation data being retrieved from the rays. Our hope is to use data like that to determine our bot's next move, based on how a human would respond to the same data. To figure out how a human would respond, we are going to need to apply similar rays to a human player's character in combat, record the rays' data, and see how those readings compare to the human's decisions.
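
To make the ray idea a bit more concrete, here's a minimal Java sketch of what one of these sensors boils down to: cast a ray from the bot's position along a fixed direction and report where it first hits an obstacle. The Vec3, Sphere, and hitLocation names (and the sphere standing in for level geometry) are just illustrative - this isn't the actual engine or library API we're using.

```java
import java.util.Optional;

/** Minimal illustrative ray sensor: not the real engine API, just the idea. */
public class RaySensorSketch {

    /** A 3D point/vector with just the operations we need. */
    static class Vec3 {
        final double x, y, z;
        Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
        Vec3 sub(Vec3 o)    { return new Vec3(x - o.x, y - o.y, z - o.z); }
        Vec3 add(Vec3 o)    { return new Vec3(x + o.x, y + o.y, z + o.z); }
        Vec3 scale(double s){ return new Vec3(x * s, y * s, z * s); }
        double dot(Vec3 o)  { return x * o.x + y * o.y + z * o.z; }
        @Override public String toString() { return String.format("(%.1f, %.1f, %.1f)", x, y, z); }
    }

    /** A spherical obstacle standing in for level geometry or another player. */
    static class Sphere {
        final Vec3 center; final double radius;
        Sphere(Vec3 center, double radius) { this.center = center; this.radius = radius; }
    }

    /**
     * Cast a ray from 'origin' along the normalized direction 'dir' and return
     * the closest hit location on the sphere within 'maxRange' units, if any.
     */
    static Optional<Vec3> hitLocation(Vec3 origin, Vec3 dir, double maxRange, Sphere s) {
        // Solve |origin + t*dir - center|^2 = radius^2 for the smallest t >= 0.
        Vec3 oc = origin.sub(s.center);
        double b = 2.0 * oc.dot(dir);
        double c = oc.dot(oc) - s.radius * s.radius;
        double disc = b * b - 4.0 * c;           // dir is unit-length, so a == 1
        if (disc < 0) return Optional.empty();   // ray misses the sphere entirely
        double t = (-b - Math.sqrt(disc)) / 2.0;
        if (t < 0 || t > maxRange) return Optional.empty();
        return Optional.of(origin.add(dir.scale(t)));
    }

    public static void main(String[] args) {
        Vec3 botPosition = new Vec3(0, 0, 0);
        Vec3 forward = new Vec3(1, 0, 0);                    // one of several fixed ray directions
        Sphere wall = new Sphere(new Vec3(500, 0, 0), 50);   // obstacle 500 units ahead

        hitLocation(botPosition, forward, 1000, wall)
            .ifPresentOrElse(
                hit -> System.out.println("hitLocation = " + hit),
                ()  -> System.out.println("no hit within range"));
    }
}
```

In the game itself there would be a handful of these rays fanned out around the character (like the red lines in the screenshot), each reporting its own hit location every update.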

Applying rays to a human-controlled player character is our next task. We will probably take a roundabout approach: our current plan is to record a human's actions in a game and then have a bot recreate those actions later, with raycasting applied to the bot (there doesn't seem to be an easy way to attach raycasting directly to a human-controlled character).
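
For that record-and-replay plan, the sketch below shows the general shape of the data we're after: a per-tick log of the human's actions, paired with the ray readings the bot collects while re-executing those actions. The ActionRecord and SensorRecord classes and the numbers in them are placeholders, not actual project code.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Rough sketch of the record-then-replay idea: log the human's actions each
 * game tick, then have a bot repeat them while ray sensors collect hit data,
 * yielding (sensor reading, human action) pairs to learn from.
 */
public class ReplaySketch {

    /** One tick of a recorded human game: what the player did at that moment. */
    static class ActionRecord {
        final long tick; final double moveForward, strafe, turn; final boolean fire;
        ActionRecord(long tick, double moveForward, double strafe, double turn, boolean fire) {
            this.tick = tick; this.moveForward = moveForward; this.strafe = strafe;
            this.turn = turn; this.fire = fire;
        }
    }

    /** One tick of ray data gathered while the bot replays the recorded actions. */
    static class SensorRecord {
        final long tick; final double[] rayHitDistances;
        SensorRecord(long tick, double[] rayHitDistances) {
            this.tick = tick; this.rayHitDistances = rayHitDistances;
        }
    }

    public static void main(String[] args) {
        // In the real pipeline these would come from a recorded human game...
        List<ActionRecord> humanTrace = new ArrayList<>();
        humanTrace.add(new ActionRecord(0, 1.0, 0.0,  0.0, false));
        humanTrace.add(new ActionRecord(1, 1.0, 0.2,  5.0, false));
        humanTrace.add(new ActionRecord(2, 0.5, 0.0, 15.0, true));

        // ...and the sensor readings would come from the bot's rays while it
        // re-executes each recorded action in the same level (fake numbers here).
        for (ActionRecord action : humanTrace) {
            SensorRecord sensors = new SensorRecord(action.tick, new double[] {450, 980, 120});
            System.out.printf("tick %d: nearest obstacle %.0f units away, human fired=%b%n",
                    action.tick, nearest(sensors.rayHitDistances), action.fire);
        }
    }

    /** Smallest hit distance across all rays, i.e. the closest obstacle. */
    static double nearest(double[] distances) {
        double min = Double.MAX_VALUE;
        for (double d : distances) min = Math.min(min, d);
        return min;
    }
}
```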
