So close... to proto!
In the works: grid pathing for wandering and fine-grained seeking behavior from the AI. Almost done and tested.
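The grid pathing for wandering can be sketched as a breadth-first search toward a randomly chosen open cell. This is a minimal illustration, not the engine's actual code; the grid encoding (0 = open, 1 = wall), `bfs_path`, and `pick_wander_goal` are assumptions:

```python
import random
from collections import deque

def bfs_path(grid, start, goal):
    """Breadth-first search over a 4-connected grid; returns a list of
    cells from start to goal, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the predecessor chain back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

def pick_wander_goal(grid, start):
    """Pick any open cell other than the current one as a wander target."""
    open_cells = [(r, c) for r, row in enumerate(grid)
                  for c, v in enumerate(row) if v == 0 and (r, c) != start]
    return random.choice(open_cells)
```

On a small map, BFS is fine for wandering; a seeking behavior would typically swap in A* with a distance heuristic for longer paths.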
Player, left, is standing around while AI, right, is wandering.
AI, right, has become hungry and spotted the Player, run P run!
Player, left, is trying to hide but thanks to sensory memory the AI has an idea where he is...
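The sensory memory mentioned in that last caption can be as simple as remembering the player's last seen position for a short window, then letting it fade. A minimal sketch; the class name and the decay window are assumptions:

```python
class SensoryMemory:
    """Remember the player's last seen position for a limited time."""

    def __init__(self, memory_span=5.0):
        self.memory_span = memory_span  # seconds before the memory fades
        self.last_seen_pos = None
        self.last_seen_time = None

    def record_sighting(self, pos, now):
        """Called whenever the player is actually visible."""
        self.last_seen_pos = pos
        self.last_seen_time = now

    def best_guess(self, now):
        """Return the remembered position, or None once memory has faded."""
        if self.last_seen_pos is None:
            return None
        if now - self.last_seen_time > self.memory_span:
            self.last_seen_pos = None  # memory faded
            return None
        return self.last_seen_pos
```

When the player breaks LOS, the AI seeks `best_guess` instead of the live position, which gives exactly the "it has an idea where he is" behavior.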
As you can see, the prototype is in a very 'low level debug' state; this allows me to tweak and play around with concepts and test changes to the engine. The Lua integration is also paying off a lot for tweaking and flexibility.
My next objective is scripting more game mechanics, like random Zombie placement and game states. This will be handled by a Commander.
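A Commander like this could be sketched as a small state machine that also owns random placement. Everything here is an assumption for illustration: the state names, the transition table, and the spawn rule.

```python
import random

class Commander:
    """Drives high-level game states and random Zombie placement."""

    def __init__(self, grid, seed=None):
        self.grid = grid                  # 0 = open cell, 1 = wall
        self.state = "SETUP"
        self.rng = random.Random(seed)    # seedable for reproducible runs

    def place_zombies(self, count):
        """Pick distinct random open cells as spawn points."""
        open_cells = [(r, c) for r, row in enumerate(self.grid)
                      for c, v in enumerate(row) if v == 0]
        return self.rng.sample(open_cells, min(count, len(open_cells)))

    def advance(self):
        """Step through a fixed state sequence; extra calls are no-ops."""
        transitions = {"SETUP": "PLAYING", "PLAYING": "GAME_OVER"}
        self.state = transitions.get(self.state, self.state)
```

Keeping the Commander's logic behind a small interface like this is also what makes it easy to move into Lua later.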
Only forward...
2 comments:
Awesome, very cool. What type of sensory input do the AI agents have? How do they track the player?
At the moment it is just visual LOS with FOV, but smell and audio are very easy to add and just feed off of the visual data with fewer constraints... or more, depending on how complex you want to go.
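The LOS-with-FOV check described here can be sketched in two parts: an angular cone test, and a straight-line occlusion walk over the grid. Function names and the grid encoding are assumptions, not the engine's API:

```python
import math

def in_fov(agent_pos, facing_deg, target_pos, fov_deg=120.0):
    """Is the target inside the agent's field-of-view cone?"""
    dx = target_pos[0] - agent_pos[0]
    dy = target_pos[1] - agent_pos[1]
    angle_to_target = math.degrees(math.atan2(dy, dx))
    # Signed angular difference, wrapped into [-180, 180).
    diff = (angle_to_target - facing_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

def has_los(grid, a, b):
    """Sample cells along the line a->b; blocked if any wall intervenes."""
    (r0, c0), (r1, c1) = a, b
    steps = max(abs(r1 - r0), abs(c1 - c0))
    for i in range(1, steps):
        r = round(r0 + (r1 - r0) * i / steps)
        c = round(c0 + (c1 - c0) * i / steps)
        if grid[r][c] == 1:
            return False
    return True
```

A smell or audio sense would then reuse the same target query but skip `has_los` (sound carries around corners) and widen or drop the cone test.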
They could track by interception or direct-follow, but I want to give the Zombie feeling, so they do a direct-follow when going after food. :)
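The two tracking styles can be sketched like this: direct-follow heads for the target's current position (the Zombie feel), while interception leads the target with a simple first-order prediction. Function names are assumptions, and a real pursuit solve is more involved than this:

```python
import math

def direct_follow(pos, target_pos):
    """Unit direction toward where the target is right now."""
    dx, dy = target_pos[0] - pos[0], target_pos[1] - pos[1]
    dist = math.hypot(dx, dy) or 1.0  # avoid division by zero
    return (dx / dist, dy / dist)

def intercept(pos, target_pos, target_vel, speed):
    """Unit direction toward where the target will be, using a
    lead-time estimate based on the current distance."""
    dx, dy = target_pos[0] - pos[0], target_pos[1] - pos[1]
    lead_time = math.hypot(dx, dy) / speed
    future = (target_pos[0] + target_vel[0] * lead_time,
              target_pos[1] + target_vel[1] * lead_time)
    return direct_follow(pos, future)
```

Direct-follow always trails a moving target, which is precisely what makes a chasing Zombie feel dumb but relentless.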