AI senses and a status update
by Daniel Buckmaster · 12/27/2012 (5:28 am) · 1 comment
It feels like it's been too long since my last blog post. I always promised a closer look at how I was planning to implement AI senses, and although I haven't made any progress on the AI work for a while, I think it's still worth going over. I really do hope you guys enjoy me just talking about my code; unfortunately, this post has even fewer screenshots than usual! I've also got a bit of news about why I won't be making much progress on anything for the next couple of months, but I'll save that 'til the end. So, without further ado:
Sensors
Yes, that is sensors, not senses. I decided early on that when I was working on senses for AI characters, I wanted something very modular, rather than programming lots of logic into AIPlayer or writing bespoke scripts for each class of AI character. In the end, I decided upon a separate Sensor object that could be attached to any object that needed to sense something. This way, I can reuse the same class for things like sentry cameras, landmines, etc. Though for a landmine it could be slight overkill.

Currently, a Sensor gets attached to every AI character I spawn. I manage this through the Actor script class, and the onAdd function looks something like this:
function Actor::onAdd(%this, %obj)
{
   if(%obj.getClassName() $= "AIPlayer")
   {
      if(%this.sensorData !$= "")
      {
         %obj.senses = new Sensor() {
            datablock = %this.sensorData;
            object = %obj;
            callbackObject = %this;
         };
      }
   }
}

By setting the `object` property of the Sensor, the Sensor is directed to follow the AIPlayer around - in effect, a form of mounting. I really should just use the actual mounting system, I guess, but it isn't really to my liking, and this object is used for more than mounting anyway. The `callbackObject` member is similar but different (most informative sentence ever). This specifies the object that should receive the Sensor's callbacks - i.e., function calls when a new object comes into view, etc. You might think that this should be the same as the `object`, but in this case, I send all callbacks to the datablock instead. This is just handier, as it allows different datablocks to react to sensor events in different ways.
The SensorData block
Of course, the SensorData datablock is where things get interesting, so I thought what I'd do is just show you one to demonstrate how a sensor type is constructed. The code below can be found in slightly modified form on GitHub.

datablock SensorData(BasicSensor)
{
   engagementRange = 50;
   typemask = $TypeMasks::ShapeBaseObjectType;
   // Discount everyone outside our FOV
   rules[0] = "Angle 110";
   // Only see up to 50 metres
   rules[1] = "Distance 50";
   // Now do a sort of squished-cone check to degrade vision vertically
   rules[2] = "Angle2D 90 30";
   // If the object is outside the cone, its visibility is cut to a third at worst
   range[2] = "0.3 1.0";
   // Finally, do a raycast so we can't see through walls
   rules[3] = "Raycast 1" SPC $TypeMasks::StaticObjectType;
};

So what's happening here is that I'm specifying a bunch of rules for this sensor to use. When a sensor tries to detect a target, it invokes each of these rules, from 0 to n, which results in a score between 0 and 1 for each rule. All these scores are multiplied together to give the final visibility of the object. Of course, if any rule gives a 0, then the whole process can stop, since the answer will always be 0. This allows you to optimise the sensor by putting expensive rules (like raycasts) at the end.
You can see that I'm constructing the rules of each datablock using strings. This was about the most generic way I could think to provide a clean interface for doing this sort of stuff, since I didn't want to have to add a bunch of different members to the SensorData block that are specific to some rules and unused by all the rest. It's a good way to waste space. So instead, I devised a clever system for defining and declaring rules...
Rules
Basically, when a SensorData block is loaded, it converts each string into a rule object. Each rule type (Distance, Angle, etc.) is a distinct subclass of the SensorRule base class. I put a bit of work into making sure it was super-easy to define new rule types, using two macros similar to the engine API class definition macros we already use. Below is the definition of the Distance rule class from GitHub.

class DistanceRule : public SensorRule {
   /// Maximum distance at which an object scores above zero.
   F32 dist;

   DefineSensorRule(DistanceRule);

public:
   bool init(const String &data)
   {
      dist = dAtof(data.c_str());
      return dist > 0.0f;
   }

   F32 check(SceneObject *obj, const MatrixF &trans, SceneObject *other, const MatrixF &otrans) const
   {
      F32 distance = (otrans.getPosition() - trans.getPosition()).len();
      return 1.0f - mClampF(distance / dist, 0.0f, 1.0f);
   }
};
ImplementSensorRule(DistanceRule, Distance);

This is all the code - no need for separate entries in the header file, or other code elsewhere to make the SensorData block aware of the binding between "Distance" rules and this class. That's all handled by the Define/ImplementSensorRule macros. The `init` method simply takes the string that defines the rule and tries to parse sensible data out of it. In this example, when we wrote "Distance 50", the `init` method receives "50" and converts that to a distance. Other rules parse their strings differently.
The `check` method is responsible for generating the score of an object under this rule. It receives four parameters: the object doing the sensing (given by the `object` property of the Sensor object, above), a transform for that object (in case, for example, we want to sense from the eye transform instead of the render transform), an object to try to detect, and a transform for it. In our case, we get the distance between the two objects and return a score based on it. I've scaled the result so that if there's no distance between the objects, the function returns 1, and if the two objects are 50 metres apart, it returns 0. In between, the score is interpolated linearly. This means more distant things are less visible.
All the other rules I've implemented so far adhere to the same policy of being a sliding scale of visibility. Some rules, of course, can't provide a scalar value - for example, a single raycast can only return a binary 0 or 1.
Lazy sensors
The second thing I decided immediately when designing the Sensor class was that I wanted it to be as lazy as possible. There's no point in a Sensor sitting in the middle of nowhere, polling ContainerRadiusSearches every tick, if it's not going to find anything. I am a strong believer in event-driven systems, as I wrote about in my last AI blog, to the extent that I still haven't implemented any ticking in my BehaviorManager. Unfortunately, some ticking is unavoidable in this situation - and possibly even desirable. But I tried to limit it as much as possible.

Mostly, I did this by making every Sensor a Trigger. This is where the `engagementRange` member of SensorData comes in. The Sensor creates a large trigger box of that radius and waits for potentially sense-able entities to collide with it, calling its potentialEnterObject method. How neat is that? Effectively, when a Player moves and collides with a Sensor, the Player notifies the Sensor that he's coming, instead of each Sensor constantly having to check up on all the Players. If no Players are moving into any Sensors, then nobody needs to bother doing any checks.
Where the ticking comes in, of course, is with objects that enter the Sensor's vicinity and need to be kept track of. Hitting the trigger doesn't guarantee an object will be visible! In earlier iterations of the code, I had a complex system for determining how 'interesting' each contact was, based on distance and visibility, and more interesting contacts would get a full update more often. Currently, I just let the Sensor tick all its current contacts every time. This means more precise sensing - sudden changes in a contact's visibility, like stepping around a doorway, will be represented - but, of course, a performance hit.
Unanswered questions
Yes, there are still a few things to do before this will be a completely usable sensor system. Most important among them, IMO, is sound. I plan to go with a similar event-based sound system, using T3D's built-in message routing classes. Also high on the list is creating a scripting framework around these sensors for a task I believe is even more important than detection: identification. Once a character has seen something, it must determine what it has seen. Is that a friendly walking by at a distance, or an enemy to attack? An unarmed civilian?

Walkabout
Yes, I feel the need to bring this up because now that I'm actually the proprietor of commercial software, I have a responsibility to my customers! Which, for the moment, I feel I have been sadly neglecting. I guess I should be pleased that the official Walkabout help thread has no replies, but in my mind that suggests not that everyone is completely and totally satisfied, but that they're all suffering silently, too terrified to approach me for fear of violent reprisal.

Anyway, this uncomfortable dry spell of updates is, unfortunately, going to have to continue for a little while longer. After planning to spend this whole summer at home in Sydney (for the first time since I moved there!), an opportunity has come up for me to travel to the USA in February to help pitch a small startup I've been working with to investors. It's going to be pretty unreal! So January will basically be a hardcore sprint to meet our technology goals before we show it off to VCs, and it will consume my entire life.
So, it's been nice knowing you. See you on the other side. Oh, and Walkabout still lives here, in case you were wondering and didn't want to ask in case I pulled your throat off.
About the author
Studying mechatronic engineering and computer science at the University of Sydney. Game development is probably my most time-consuming hobby!

Associate Steve Acaster
[YorkshireRifles.com]
Agreed, devouring processor resources for no reason is a bad, bad thing. Too many people seem to be consumed with high-end functions rather than practicality - the same goes with graphics/models.
No complaints on Walkabout, it seems to work nicely. :)