Last time I told you about the sight branch; this time I will explain the hearing branch. This branch lets the stalker hear the player and react to player-generated sounds: the stalker moves to the spot the sound came from, and the AlertLevel is increased by one every time a sound is heard. I also decided that hearing alone would never push the AlertLevel above 2 (the stalker only reaches level 3 when it actually sees the player).
This branch uses two Blackboard Decorators. The first checks that PlayerSeen is “Not Set”, with “Observer aborts” set to “Self”; the second checks that PlayerHeard is “Set”. The idea is that this branch should run when the stalker hears the player, but only while the player is not seen. The reason is simple: if the enemy has already seen the player and knows where the player is, hearing the player no longer matters, since the enemy already has visual contact. Because the PlayerSeen Decorator is set to abort “Self”, this branch is interrupted as soon as the stalker sees the player.
The first node of the branch also has a Service that increases the alert level. However, it is increased only once per sound stimulus. For example, say the enemy has an AlertLevel of 0 and hears the player for the first time: the AlertLevel rises to 1 and stays there until the player is heard a second time, at which point it goes up to 2 (the highest value hearing can produce).
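The Service's logic can be sketched in plain C++ outside the engine. This is only an illustration of the rules described above, not the actual Blueprint: the names `AlertState`, `OnSoundStimulus`, and the stimulus-id parameter are my own stand-ins.

```cpp
#include <algorithm>

// Illustrative sketch of the hearing Service's rules (not actual Unreal code).
// The alert level rises by one per new sound stimulus, hearing alone can
// never push it past 2, and level 3 is reserved for visual contact.
struct AlertState {
    int alertLevel = 0;       // 0 = relaxed, 3 = player seen
    int lastStimulusId = -1;  // id of the last sound already counted

    // Called by the (hypothetical) service each time a sound stimulus arrives.
    void OnSoundStimulus(int stimulusId) {
        if (stimulusId == lastStimulusId)
            return;  // same stimulus: only increase the level once
        lastStimulusId = stimulusId;
        alertLevel = std::min(alertLevel + 1, 2);  // hearing caps at 2
    }

    // Only actually seeing the player reaches the maximum level.
    void OnPlayerSeen() { alertLevel = 3; }
};
```

With this, two distinct sounds take the level from 0 to 2, a third sound leaves it at 2, and only `OnPlayerSeen` moves it to 3.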
Notice that I change AlertLevel from both the PlayerSeen and PlayerHeard branches. This variable mostly decides what the enemy does during the Idle branch: will the enemy be relaxed, or will it be patrolling and looking for the player? That is defined by this variable.
First, I perform a Cast To AIController to get the Player Heard Location, and then I use Get Random Point In Navigable Radius to pick a random point within 500 units of it. Then I perform a distance check between the Heard Location and the selected point to make sure they are at least 50 units apart. I do this because I want to keep a “void” around the Heard Location, producing a search area that resembles a donut around it. If the distance between the points is less than 50 units, another point is selected; otherwise the task ends and the picked point is stored as the Search Location.
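The retry-until-far-enough loop can be sketched in plain C++ as 2D rejection sampling. This is a standalone illustration, not engine code: `PickSearchLocation`, the `Point` struct, and the uniform sampling stand in for Get Random Point In Navigable Radius, which additionally snaps the point to the navmesh.

```cpp
#include <cmath>
#include <random>

// Illustrative 2D sketch of the search-point task (not actual Unreal code).
struct Point { double x, y; };

double Distance(const Point& a, const Point& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// Keep picking random points near the heard location until one lies at
// least `inner` units away (and within `outer`), producing a donut-shaped
// search area. The accepted point becomes the Search Location.
Point PickSearchLocation(const Point& heard, double inner, double outer,
                         std::mt19937& rng) {
    std::uniform_real_distribution<double> offset(-outer, outer);
    for (;;) {
        Point p{heard.x + offset(rng), heard.y + offset(rng)};
        double d = Distance(heard, p);
        if (d >= inner && d <= outer)  // inside the donut: accept it
            return p;
    }
}
```

Called with `inner = 50` and `outer = 500`, every returned point respects the “void” around the Heard Location described above.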
This task is only performed once, since PlayerHeard resets to false after a few seconds (how many depends on the “Max Age” setting of that sense in your AIPerception configuration).
To make this work, though, I changed the AIPerception functions a little. If you go back to the On Target Perception Updated event of my AIPerception (covered in the first part of the series), you will see there is a macro that checks whether the sensed actor is the player. From the False output of the macro I connect two nodes: one that stores the sensed actor as the object, and another that calls a custom event named HeardOtherObject.
The HeardOtherObject event is similar to the player-heard one: it sets the ObjectIsHeard boolean, the ObjectHeardLocation vector variable, and the LastStimulusType variable. If you remember the previous article, these variables are read by the Stalker_AIController and set on the Blackboard.
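The branching described in the last two paragraphs can be summarized in a small standalone C++ sketch. None of this is engine code; `StalkerPerception`, `OnHearingStimulus`, and the member names are my own stand-ins for the Blueprint variables mentioned above.

```cpp
#include <string>

// Illustrative sketch of the On Target Perception Updated branching
// (not actual Unreal code).
struct Vec3 { double x = 0, y = 0, z = 0; };

struct StalkerPerception {
    bool playerHeard = false;
    bool objectIsHeard = false;
    Vec3 objectHeardLocation;
    std::string lastStimulusType;

    // Mirrors the macro: the player goes down one path, while any other
    // sensed actor triggers the HeardOtherObject-style path instead.
    void OnHearingStimulus(bool sensedActorIsPlayer, const Vec3& location) {
        if (sensedActorIsPlayer)
            playerHeard = true;  // handled by the PlayerHeard branch
        else
            HeardOtherObject(location);
    }

    void HeardOtherObject(const Vec3& location) {
        objectIsHeard = true;
        objectHeardLocation = location;
        lastStimulusType = "Hearing";  // later copied to the Blackboard
    }
};
```

The point of the split is that noises made by objects and noises made by the player update different flags, so the Behavior Tree can react to each differently.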
Lastly, to make an object emit a noise, I use a simple Blueprint built around the Report Noise Event node.
This ends this part of the article. Next, I will explain the Idle branch. In the meantime, feel free to experiment with the Behavior Tree to see what else you can do with sound-based gameplay.
Get Unreal Engine: https://www.unrealengine.com/