Based on the terminology used in the image (specifically
"Perceived Actors," "Report Noise Event," and "Stimuli
Listener"), this graphic represents the Unreal Engine AI Perception
System.
While the image lists key Blueprint nodes and events
used to manage AI senses, it is missing the configuration, context, and data
structures required to actually make the system work.
Here is the specific data and context missing from this view:
1. The "Senses" Configuration (Inputs)
The image shows functions to handle perception, but it does
not show the Config settings that define how the AI perceives the
world. Missing data includes:
- Sight Config: Sight Radius, Lose Sight Radius, and Peripheral Vision Angle.
- Hearing Config: Hearing Range and detection of specific "Tags" (e.g., footsteps vs. gunshots).
- Affiliation: Settings that determine if the AI detects Enemies, Neutrals, or Friendlies.
- Max Age: How long the AI "remembers" a stimulus after it is gone.
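To make these values concrete, here is a standalone C++ sketch of how the sight values interact. This is not engine code: the struct and function names only mirror the spirit of UAISenseConfig_Sight, and the distance/angle inputs stand in for what actor transforms would provide in a real project.

```cpp
#include <cassert>
#include <cmath>

// Illustrative stand-ins for the sight-config values described above.
// Names echo UAISenseConfig_Sight, but this is NOT Unreal Engine code.
struct SightConfig {
    float SightRadius        = 1500.0f; // acquire targets inside this range
    float LoseSightRadius    = 2000.0f; // keep an already-seen target until this range
    float PeripheralAngleDeg = 70.0f;   // half-angle of the view cone
};

// Minimal view-cone test: the target must be inside the relevant radius
// and inside the cone. Already-seen targets use the larger radius.
bool CanSee(const SightConfig& cfg, float distance, float angleDeg, bool alreadySeen) {
    const float radius = alreadySeen ? cfg.LoseSightRadius : cfg.SightRadius;
    return distance <= radius && std::fabs(angleDeg) <= cfg.PeripheralAngleDeg;
}
```

Note how the two radii create hysteresis: a target at 1800 units cannot be acquired, but one already being tracked at that distance is not yet lost.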
2. The FAIStimulus Structure Details (Outputs)
The node "On Target Perception Updated" provides a
data struct called Stimulus, which is invisible here. That struct contains the
critical details the AI needs to make decisions:
- Was Successfully Sensed: A boolean that tells you if the AI just saw the target (true) or just lost sight of it (false).
- Stimulus Location: Where the noise or visual contact occurred.
- Strength: How strong the signal was.
- Receiver/Sensed Class: Which specific sense triggered the update (Sight, Hearing, Damage, Touch).
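The found/lost branch these fields drive can be modeled in a few lines of standalone C++. This is an illustrative mirror of the FAIStimulus fields discussed above, not the engine struct, and the state names returned are placeholders:

```cpp
#include <string>

// Illustrative mirror of the FAIStimulus fields (NOT the engine struct).
struct Stimulus {
    bool  WasSuccessfullySensed = false;
    float Location[3]           = {0.0f, 0.0f, 0.0f}; // world-space contact point
    float Strength              = 0.0f;
};

// The branch every OnTargetPerceptionUpdated handler needs:
//   true  -> the AI just acquired the target
//   false -> the AI just lost it (start searching at Location)
std::string HandlePerceptionUpdated(const Stimulus& s) {
    return s.WasSuccessfullySensed ? "enter_chase" : "search_last_known_location";
}
```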
3. The Counterpart Component (The Source)
The AI Perception System acts as a Receiver. The image
is missing the data regarding the Transmitter.
- AIPerceptionStimuliSource: For "Get Perceived Actors" to return anything, the actors (players or NPCs) must have a Stimuli Source component attached to register them for specific senses (e.g., "I can be seen" or "I can be heard").
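The receiver/transmitter pairing can be sketched as a registration check: an actor that is physically in view but never registered a stimuli source is simply filtered out. The following standalone C++ is a hypothetical model, not the engine's dispatch logic:

```cpp
#include <algorithm>
#include <set>
#include <string>
#include <vector>

// Hypothetical world state: which actors registered a stimuli source,
// and which actors are currently inside the AI's view cone.
struct World {
    std::set<std::string>    registeredSources;
    std::vector<std::string> actorsInCone;
};

// Only registered actors are ever "perceived", no matter how visible they are.
std::vector<std::string> GetPerceivedActors(const World& w) {
    std::vector<std::string> out;
    for (const auto& actor : w.actorsInCone)
        if (w.registeredSources.count(actor) > 0)
            out.push_back(actor);
    return out;
}
```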
4. Logic & Execution Flow
This is a static menu/list of available functions. It is
missing the Blueprint Logic (execution wires) that defines behavior,
such as:
- If a Noise Event is reported → rotate the AI to face the location.
- If Perception Updated fires with Successfully Sensed → enter the Chase state.
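The "rotate to face the noise" step is just a yaw computation from the AI's position to the stimulus location. Here is that math as standalone C++ (in a real project the positions would come from actor transforms, and UE helpers would handle the rotation):

```cpp
#include <cmath>

// Yaw (degrees) an AI at (ax, ay) must face to look at a noise at (nx, ny),
// in a top-down 2D plane. Standalone math only, not an engine call.
float YawToFace(float ax, float ay, float nx, float ny) {
    return std::atan2(ny - ay, nx - ax) * 180.0f / 3.14159265f;
}
```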
To summarize, the image you provided displays the Event
Graph API (the triggers), but it is missing the Architecture (the
data flow) required to make those triggers function.
To build a working Perception System in Unreal Engine 5, you
must view it as a cycle of Transmission, Filtering, and Reception.
Here is the final summary of the complete system,
incorporating the missing data:
1. The Emitter (Missing "Stimuli Source")
The functions in your image (like Get Perceived Actors) will
return nothing unless actors in the world are broadcasting their presence.
- What is missing: The AIPerceptionStimuliSource component.
- Role: This component must be added to the Player or NPCs. It tells the engine, "I am an object that can be seen," or "I am an object that can be heard." Without it, the AI is "blind" regardless of its settings.
2. The Filter (Missing "Sense Config")
The Perception System header in your image implies a generic
listener, but the system needs specific rules to filter data.
- What is missing: The UAISenseConfig settings inside the AI Controller.
- Role: You must explicitly configure:
  - Sight: Radius (how far), Peripheral Angle (how wide), and Detection by Affiliation (Enemy vs. Neutral).
  - Hearing: Range and Tag detection.
  - Damage/Touch: Sensitivity settings.
3. The Data Packet (Missing "FAIStimulus")
The node On Target Perception Updated is shown, but the
critical data structure it produces is hidden.
- What is missing: The FAIStimulus struct.
- Role: When the event fires, it provides a struct containing the vital logic switches:
  - SuccessfullySensed (Boolean): The most critical missing variable. It determines whether the event fired because the AI found the player (True) or lost the player (False).
  - StimulusLocation (Vector): The exact world-space coordinate of the sound or visual sighting.
  - Strength (Float): How "loud" or "visible" the stimulus was.
4. The Memory (Missing "Blackboard")
The image shows immediate events (Report Noise Event), but
not where that data goes.
- What is missing: The Blackboard and Behavior Tree integration.
- Role: Perception events are fleeting (milliseconds). The data must be stored immediately in a Blackboard Key (e.g., TargetLocation or HasLineOfSight) so the Behavior Tree can act on it over time (e.g., "Move to TargetLocation").
Summary Checklist
If you are implementing this system, you need to add these
three things to the nodes shown in your image:
- Stimuli Source component on the Player.
- Sense Config (Sight/Hearing) on the AI Controller.
- Break Struct logic on the On Target Perception Updated node to check the SuccessfullySensed boolean.
Would you like me to generate the Blueprint logic for that SuccessfullySensed
check to distinguish between finding and losing a target?
This "Log Summary" acts as a great set of Golden
Rules for the AI Perception System, but like the previous image, it focuses
on concepts rather than the critical configuration values needed
to make those concepts work.
Here is the specific data missing from this summary that you
would need to actually implement the system:
1. The "Who" (Detection by Affiliation)
The summary states visual detection is automatic, but it
misses the most common reason detection fails: Affiliation Filters.
- Missing Data: The Detection by Affiliation settings (Detect Enemies, Detect Neutrals, Detect Friendlies).
- Why it matters: Even with a perfect View Cone, the AI will ignore valid targets if their "Team" doesn't match the checked boxes in the Sense Config. By default in Blueprints, everything is usually considered "Neutral," so you must explicitly check "Detect Neutrals" for the system to fire.
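This filter behaves like three checkboxes gating three teams. A standalone sketch (not engine code; the defaults below just reproduce the pitfall described above, where Neutral targets are silently ignored):

```cpp
// Sketch of "Detection by Affiliation": even a target inside the view cone
// is dropped unless its team matches a checked box. Illustrative only.
enum class Affiliation { Enemy, Neutral, Friendly };

struct AffiliationFilter {
    bool detectEnemies    = true;
    bool detectNeutrals   = false; // the common pitfall: Blueprint actors default to Neutral
    bool detectFriendlies = false;

    bool Passes(Affiliation a) const {
        switch (a) {
            case Affiliation::Enemy:    return detectEnemies;
            case Affiliation::Neutral:  return detectNeutrals;
            case Affiliation::Friendly: return detectFriendlies;
        }
        return false;
    }
};
```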
2. The "When" (Max Age & Forgetting)
The summary mentions "losing a target," but it
doesn't explain how the system decides a target is lost.
- Missing Data: The Max Age variable.
- Why it matters: This float value determines how long the AI "remembers" a stimulus after it stops sensing it (e.g., after you walk behind a wall).
  - If Max Age is 0, the AI forgets you the instant line of sight is broken (causing jittery behavior).
  - If Max Age is infinite, the AI will chase you forever, even if you teleport across the map.
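The forgetting behavior is just a timer compared against Max Age. Here is a standalone model of that timer (an approximation of the concept, not the engine's stimulus-aging internals):

```cpp
// Model of Max Age: after line of sight breaks, the stimulus stays
// "remembered" until its age exceeds MaxAge. Standalone sketch only.
struct StimulusMemory {
    float MaxAge       = 4.0f;  // seconds of "object permanence"
    float ageSinceLost = 0.0f;

    void LoseSight()          { ageSinceLost = 0.0f; }  // start the forget timer
    void Tick(float dt)       { ageSinceLost += dt; }   // advance per frame
    bool StillRemembers() const { return ageSinceLost < MaxAge; }
};
```

With MaxAge at 0, `StillRemembers()` is false on the very frame sight is lost, which is exactly the jittery behavior described above.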
3. The "Where" (Blackboard Integration)
The summary focuses on the event of sensing, but
misses the storage of that data.
- Missing Data: Blackboard Keys (Vector and Object variables).
- Why it matters: Perception System updates are transient (they happen once). To make an AI act on this information (like chasing a player), the data from Stimulus Location must be saved into a Blackboard variable so the Behavior Tree can reference it later.
4. The "Specifics" of Hearing (Tags & Range)
The summary correctly notes that hearing requires Report
Noise Event, but it misses the filtering mechanism.
- Missing Data: Noise Tags and Range limitations.
- Why it matters:
  - Range: Report Noise Event has a "Loudness" and a "Max Range." If the listener is outside that range, the event is discarded.
  - Tags: You can tag noises (e.g., "Footstep" vs. "Gunshot"). If the AI's Hearing Config doesn't have a matching tag, it effectively "tunes out" that specific sound.
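Both filters can be sketched as one predicate. This is a standalone approximation of the rules just described, not UE's exact dispatch logic; in particular, the "empty tag set accepts everything" behavior is an assumption for this sketch:

```cpp
#include <set>
#include <string>

// Hypothetical hearing config: a listening range plus a tag whitelist.
struct HearingConfig {
    float hearingRange = 3000.0f;
    std::set<std::string> tags;  // empty = accept any tag (assumption of this sketch)
};

// A noise is heard only if it survives all three filters described above:
// the event's own Max Range, the listener's hearing range, and the tag match.
bool HearsNoise(const HearingConfig& cfg, float distance, float noiseMaxRange,
                const std::string& tag) {
    if (noiseMaxRange > 0.0f && distance > noiseMaxRange) return false; // event discarded
    if (distance > cfg.hearingRange) return false;                      // out of earshot
    return cfg.tags.empty() || cfg.tags.count(tag) > 0;                 // tag filter
}
```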
5. The Debug Tool
Finally, the summary implies you should trust the logic, but
misses the tool to verify it.
- Missing Data: The AI Perception debugger shortcut (the apostrophe ' key while playing).
- Why it matters: It renders the visual debug lines (green for seeing, red for losing) that confirm whether your View Cone settings and Stimuli Source are actually aligning.
Would you like me to explain how to properly
configure "Detection by Affiliation" so your AI can distinguish
between different actors?
To provide the complete picture for your Unreal Engine 5
project, here is the final summary. This merges the Golden Rules from
your "Log Summary" image with the Technical Configuration data
that was missing but is required to make those rules function.
1. The Architecture (Setup)
- The Rule (From Image): The AIPerception component goes on the AI Controller, and the Stimuli Source component goes on the Actor (Player/NPC).
- The Missing Config: You must ensure the AIPerceptionStimuliSource component on your target actor has the "Auto Register as Source" box checked. Without this, the actor is technically "invisible" to the AI, even when standing right in front of it.
2. The Logic Switch (Memory & Forgetting)
- The Rule (From Image): Always check the WasSuccessfullySensed boolean to distinguish between "Found" and "Lost."
- The Missing Config:
  - Max Age: The "Lost" event (where WasSuccessfullySensed becomes False) is controlled by the Max Age float in the Sense Config. If set to 0, the AI forgets the target instantly upon breaking line of sight. Set it to 3-5 seconds to give the AI "object permanence."
  - Blackboard Storage: The boolean switch is just a momentary trigger. You must immediately store the result (Target Location or Actor Object) in a Blackboard Key so your Behavior Tree can access it later.
3. The Sense Filters (Sight & Hearing)
- The Rule (From Image): Visual detection relies on View Cone settings; hearing relies on Report Noise Event.
- The Missing Config:
  - Affiliation (Sight): The View Cone will ignore valid targets if "Detection by Affiliation" is not set correctly. In Blueprints, most actors are considered "Neutral," so you must check the "Detect Neutrals" box in the Sight Config, or the AI will be blind.
  - Range & Tags (Hearing): The Report Noise Event node has a Max Range input. If the sound source is farther away than this value, the AI will never receive the event. You can also use Tags (names like "Footstep") to filter specific sounds.
4. The Execution (Debugging)
- The Rule (From Image): The Senses Config array allows mixing multiple senses on one agent.
- The Missing Config: To verify these are working together, use the AI Debugger (press ' [apostrophe] while playing).
  - Green Sphere: The AI currently sees the target.
  - Pink Sphere: The AI has lost the target but still remembers it (within the Max Age window).
  - Yellow Sphere: The location of a heard noise.
Summary Table for Implementation

| Feature | The Concept (Image) | The Required Data (Missing) |
| --- | --- | --- |
| Seeing | View Cone settings | Detection by Affiliation (check "Neutral") |
| Hearing | Report Noise Event | Max Range & Tag matching |
| Memory | Check SuccessfullySensed | Max Age (time before forgetting) |
| Storage | N/A (implied) | Blackboard Keys (Vector/Object) |
Next Step: Would you like a quick
checklist for setting up the Blackboard Keys properly so your Behavior
Tree can actually use this perception data?