Character brain

CharacterBrain is the component responsible for handling all character actions, regardless of whether they come from a human player or from an AI.

If you are familiar with Unity's "new" input system, you probably know what an action is. An action is simply a link between an input signal (e.g. pressing the jump button) and an output result at the gameplay level (e.g. jumping).

All the available actions are predefined in a structure and then updated by the CharacterBrain component at runtime. This approach creates a level of abstraction between the inputs (GetKey, GetButton, etc.) and the character actions themselves (jump, move forward, etc.).
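For instance, a gameplay script can read the actions struct from the brain and stay agnostic about where the values came from. Below is a minimal sketch (not part of CCP): the CharacterActions property on the brain and the movement/jump fields are assumed to match the defaults described on this page.

using UnityEngine;
using Lightbug.CharacterControllerPro.Implementation;

// Minimal sketch: this script only reads actions, so it behaves the same
// whether a Human brain (input devices) or an AI brain filled them in.
// The CharacterActions property and the movement/jump fields are assumptions.
public class ActionsReaderExample : MonoBehaviour
{
	[SerializeField] CharacterBrain brain;

	void Update()
	{
		Vector2 moveInput = brain.CharacterActions.movement.value;
		bool jumpHeld = brain.CharacterActions.jump.value;

		Debug.Log($"move = {moveInput} , jump held = {jumpHeld}");
	}
}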

Actions

Types

CCP's Implementation supports three types of actions:

Action type    | Description
BoolAction     | A toggle; the action is either pressed or not pressed.
FloatAction    | A 1D value from -1 to 1 (similar to Unity's GetAxis).
Vector2Action  | A 2D value, essentially a combination of two axes.

Note that the BoolAction value is true or false (a bool). If you need to know whether the action was "started" or "canceled" (e.g. was the jump button just pressed?), read the Started or Canceled property, respectively.
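For example, here is a quick sketch of the difference between value, Started and Canceled for the jump action. The member names follow the note above; the CharacterActions property on the brain is an assumption.

using UnityEngine;
using Lightbug.CharacterControllerPro.Implementation;

// Sketch only: shows edge detection (Started/Canceled) versus the held value.
public class JumpEdgeDetectionExample : MonoBehaviour
{
	[SerializeField] CharacterBrain brain;

	void Update()
	{
		BoolAction jump = brain.CharacterActions.jump;

		if (jump.Started)        // true only on the frame the action begins
			Debug.Log("jump pressed this frame");
		else if (jump.Canceled)  // true only on the frame the action ends
			Debug.Log("jump released this frame");
		else if (jump.value)     // true for every frame the action is held
			Debug.Log("jump held");
	}
}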

Character actions struct

These actions are predefined and grouped together inside a struct. Each public field represents a particular action.

public struct CharacterActions 
{
	// Bool actions
	public BoolAction @jump;
	public BoolAction @run;
	public BoolAction @interact;
	public BoolAction @jetPack;
	public BoolAction @dash;
	public BoolAction @crouch;
    
	// ...
}

You can visualize the action values at runtime in the inspector (for both the Human and AI brains).

CCP includes a ScriptableObject asset whose only job is to modify this CharacterActions struct. The actions included by default were generated using the Default Character Actions asset.

Keep in mind that adding or removing actions might affect the scripts from the Demo content, since those scripts use the default CCP actions.

Brain types

Human brain

If the brain is set to Human, the actions are updated based on input devices (keyboard, mouse, joystick, UI, etc.). The component responsible for this is called InputHandler, and it must be implemented specifically for each input system you want to support.

By default, the brain lets you choose between the two input handlers that come with the asset, or a custom one.

Human input type     | Description
Unity Input Manager  | Reads input data directly from Unity's Input Manager (the old input system).
UI_Mobile            | Looks for all UI-based input components in the scene (InputAxes and InputButton). These components convert UI events into input values.
Custom               | A custom InputHandler implementation (see the sketch below).
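As a rough sketch of the Custom option, a handler typically derives from InputHandler and answers queries by action name. The method names and signatures below (GetBool, GetFloat, GetVector2) are assumptions based on the bundled handlers; see the "Use a custom Input Handler" page for the authoritative version.

using UnityEngine;
using Lightbug.CharacterControllerPro.Implementation;

// Illustrative custom InputHandler that maps CCP action names to Unity's old
// Input Manager. Base-class method signatures are assumed; verify against
// your CCP version.
public class MyCustomInputHandler : InputHandler
{
	public override bool GetBool(string actionName)
	{
		switch (actionName)
		{
			case "Jump": return Input.GetKey(KeyCode.Space);
			case "Run":  return Input.GetKey(KeyCode.LeftShift);
			default:     return false;
		}
	}

	public override float GetFloat(string actionName)
	{
		// No float-based actions handled in this example.
		return 0f;
	}

	public override Vector2 GetVector2(string actionName)
	{
		if (actionName == "Movement")
			return new Vector2(Input.GetAxis("Horizontal"), Input.GetAxis("Vertical"));

		return Vector2.zero;
	}
}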

AI brain

In this mode, the brain defines the actions through code, using an AIBehaviour.
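As a rough illustration (not CCP's exact API), an AI behaviour fills the same CharacterActions struct from code instead of from input devices. The base class name, the UpdateBehaviour override and the characterActions field below are assumptions; see the "Create your own AI movement logic" page for the real signature.

using UnityEngine;
using Lightbug.CharacterControllerPro.Implementation;

// Illustrative AI behaviour: walks forward and triggers the jump action every
// few seconds by writing directly into the actions struct. Base class and
// member names are assumed, not taken from the CCP source.
public class WalkAndJumpBehaviour : AIBehaviour
{
	float jumpTimer = 0f;

	public override void UpdateBehaviour(float dt)
	{
		// Always move forward.
		characterActions.movement.value = new Vector2(0f, 1f);

		// Trigger the jump action every three seconds. How and when the value
		// is reset back to false depends on how the brain consumes the struct.
		jumpTimer += dt;
		if (jumpTimer >= 3f)
		{
			characterActions.jump.value = true;
			jumpTimer = 0f;
		}
	}
}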

[Image: Representation of the Human, the AI and the brain.]
[Image: Brain modes in the inspector.]
[Image: Available human input types.]
[Image: Example - AI behaviour using a sequence behaviour (Demo content).]