Technical Document

fAutonomy – AIBrain’s Unity AI Plugin

Introduction

About the Plugin

fAutonomy (read as: Photonomy), AIBrain’s Unity AI Plugin, allows you to implement
Deep Neural Network (DNN) controlled intelligent AI agents within your Unity game.

fAutonomy is an AI Plugin for the Unity Editor.

These ‘agents’ would typically be Non-Player Characters (NPCs), but they can also be used for many other purposes – wherever you need intelligent responses to a game player’s actions. For example, in addition to NPC behaviour, quest structures could also be controlled by fAutonomy to select which quests to offer the player based on their previous interactions.

The Unity plugin has 3 main functions:

  1. It allows you to design AI Behaviours you require for your game.
  2. It allows you to connect your Unity GameObjects within a Unity scene with the various AI Behaviours you design by extending your Unity scene to an AI Scene.
  3. It communicates with the fAutonomy AI servers in order to train Deep Neural Networks implementing the AI Behaviours you have designed.

Welcome to the new world of Deep Learning based game AI!

Getting Started

To get started you will first need to import the fAutonomy Plugin into your Unity project.

To do this, double-click the .unitypackage file and import all the contents of the package into your Unity project.

Import Unity Package

Also, please attach the Assets/fAutonomy/FA.cs script as a component to every GameObject in your Unity scene that you may want to use with fAutonomy.

Once you have done this you are ready to begin the AI Scene Setup, as detailed in the next steps of this documentation.

Key Concepts
Before we begin, there are several key concepts of the plugin that you will need to understand to use the tool effectively.

AIEntityTypes
AIEntityTypes are the definitions of the types of objects that the AI Model will use to internally represent your game world. The definition of a type specifies both the properties of the type and how each type relates to the other types (i.e. the type hierarchy).

There are two very important pre-defined AIEntityTypes: AIEntityTypeAgent and AIEntityTypeNonAgent. Agents are objects within your game that will have intelligent behaviour and non-agents are non-intelligent objects that the agents need to be aware of in order to perform their decision making processes.

As previously stated, AIEntityTypes are part of the AI Model and therefore do not exist directly in C# class form.

AIEntity

An AIEntity is an abstract C# class that your game objects will need to implement in order to provide data to the AI Model.

Typically, for each AIEntityType you define as part of the AI Model you will need to create a C# class derived from AIEntity or AIEntityAgent. These C# classes will be added as components to your game-objects (as part of the GameObject Mapping step) to provide the link between the AI Model and your game objects.

AIEntity is the class you should implement for non-agent game-objects and AIEntityAgent is the class you should implement for agent game-objects.

The next sections go through creating and setting up these types and classes.

AI Scene Setup

AIEntityType Hierarchy Setup

AIEntityType Hierarchy

This screen is where you specify the AIEntityTypes you wish to use in your AI.

Navigation

To navigate the canvas you can use the following mouse controls:

  • Left-click and hold to pan around the canvas.
  • Scroll the mouse-wheel to zoom in and out.
  • Right-click on the canvas or existing nodes to create new types.

Add Entity Type

To add a new AIEntityType you can either click the + button on the top right of each existing node or right-click on the canvas and select ‘Add Entity Type’ from the context menu.

Add AIEntityType

Enter a name for your new type. When creating new types, the plugin will automatically prefix the name with ‘AIEntityType’, so you don’t need to enter that manually.

Select the Parent Type of this type. Remember that if you want a type that controls an intelligent agent, the parent should be AIEntityTypeAgent, but if the type just specifies data about a non-intelligent object, the parent should be AIEntityTypeNonAgent.

You can set a color for this type as a visual quick reference, used mainly on the GameObject Mapping screen, to show which game object is using which type.

Next, you can see the properties that are inherited from the parent types. These cannot be edited.

Finally, you can add/edit/delete the data properties of this type. Click the ‘+ Add New Property’ button to add a property. Each property has a name and a type. Properties can be of type Boolean, Numeric or one of the previously defined AIEntityTypes.

Properties can either be functional or non-functional. Functional properties are single value variables that – as the name suggests – can only have ONE value at a time. A non-functional property is one that can have multiple values at once. E.g. if you need a property such as ‘carrying’ and your NPC can be carrying more than one item at once then that should be a non-functional property.

When deleting a property, the plugin will warn you if that property has been used anywhere within the scene or behaviour setup. If you decide to delete a property that is in use then all references to that property will also be deleted.

Add Entity Type Property

Edit Entity Type

To edit an AI Entity Type, click the pencil icon shown on the type’s top bar. The popup shown has the same functionality as the Add Entity Type popup.

Delete Entity Type

In order to delete an AIEntityType, click the trash-can icon on the type’s top bar. The plugin will not allow you to delete a type that is still in use. You should remove all references to the type in the scene setup and all behaviours before deleting an AIEntityType.

Enum Types

As well as being able to specify data types, you can also specify symbolic types that are very similar to Enums within C# and other programming languages. For example, say you want to represent the current weather in the environment. You would create an Enum Type called ‘Weather’ and then define values such as ‘Sunny’, ‘Rainy’ etc.

To create a new Enum type, right-click on the canvas and select the ‘New Enum Type’ menu option.

On the popup you can specify a name for the Enum (e.g. Weather), its color and the values of the type (e.g. ‘Sunny’).

GameObject Mapping

Having created the AIEntityTypes you want to use in your AI Model you next need to assign these types to the Unity Game Objects within your current scene.

GameObject Mapping

C# AIEntity Scripts

Before you get started mapping game-objects to AIEntityTypes you should first create some C# scripts that implement either AIEntity or AIEntityAgent.

AIEntity C# Abstract Class

Derive from this class for each of the AIEntityTypes you defined in the previous step that are non-agents.

You will need to implement the ‘GetPropertyValue’ method in your class, but for now this method can simply return null until you are ready to implement it more fully. When the AI is fully modelled and trained, this method will be called by the plugin to gather the current values for each of the AIEntityType’s properties.

Example AIEntity Derived Class

For example, if you have doors within your scene that agents will interact with, and these doors have a Boolean property such as ‘OpenedState’, then the plugin will call GetPropertyValue to ask for the current value of ‘OpenedState’ on each door object.
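
A minimal sketch of what such a class might look like is shown below. It is only an illustration: it assumes GetPropertyValue takes the property name as a string and returns the value as an object, and the class name DoorEntity and its openedState field are hypothetical, not part of the plugin.

using UnityEngine;

// Hypothetical example only: a door that reports its 'OpenedState' property.
// Assumes AIEntity declares GetPropertyValue(string) returning object.
public class DoorEntity : AIEntity
{
    // Maintained by your own game logic (e.g. toggled by an open/close animation).
    public bool openedState;

    public override object GetPropertyValue(string propertyName)
    {
        if (propertyName == "OpenedState")
        {
            return openedState;
        }

        // Returning null is acceptable until the remaining properties are implemented.
        return null;
    }
}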

AIEntityAgent C# Abstract Class

Derive from this class for each of the AIEntityTypes you defined to be intelligent agents.

As well as the ‘GetPropertyValue’ method you will need to implement the following methods:

  • ReceiveActionFromAI
  • ReceiveCommandFromAI

These methods are called by the plugin to tell the agent which action it should currently be performing.

Example AIEntityAgent Derived Class
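
As a rough illustration, a derived agent class might look something like the following. This is only a sketch: the signatures shown for GetPropertyValue, ReceiveActionFromAI and ReceiveCommandFromAI are assumptions, so check the AIEntityAgent abstract class shipped with the plugin for the exact ones; the class name GuardAgent and its members are hypothetical.

using UnityEngine;

// Hypothetical example only: a guard agent controlled by the AI.
// The method signatures below are assumptions; consult the AIEntityAgent
// abstract class for the real signatures.
public class GuardAgent : AIEntityAgent
{
    public int health = 100;

    public override object GetPropertyValue(string propertyName)
    {
        if (propertyName == "Health")
        {
            return health;
        }
        return null;
    }

    public override void ReceiveActionFromAI(string actionName, object[] parameters)
    {
        // Start performing the requested action, e.g. a 'MoveTo' C# action
        // defined on the Action Mapping screen.
        Debug.Log("AI requested action: " + actionName);
    }

    public override void ReceiveCommandFromAI(string command)
    {
        Debug.Log("AI command: " + command);
    }
}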

Mapping a Game Object

Down the left-hand side of the screen there is a list of the game objects within your scene.

Click on any of these objects to ‘map’ it to an AIEntityType.

Add Game Object

If you haven’t yet created any C# scripts derived from AIEntity or AIEntityAgent then you will be presented with the following popup.

Add Game Object if no scripts declared

On the Add Game Object popup you specify the AIEntityType that will model this game object. You also specify the C# script that will provide data to and receive actions from the AI Model.

Click the Add button to map this game object to the AIEntityType.

The color of the top bar of each mapped game object is the color of the AIEntityType, so you can quickly see which objects are mapped to each type.

Once added you can position the game objects on the canvas in a way that works for you. Just left-click and drag each mapped game object.

Deleting a Mapping

You can delete a mapping by clicking the trash-can icon on the top of each mapped game object.

A warning will appear if the mapped game object has already been used within any of the behaviours. If you decide to delete the mapped game object, then that will also delete all references to the game object within all behaviours in the Unity project.

Editing a Mapping

If a game object mapping is not currently used in any behaviour then the AIEntityType can be changed using the drop-down list. Once a game object is used within a behaviour then its AIEntityType can no longer be changed. If you need to alter the AIEntityType then you should first remove all references to the game object within all behaviours or delete the game object mapping (which also deletes all references) and then re-add it with the correct type.

You can change which C# script is attached to this game object at any time using the drop-down list.

Attaching Sensors

AIEntityTypes that are children of AIEntityTypeAgent can sense their environment, in order to inform their decision making.

Therefore, any game object that is mapped to an AIEntityType that is a child of AIEntityTypeAgent can have sensors attached to it. To add a sensor, just click the ‘Add Sensor’ button and select the sensor to add. Each sensor is added as a Unity Component to the game object.

You can create your own sensors by deriving from the AIBrainSensor C# class.

Relinking a Game Object Mapping

If, for whatever reason, a game object is removed from your Unity scene it will still appear on the GameObject Mapping screen. You can choose to either ‘re-link’ the game object or delete the mapping entirely. To re-link a game object, just select which game object to link it to and you’re done.

Environment Properties

Environment properties provide data to the AI Model about the state of the game environment, such as time-of-day, weather etc.

Environment Properties

Add Environment Property

To add an environment property, just click the ‘+ Add Property’ button. On the resulting popup you just need to specify a name and type for the property. The type can be Boolean, Numeric or one of the defined AIEntityTypes or Enum Types.

Add Environment Property

Edit Environment Property

To edit an environment property, click the pencil icon button next to the property. Once an environment property is used within a behaviour you can no longer change its type. However, you can edit its name at any time.

Delete Environment Property

To delete an environment property, click the trash-can icon button next to the property. If the property is used within any of the behaviours then a warning will appear telling you where the property is currently referenced. If you decide to continue with the deletion, all of these references will also be deleted.

Environment C# Script

In order for the game to update the Environment Properties, you need to derive a class from the AIEntityEnvironment class and then manually attach the derived C# script to a relevant GameObject within your scene.

The only method you need to implement is the GetPropertyValue(string propertyName) method.

The plugin will call this method for each Environment Property you have defined in the Plugin AI Scene Setup.

Example Script Derived from AIEntityEnvironment
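
As a rough sketch, assuming you have defined Environment Properties named ‘TimeOfDay’ (Numeric) and ‘Weather’ (an Enum Type), such a class might look like the following. The class name GameEnvironment and its fields are hypothetical, and the object return type of GetPropertyValue is an assumption.

using UnityEngine;

// Hypothetical example only: reports the 'TimeOfDay' and 'Weather'
// Environment Properties defined in the AI Scene Setup.
public class GameEnvironment : AIEntityEnvironment
{
    public float timeOfDay = 12.0f;   // updated by your own day/night cycle
    public string weather = "Sunny";  // one of the 'Weather' Enum Type values

    public override object GetPropertyValue(string propertyName)
    {
        switch (propertyName)
        {
            case "TimeOfDay":
                return timeOfDay;
            case "Weather":
                return weather;
            default:
                return null;
        }
    }
}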

Behaviours in Scene

Once you have completed the AI Scene Setup you can move onto creating behaviours for your agents.

You are able to define multiple different behaviours within a Unity project; however, each agent within a project can only have one behaviour associated with it.

Behaviours in Scene

Navigation

To navigate the canvas you can use the following mouse controls:

  • Left-click and hold to pan around the canvas.
  • Scroll the mouse-wheel to zoom in and out.

Add Behaviour

Click the ‘+ New AI Behaviour’ button to begin adding the behaviour. On the ‘Add New Behaviour’ popup you are able to specify the name of the behaviour and the Agent game objects that this behaviour will control. Just enter a name and check the game objects you wish to be controlled by this behaviour.

Configure Behaviour

Once a behaviour is created you can configure that behaviour with all of the actions, perceptions and neural network training data that makes up each behaviour. Click the ‘Configure’ button in order to start this process.

Delete Behaviour

Click the ‘Delete’ button next to the behaviour in order to delete it.

Duplicate Behaviour

To duplicate an existing behaviour click the ‘Duplicate’ button. On the ‘Duplicate Behaviour’ popup you can specify the name of the new behaviour and also which agent game objects it will control.

View/Edit Agents

You can change which agents will be controlled by each behaviour at any time by clicking the ‘View/Edit Agents’ button. This will allow you to change the agents controlled by the behaviour. If an agent is already controlled by another behaviour you will be shown a warning asking you to confirm that you want to change the controlling behaviour.

Starting Training

Once a behaviour is fully configured and compiled, the ‘Start Training’ button will become available. Clicking this button will send the behaviour configuration to the server and once trained, the DNN data will be delivered back to your Unity project.

AI Modelling

Types & Objects

This screen is where you can decide which game objects, out of all of the ones you mapped on the GameObject Mapping screen, you wish to have in this behaviour. Limiting your behaviour to only be aware of the game objects that it absolutely requires helps to reduce the time it takes to train the behaviour and the CPU demands of the behaviour at runtime.

Types & Objects

Navigation

To navigate the canvas you can use the following mouse controls:

  • Left-click and hold to pan around the canvas.
  • Scroll the mouse-wheel to zoom in and out.

Including / Excluding Game Objects

Simply check the game objects you wish to include in this behaviour. By default, all game objects are included.

Properties

This screen has a dual purpose. Firstly, it allows you to define behaviour-specific properties; secondly, it allows you to decide which properties from the scene setup you wish to include in this behaviour. As with the Types & Objects screen, limiting the behaviour to as few properties as possible decreases training time and the CPU demands of the behaviour at runtime.

Properties

Adding Behaviour Properties

Click the ‘+ Add New’ button next to the ‘Behaviour Specific Properties’ text. As when adding properties elsewhere in the plugin, you specify a name for the property and its type.

However, behaviour properties can have zero or one parameter. These parameters can only be of AIEntityType type, which makes them useful for representing relationships between AIEntityTypes.

Deleting Behaviour Properties

You can delete any behaviour property by clicking the trash-can icon button next to it. If the behaviour currently references this property a warning will appear showing where the property is being used. You can decide to continue to delete the property, however all references to it in the behaviour will also be deleted.

Including / Excluding Properties

You can include or exclude AIEntityType and Environment properties using the check-boxes shown next to each property. If you attempt to exclude a property that is already in use within this behaviour then a warning will be displayed, and if you choose to continue to exclude the property then all references to it will also be removed.

Action Definitions

Defining actions is an incredibly important aspect of creating the behaviour as it tells the AI the range of actions you require this behaviour to take, the conditions under which to perform the action and what happens when the action is performed.

Action Definitions

Each action has the following elements:

  • Action Name – the name of the action
  • Parameters – the parameters of the action. E.g. a move-to action would take a location as a parameter.
  • Precondition – defines the conditions under which this action is to be performed.
  • Effect – defines the effect on the AI Model when this action is performed.

Creating an Action

Click the ‘+ Add Action’ button to add a new action. Simply enter the name of the action you wish to define and click ‘Done’.

Editing an Action

Once an action is created you can change its name by clicking the pencil icon button on the top bar of the action.

Adding Action Parameters

Click the ‘+ Add Parameter’ button to add a new parameter. On the resulting popup enter a name of the parameter and specify the type of the parameter. Action parameters can only be of AIEntityType type; they cannot be Boolean or Numeric.

Editing / Deleting Action Parameters

To edit a parameter, click the pencil icon button. On the resulting popup you can edit the name and/or the type.

To delete a parameter, click the trash-can icon next to it.

Add/Edit Precondition

Action Preconditions are composed of one or more Boolean expressions, i.e. each of these expressions evaluates to either true or false.

Expression Entry

In order for the precondition to be met ALL of the expressions have to evaluate to true. (A Boolean AND operation).

To add a new expression, just click the ‘+ Add’ button on the Precondition node. This will bring up the ‘Add Action Precondition’ popup that allows you to enter your expression.

A Boolean expression can take two forms. The typical case is where you are comparing one value against another; this is denoted by a left-hand operand, a comparison operator and a right-hand operand. The other form is where the left-hand side itself evaluates to a Boolean value, in which case the comparison operator and right-hand operand are not required.

The expression entry popup therefore allows you to specify the left-hand operand, the comparison operator and the right-hand operand. However, where the left-hand operand evaluates to a Boolean, you don’t need to enter an operator or right-hand operand.

Additionally, you can choose whether you want the negated or non-negated version of the expression – by checking the ‘Negated’ checkbox at the top of the popup.

Specify the left-hand operand by selecting from the dropdown lists the properties you wish to evaluate as part of this expression. Next specify the comparison operator – numeric values have the full range of numeric operators whereas AIEntityType values can only be compared for equality. Finally, specify the right-hand operand using the dropdown lists or, if a numeric value, by entering an immediate value. The properties available to select from for the right-hand operand are filtered to match the type of property selected for the left-hand operand.

If the property is of AIEntityType you can ‘drill-down’ into its properties by clicking the ‘+’ button that appears next to it.

Adding Effects

When an action is performed this will have some form of effect on the properties of the AI Model. For example, a move-to action would change the location of the object to now be at the new location. The Effects section of an action is where these action-effects are defined.

Action Effects can take two forms. The first is a regular effect that always gets applied whenever the action is performed, no matter what the circumstances. The second is a form where the exact effect depends on other properties or action parameters. For example, our move-to action could have particular effects depending on where we have moved to; our health or mood could be affected by the destination. This second form of effect can be added as a Conditional Effect.

To add a Regular Effect, click the ‘Add Effect’ button on the ‘Action Effects Root’ node. Once added you can specify which properties are updated and to which values in a very similar way to how the precondition expressions are built. The difference this time is that the operator now describes how the property is updated. For numeric properties they can be assigned a value, incremented by a value or decremented by a value.

To add a Conditional Effect, click the ‘Add Condition’ button on the ‘Action Effects Root’ node. This adds two additional nodes – one containing the condition and a second containing the effect. The expression for the condition is added in exactly the same way as an Action Precondition. On the attached effect node, you specify which properties are updated and to which values in exactly the same way as for a Regular Effect.

Initial Beliefs and Goals

This screen is where you can set the initial values of the properties for this behaviour. These are said to be the initial beliefs of the agents controlled by this behaviour.

Initial Beliefs

Adding a New Belief

Initially this list of beliefs is empty. To add a belief, click the ‘+ Add New Belief’ button. Note, there are separate buttons for adding beliefs about behaviour properties, AIEntityType properties and Environment Properties.

Clicking the ‘+ Add New Belief’ button will allow you to select the property that you want to give a value to and also specify its value.

Deleting a Belief

To delete a belief just click the trash-can icon button shown next to the belief.

Adding / Editing Goals of Agents

Each behaviour seeks to achieve a certain goal. For example, that could be to maximize the health or wealth of the agent.

Specifying a goal is exactly the same as adding an Action Precondition. Once again all of the individual expressions are Boolean AND’ed together to create the overall goal.

Percepts

Percept Processing involves taking data delivered from the game and using that to update the beliefs of the AI Model, i.e. they involve the AI Model ‘perceiving’ the game world. These percepts could be directly transformed into beliefs, or more interestingly can be ‘filtered’ by the current mood or other state of the agent.

There are 3 types of percept rule:

  • Initial Percepts – these are the percepts delivered from the game when the scene is initialized.
  • Regular Percepts – these are sent at regular intervals from the game to the AI.
  • Action Outcomes – these are percepts about what actually happened when an action was performed. For example, if the AI sent an action to open a door, the action outcome could be whether the door opened or not.

A percept rule has the following elements:

  • Rule Details – name of the rule and what it processes.
  • Rule Precondition – the precondition that has to be satisfied for the percept rule to be executed.
  • Rule Effects – the effects on the AI Model’s beliefs if the percept rule is executed.

Percepts

Add a Percept Rule

On the left-hand panel under the ‘Percept Processing’ heading you first need to select the type of percept rule you wish to add by using the dropdown list. Select from ‘Initial Percepts’, ‘Regular Percepts’ and ‘Action Outcomes’.

Next, click on the ‘+ Add Rule’ button. This results in the ‘Add Processing Rule’ popup appearing.

Add Percept Processing Rule

Here you need to enter a name for the percept rule, the source of the perception (Environment, Self Sensor or an external Sensor) and, if applicable, the AIEntityTypes this rule will be allowed to process.

Once created, the rule will appear on the canvas as a series of nodes showing the various elements of the rule.

Editing Percept Details

Once a percept rule is created you can edit its details. To do this click the pencil icon button on the top bar of the Rule Details node. This allows you to edit the name of the rule, the source of the percept and, if applicable, the AIEntityTypes this rule processes.

Additionally, percept rules are executed in order from top to bottom and therefore the order that the rules are defined is important. To promote or demote a rule, use the up and down arrows to move it up/down in the list of rules.

Deleting a Percept Rule

Click the trash-can icon button to delete a percept rule.

Add/Edit Precondition

In exactly the same way as you edit Action Preconditions you can specify a series of expressions that all have to be true for the percept rule to execute its effects.

Add/Edit Effects

Again, largely the same as specifying Action Effects, you can specify how the AI Model’s beliefs are updated when this percept rule is executed.

The difference is that in percept rules, conditional effects have both ‘expression evaluated to true’ and ‘expression evaluated to false’ branches so that you can specify effects that happen depending on how the expression evaluates.

Entering the expressions is exactly the same as for Action Effects.

Variables

If you are constantly referring to the same properties of a particular game object it is sometimes convenient to declare a variable and then use the variable in your precondition and effect expressions.

To declare a variable, click the ‘+ Add Variable’ button and select the property you want the variable to represent.

The name of the variable is automatically set, of the form x1, x2, etc.

Once declared, variables can be referenced in precondition and effect expressions. You can also declare variables that are properties of other pre-existing variables.

Action Mapping

Once you have Actions defined, you can specify how these actions are delivered to the game’s C# AIEntity class methods.

Action Mapping

The left-hand panel shows a list of all actions that have been defined for this behaviour. Clicking on one of these actions shows how it is mapped to a C# action. Initially, each action has no mapping to a C# action.

Add Action Mapping

To add a new mapping, click the ‘+ Add C# Action’ button at the bottom of the Action node. Doing this allows you to enter the name of the C# action. This is important as this is the action name that is sent to your C# code.

Each action can have multiple C# actions. These C# actions are delivered to your code sequentially from top to bottom.

Add C# Action Parameter

As well as the name of the action you can specify the data that is delivered to your C# code. This is done by specifying a number of parameters to the C# action.

To add a new parameter, click the ‘+ Add Parameter’ button. This allows you to select a property to send to the C# code as part of this action.

Change Execution of C# Actions

By clicking the up and down arrows on each C# action you can change the order in which the C# actions are sent to your game.

Variables

If you are constantly referring to the same properties of a particular game object it is sometimes convenient to declare a variable and then use the variable as a parameter to the C# action.

To declare a variable, click the ‘+ Add Variable’ button and select the property you want the variable to represent.

The name of the variable is automatically set, of the form x1, x2, etc.

Once declared, variables can be referenced in C# action parameters. You can also declare variables that are properties of other pre-existing variables.

AI Training

AI Modelling Completeness

Before you can set up the training configuration, you must have completed the AI Modelling stage.

The criteria for this are that the AI Model must have at least one Action and a valid Goal defined.

Training Data Generator

Deep Neural Networks need training and they require data in order to perform that training. In the fAutonomy Plugin this training data is generated from looping through valid values of properties and their parameters.

This screen is for you to decide which properties will be used to generate the training data.

Training Data Generator

This screen shows you all of the properties in the current behaviour. You need to check the properties that you want to generate training data from. For those properties that have parameters you can generate training data from looping through all valid values of those parameters.

Train on Property Values

To generate training data by looping through all valid values of that property, check the ‘Train on Values’ checkbox.

If the property is a numeric property, you need to indicate the valid range of numeric values for that property. This is specified by a Minimum Value, Maximum Value and a Step (or Interval) value.

For example:

Minimum = 0, Maximum = 100, Step = 10

Generates training values of: 0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100
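
To illustrate the arithmetic only, the values above correspond to a simple loop from Minimum to Maximum in increments of Step. The class and method names below are hypothetical; the actual generation is performed by the fAutonomy plugin and servers, not by your code.

using UnityEngine;

// Illustration only: how Minimum = 0, Maximum = 100, Step = 10 expand into
// the training values listed above.
public static class TrainingValueSketch
{
    public static void LogValues(int minimum, int maximum, int step)
    {
        for (int value = minimum; value <= maximum; value += step)
        {
            Debug.Log(value); // 0, 10, 20, ..., 100 for the example above
        }
    }
}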

For all other types of properties (Boolean and AIEntityType) the plugin already knows the valid range of values of those types, so you don’t need to specify these.

Train on Parameter Values

To generate training data by looping through all valid values of a property’s parameters, check the ‘Train on Parameters’ checkbox.

Since parameters can only be AIEntityType the plugin already knows all of the valid values of parameters, so you don’t need to specify the range of values.

Set Number of Training Points to Generate

At the top-left of the screen there is a slider bar. This is to set the number of training data points to generate. The maximum number of training data points you can set is calculated based on how many properties you are generating training data from and the range of values these properties can take.

Auto-Exec Actions

Here you can set Actions that will be automatically executed by the AI.

To add an Auto-Exec Action, click the ‘+ Add Action’ button. On the resulting popup select the Action you wish to execute and specify any parameters that the action takes. Note that the parameters here have to be a pre-existing variable. Therefore, you should create variables prior to adding the action.

Variables

If you are constantly referring to the same properties of a particular game object it is sometimes convenient to declare a variable and then use the variable as a parameter to an Auto-Exec Action.

To declare a variable, click the ‘+ Add Variable’ button and select the property you want the variable to represent.

The name of the variable is automatically set, of the form x1, x2, etc.

Once declared, variables can be referenced in Auto-Exec Action parameters. You can also declare variables that are properties of other pre-existing variables.

Training Data Encoder

This screen is where you configure how the properties of your AI Model are encoded onto a rectangular image. This image forms the input to the Deep Neural Network.

Training Data Encoder

Configure Image

At the top of the screen you can set the size of the encoder image and how many segments the image is divided into. Each segment will be assigned a property that will be encoded into that segment.

Resolution X and Resolution Y specify the pixel dimensions of the image to encode to.

Columns and Rows specify how to divide the image into segments.

Navigation

To navigate the canvas you can use the following mouse controls:

  • Left-click and hold to pan around the canvas.
  • Scroll the mouse-wheel to zoom in and out.

Set Property into Image Segment

Right-click on an image segment and select the ‘Set Property’ menu option. On the resulting popup you can select either a property or set the segment to a particular immediate value.

Set Property

To set the segment to be an immediate value, select Immediate Color Value from the top dropdown list. Then enter a value between 0 and 255 in the text box.

To set a property, first select Property Value from the top dropdown list and then select the property and any parameters using the dropdown lists shown.

For numeric properties, you also have to specify how that numeric value is mapped into the 0 – 255 range of the pixel colour. This is done by specifying the min/max values of the property; this range will be mapped directly to the 0 – 255 range of the pixel.

Alternatively, a numeric value can be mapped to true/false, where true is mapped to 255 and false to 0. To do this, specify the min/max range of numeric values that correspond to true; all other values of the numeric property will be mapped to false.
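
As a sketch of the arithmetic involved (the encoding itself is performed by the plugin; the class and method names below are hypothetical):

using UnityEngine;

// Illustration only: the arithmetic behind encoding a numeric property into a
// 0-255 pixel value, in the two ways described above.
public static class PixelEncodingSketch
{
    // Linear mapping of [min, max] onto the 0-255 pixel range, clamped.
    public static int EncodeNumeric(float value, float min, float max)
    {
        float t = Mathf.Clamp01((value - min) / (max - min));
        return Mathf.RoundToInt(t * 255f);
    }

    // Values inside [trueMin, trueMax] map to true (255); all others to false (0).
    public static int EncodeAsBoolean(float value, float trueMin, float trueMax)
    {
        return (value >= trueMin && value <= trueMax) ? 255 : 0;
    }
}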

Clear Property from Image Segment

Right-click on the image segment and select ‘Clear Cell’.

Variables

If you are constantly referring to the same properties of a particular game object it is sometimes convenient to declare a variable and then use the variable in the properties you set into the image segments.

To declare a variable, click the ‘+ Add Variable’ button and select the property you want the variable to represent.

The name of the variable is automatically set, of the form x1, x2, etc.

Once declared, variables can be referenced in the properties you set into each Image Segment either as part of specifying the property or as parameters to properties. You can also declare variables that are properties of other pre-existing variables.

Training Settings

This settings screen allows you to control the training process and also the complexity/structure of the Deep Neural Network.

Training Settings

Solver Settings

You can select preset solver settings by selecting Low, Medium or High settings. For advanced users, you can also select the Custom Settings option to be able to manually specify each setting.

Training Settings

You can select preset training settings by selecting Low, Medium or High settings. For advanced users, you can also select the Custom Settings option to be able to manually specify each setting.

Note that Network Input X Res and Network Input Y Res are set on the Training Data Encoder screen. Additionally, Network Output Size is calculated by the Plugin based on the number of actions and all valid values of those actions’ parameters.

Summary Page

Once you have completed the setup of your AI Model and Training Data it is time to Compile the setup into the configuration files that formally specify the AI.

Summary

Compiling

Click the Compile button and check the ‘Current Compilation Status’ for warnings and any errors. Until you fix any errors you will not be able to send the AI Model to the server for training.

Starting Training

Once compilation has been successful, you can send the configuration files to the server so that it can train the Deep Neural Network.

Once training is complete the trained DNN will be delivered back to your Unity Project and you can then test your behaviour.

Settings

Preferences

The preferences screen is where you configure your user details, subscription details and the AI Server details.

Preferences

User Details

Here you will enter your username and password that you used when you registered on the fAutonomy Website.

The Subscription ID will have been given to you when your registration was successfully completed.

Once you have entered your user details you can click the ‘Verify Details’ button to check that they are valid.

Subscription Details

When you have valid user details, the status of your subscription will be shown here.

You can click the ‘Manage Subscription’ button to view your subscription in more detail on the fAutonomy Website.

AI Server Details

These server details are what will be used by the plugin to train the Deep Neural Network.

The server details are automatically populated with the default values.
