Bala TD – tower defense

My tower defense game is available for android in the Google Play store at

Its web version is available here on my site at

Project source is on GitHub:


It was developed in Unity, an industry-leading game engine.

There are many tutorials, great documentation, and an active forum community to get you started with Unity, bring you up to speed on best practices, and lend a helping hand when you need one.

For a game as complex as this, a prototype is fundamental. I’ll walk you through how to create a basic tower defense prototype.

Let’s start with a new project and an empty scene (by default it contains only a camera and a directional light). Change the intensity of the light to 0.5.

Now create a terrain object (right click on empty space in the hierarchy, 3D Object / Terrain), and set some of its parameters:
Set a new material for it (eg. mobile/diffuse, nature/diffuse).
Now paint a simple map with a path in it:
on the Terrain component, click Paint Height, choose a brush, then choose one of these approaches:
– set the height to 5 and paint the hills with it
– set the height to 5, click Flatten, then paint the path with a height of 0
Adjust the camera so that it looks down on the terrain from above at an angle.
My terrain looks like this:
To set up the navmesh, open the Navigation tab in the editor, select Bake, adjust the parameters (I reduced the max slope and the step height), then click Bake:
You can see the navmesh in the Scene view when the navigation tab is open:

Next we’ll create a unit:
– create an empty GameObject called Unit
– add a Sphere as a child object with the Sphere Collider removed
– add a Nav Mesh Agent component
– add a script component called Unit

using UnityEngine;
using UnityEngine.AI; // NavMeshAgent lives here since Unity 5.5

public class Unit : MonoBehaviour
{
	protected Transform goal;
	protected NavMeshAgent agent;

	void Awake()
	{
		agent = GetComponent<NavMeshAgent>();
		goal = GameObject.FindGameObjectWithTag("Goal").transform;
	}

	// Use this for initialization
	void Start ()
	{
		agent.destination = goal.position;
	}
}

– drag your Unit GameObject to the Assets\Prefabs folder
– move your Unit GameObject to the start of the path on the terrain
– create an empty GameObject called Goal, move it to the end of the path, and add the tag “Goal” so that the Unit’s script finds it inside the Awake function (optionally, add a child object so that the goal is visible)

Now press the Play button; if you did everything correctly, the sphere will follow the path you’ve drawn on your terrain.

Next, we’ll create a tower:
– create an empty GameObject, add a cylinder child (right click on it, 3D Object / Cylinder)
– add a Sphere Collider component, set its radius to something bigger (the tower uses this sphere to detect nearby units), and tick its “Is Trigger” checkbox
– add a script component called Tower

using UnityEngine;
using System.Collections;
using System.Collections.Generic;

public class Tower : MonoBehaviour
{
	public GameObject bulletPrefab;
	public float attackRate = 1.0f;
	public Vector3 bulletSpawnPoint = new Vector3(0, 2.5f, 0);
	public int sampleCount = 10;
	public float bulletInitialSpeedAngle = 20.0f;
	public float bulletInitialSpeedMagnitude = 2.0f;

	protected IList<Unit> unitsClose;
	protected Unit target;
	protected float cooldownRemaining = 0.0f;
	protected IList<Unit> removables;

	void Awake()
	{
		unitsClose = new List<Unit>(20);
		removables = new List<Unit>();
	}

	// Update is called once per frame
	void Update ()
	{
		if (cooldownRemaining > 0.0f)
			cooldownRemaining -= Time.deltaTime;

		RemoveInvalidTargets();

		// shoot as soon as the cooldown allows and a unit is in range
		if (cooldownRemaining <= 0.0f && unitsClose.Count > 0)
			Fire();
	}

	void RemoveInvalidTargets()
	{
		for (int i = 0; i < unitsClose.Count; i++)
			if (unitsClose[i] == null || unitsClose[i].Dead)
				removables.Add(unitsClose[i]);

		for (int i = 0; i < removables.Count; i++)
			unitsClose.Remove(removables[i]);

		removables.Clear();
	}

	Unit FindTarget()
	{
		// keep shooting the current target while it stays in range
		if (target != null && unitsClose.Contains(target))
			return target;

		float minDistSq = -1.0f;
		Unit minUnit = null;

		// pick the closest of (at most) sampleCount randomly sampled units
		int count = unitsClose.Count;
		int loopCount = count < sampleCount ? count : sampleCount;
		for (int i = 0; i < loopCount; i++)
		{
			var index = (count < sampleCount) ? i : Random.Range(0, count);
			var unit = unitsClose[index];

			float distSq = (unit.transform.position - transform.position).sqrMagnitude;
			if (minUnit == null || distSq < minDistSq)
			{
				minUnit = unit;
				minDistSq = distSq;
			}
		}
		return minUnit;
	}

	void Fire()
	{
		target = FindTarget();
		if (target == null)
			return;

		var go = Instantiate(bulletPrefab, transform.position + bulletSpawnPoint, Quaternion.identity) as GameObject;
		Bullet b = go.GetComponent<Bullet>();
		b.init(target, bulletInitialSpeedAngle, bulletInitialSpeedMagnitude);

		cooldownRemaining = 1.0f / attackRate;
	}

	void OnTriggerEnter(Collider other)
	{
		var unit = other.GetComponent<Unit>();
		if (unit == null)
			return;

		unitsClose.Add(unit);
	}

	void OnTriggerExit(Collider other)
	{
		var unit = other.GetComponent<Unit>();
		if (unit == null)
			return;

		unitsClose.Remove(unit);
	}
}

– drag your tower GameObject to the Prefabs folder

We also need a bullet:
– create an empty GameObject, add a sphere child (right click on it, 3D Object / Sphere), remove the collider component from it
– add a script component called Bullet:

using UnityEngine;
using System.Collections;
using UnityEngine.AI;

public class Bullet : MonoBehaviour
{
	public float baseDamage = 1.0f;
	public float Speed = 20.0f;
	public float Acceleration = 100.0f;
	// for estimating the target's future position, Tau seconds ahead
	public float Tau = 0.5f;
	public float ReachingDistanceSquared = 1.0f;

	protected Unit target;
	protected Vector3 velocity;
	protected Vector3 lastTargetPosition;

	// Update is called once per frame
	void Update ()
	{
		Vector3 targetPos = target ? target.transform.position : lastTargetPosition;
		lastTargetPosition = targetPos;

		Vector3 toTarget = targetPos - transform.position;

		if (toTarget.sqrMagnitude < ReachingDistanceSquared)
		{
			ReachedTarget();
			return;
		}

		if (target && !target.Dead)
		{
			// aim at an estimated future position (Pursue)
			var targetAgent = target.GetComponent<NavMeshAgent>();
			toTarget += targetAgent.desiredVelocity * Tau;
		}

		Vector3 dir = toTarget.normalized;
		Vector3 force = dir * Speed - velocity;
		force = Vector3.ClampMagnitude(force, Acceleration);

		velocity += force * Time.deltaTime;
		transform.position += velocity * Time.deltaTime;
	}

	public void init(Unit target, float angle, float magn)
	{ = target;
		lastTargetPosition = target.transform.position;

		setInitialVelocity(angle, magn);
	}

	void setInitialVelocity(float angle, float magn)
	{
		Vector3 toTarget = target.transform.position - transform.position;
		toTarget.y = 0;

		Vector3 rotAxis = new Vector3(-toTarget.z, 0, toTarget.x);
		toTarget = Quaternion.AngleAxis(angle, rotAxis) * toTarget;
		velocity = toTarget * magn;
	}

	void ReachedTarget()
	{
		if (target && !target.Dead)
			target.TakeDamage(baseDamage);

		Destroy(gameObject);
	}
}

Checking whether the target is reached is done manually in code with the boolean expression (toTarget.sqrMagnitude < ReachingDistanceSquared).
You could use a trigger collider instead, but then, when another bullet kills your bullet’s target, your bullet would never be destroyed on reaching the target position; that’s why I used the simple boolean check above.
setInitialVelocity just launches the bullet at a given angle and magnitude, and it might not even be necessary. The important tweakable values are Speed, Acceleration, and Tau. The bullet’s movement is essentially a Pursue steering behavior (see my previous post on steering behaviors): we use the agent’s desired velocity and Tau to estimate a future position, which is then seeked by applying a force whose direction is the difference between the desired velocity (dir * Speed) and the current velocity. Other methods could be used for projectile movement, such as shooting off the bullet with just the initial velocity and gravity, without correcting its velocity while it moves; but I wanted a bullet that always finds its target.
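To make the math concrete, here is the same pursue step in plain, engine-free Java (Vec2 and PursueBullet are illustrative names of mine, not project code):

```java
// Minimal sketch of the bullet's pursue step described above.
class Vec2 {
	double x, y;
	Vec2(double x, double y) { this.x = x; this.y = y; }
	Vec2 add(Vec2 o)       { return new Vec2(x + o.x, y + o.y); }
	Vec2 sub(Vec2 o)       { return new Vec2(x - o.x, y - o.y); }
	Vec2 scale(double s)   { return new Vec2(x * s, y * s); }
	double length()        { return Math.sqrt(x * x + y * y); }
	Vec2 normalized()      { double l = length(); return l > 0 ? scale(1 / l) : this; }
	Vec2 clamped(double m) { double l = length(); return l > m ? scale(m / l) : this; }
}

class PursueBullet {
	Vec2 position, velocity;
	double speed = 20, acceleration = 100, tau = 0.5;

	// One integration step toward the target's estimated future position.
	void step(Vec2 targetPos, Vec2 targetVel, double dt) {
		// estimate where the target will be tau seconds from now
		Vec2 future = targetPos.add(targetVel.scale(tau));
		Vec2 toTarget = future.sub(position);
		// steering force = desired velocity - current velocity, clamped
		Vec2 desired = toTarget.normalized().scale(speed);
		Vec2 force = desired.sub(velocity).clamped(acceleration);
		velocity = velocity.add(force.scale(dt));
		position = position.add(velocity.scale(dt));
	}
}
```

Calling step once per frame reproduces the same behavior as the Unity Update above, just without the engine types.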

– drag your bullet GameObject to the Prefabs folder
– drag your bullet prefab from the Prefabs folder to the tower prefab’s “Bullet Prefab” variable in the editor

Unit has to be modified:
– add a Rigidbody component with “Is Kinematic” on (because its movement is not controlled by physics, but instead by the Nav Mesh Agent component)
– add a Sphere collider
– modify its script

using UnityEngine;
using System.Collections;
using UnityEngine.AI;

public class Unit : MonoBehaviour
{
	public float maxHealth = 4.0f;
	public float Health { get; protected set; }

	protected Transform goal;
	protected NavMeshAgent agent;

	public bool Dead { get; protected set; }

	void Awake()
	{
		Dead = false;
		Health = maxHealth;

		agent = GetComponent<NavMeshAgent>();
		goal = GameObject.FindGameObjectWithTag("Goal").transform;
	}

	// Use this for initialization
	void Start ()
	{
		agent.destination = goal.position;
	}

	public void TakeDamage(float damage)
	{
		Health -= damage;
		if (Health <= 0.0f)
		{
			Health = 0.0f;
			Killed();
		}
	}

	void Killed()
	{
		if (Dead)
			return;
		Dead = true;

		agent.obstacleAvoidanceType = ObstacleAvoidanceType.NoObstacleAvoidance;
		Destroy(gameObject); // a death effect could be played here instead
	}
}

– apply the changes to your prefab

Now you can create instances from your prefabs in your scene, and watch how the units try to go through the path, while the towers shoot them down.

I admit this prototype is a long way from a finished product, but it’s a proof of concept. It’s often useful to build prototypes like this and experiment with them early on to get a feel for your game: what’s good or bad about it, and how it can be improved.

For easier access, I put the prototype up on github here:

User Interface
At last, here’s some insight about the finished game’s user interface.
Undoubtedly, UI code can turn into a huge plate of spaghetti if we’re not careful. I admit my UI ended up still somewhat hardcoded and not very readable, but a few tricks helped mitigate the problem.

For starters, create a folder for your GUI scripts, and put every GUI script here so that they don’t get mixed with the other scripts.

Create a GuiHandler script:
-store references to the GameObjects that represent a UI panel/context (e.g. main menu, high scores, game-over screen, pause menu, HUD)
-a function that hides all these panels; call it whenever a context change requires it, then show the panels you want
-public functions that handle what should happen when a specific button is clicked (like a NewGame function for when the user clicks the “New Game” button)

On the UI Button:
-use UI Button’s On Click() event and call the public function you defined in GuiHandler
-set the navigation properties so that the arrow keys will select the desired UI element (I used the explicit mode)

On the Canvas Scaler (this component should be somewhere in your root GUI GameObject next to a Canvas component)
-set “UI Scale Mode” to “Constant Physical Size” so that when you build for Android, the size of your UI elements depends only on the phone/tablet’s physical size, not its resolution (of course this could make them too small for an Android TV, but a TV is a whole other story anyway)
-before building for HTML5 or a desktop platform (PC/Mac), temporarily change this mode to “Scale With Screen Size” and set the reference resolution so that your UI looks big enough

Design your GUI scripts so that if you removed them, the project would still compile (gameplay objects should know virtually nothing of the GUI). This principle saved me a ton of headaches. How, then, does a GameObject notify the GUI when something needs to change? The answer is the Observer pattern. I used a rather simplistic version of it for my game, but I found it sufficient:

public class MyObservable
{
	public delegate void MyVoidDelegate();
	public MyVoidDelegate notify;

	public void notifyObservers()
	{
		if (notify != null)
			notify();
	}
}

I used instances of this class in Gameplay classes (favor composition over inheritance), and whenever something changed the state of the object, I notified any GUI object that might be listening for changes.

public class SomeGameplayClass : MonoBehaviour
{
	public MyObservable observable { get; private set; }

	void Awake()
	{
		observable = new MyObservable();
	}

	void FunctionThatChangesState()
	{
		//... function body
		observable.notifyObservers();
	}
}

public class SomeGUIClass : MonoBehaviour
{
	void Start()
	{
		SomeGameplayClass obj = ...; // get the reference somehow
		obj.observable.notify += SomeGameplayClassChanged;
	}

	public void SomeGameplayClassChanged()
	{
		//... update the GUI
	}
}

I said that gameplay objects shouldn’t know anything of the GUI, but the reverse direction is important too: the GUI can (and probably has to) depend on your gameplay classes. Of course you could create interfaces for all your gameplay classes so that the GUI needn’t know anything about your gameplay implementations, but you might end up with a lot of interfaces and that adds to the complexity of your design.

For simplicity my GUI objects retrieved the whole state of the gameplay object they were listening to and refreshed everything, since my gameplay objects did not have many state variables. If they had, it would have been wiser to replace the void delegate type in MyObservable with a delegate type that takes an event parameter, from which a GUI object could determine what parts of the state it had to query and update.
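For illustration, here is what that event-carrying variant could look like, sketched in plain Java (the Observable class and its Consumer-based listener list are my own names, not code from the game):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical parameterized observable: listeners receive an event object
// describing what changed, so observers can update only the affected parts.
class Observable<E> {
	private final List<Consumer<E>> listeners = new ArrayList<>();

	void addListener(Consumer<E> l) {
		listeners.add(l);
	}

	void notifyObservers(E event) {
		for (Consumer<E> l : listeners)
			l.accept(event);
	}
}
```

A gameplay object would then call notifyObservers("health") (or a richer event type) and a GUI listener could refresh only its health bar.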

That’s all I had in mind about my tower defense game, feel free to try it out, tweak values or modify it in any way you want (links are at the top).

For those of you yet unfamiliar with Unity, give it a go, you won’t be disappointed.

Bala Piano

My piano application is finally done and available in the Google Play Store for Android phones/tablets:

A web version is up on my website at

Feel free to play around with it: in the middle are the regular keys (you can slide the mini keyboard on top to change which range of the piano is visible), and in chord mode you can press the notes at the bottom and chord variations on the right side. For the famous four-chord songs, try clicking C, G, A, F (with pow mode, or if you are fast enough: C maj, G maj, A min, F maj).

The full source code can be found on github:


It was developed in Android Studio 2.2 preview 3 using the libGDX Java game development framework:

For people unfamiliar with it, libGDX is a fairly basic library: small, but useful. It has most of the usual things needed for games:

  • lifecycle management: create, pause, resume, dispose, (resize)
  • game loop: render, Stage.act (for simulating actors in the Stage)
  • hierarchy of objects: base class is Actor, a Group is a container class for Actors, a Stage handles events and has a root Group object that has all the Actors in a hierarchy.
  • textures/sprites: libGDX can be a good choice for simple 2D games; for this purpose it handles textures, sprites, and animation, though it can be used for basic 3D as well
  • math operations: functions like clamp, and trigonometric functions (sin/cos)
  • file handling: a simple text file can be read like this:
    FileHandle file = Gdx.files.internal("file.txt");
    String text = file.readString();
  • sounds: loading and playing back mp3 files:
    FileHandle file = Gdx.files.internal("something.mp3");
    Sound s = Gdx.audio.newSound(file);;

User Interface:

When I was using an older design of my buttons, I noticed how unfairly small the top of the D key was compared to the E, and started wondering what the actual sizes were.
After some googling I found John Savard’s page on what the sizes of a piano keyboard should be, and that was my cue when I designed my new layout. According to his post, the black keys are not at all centered between the white keys, and I immediately checked on a real piano to see that he was right. A surprising number of apps get this detail wrong (including me, on my first try).

Some of the defining aspects of the piano key sizes are:

  • the top of all the keys should have the same width
  • the bottom of the white keys should have the same width
  • symmetry considerations

The first two cannot both be true: an octave has 12 key tops but only 7 white-key bottoms. From C to E there are 5 key tops over 3 white keys, so a white-key bottom would be 5/3 ≈ 1.667 key-top widths wide, but the keys on the right (F to B) have 7 key tops over 4 white keys, which dictates 7/4 = 1.75 for the same value. In the end I used the spacing at the bottom left of his page, which sets almost the same width for the keys that should match, has good symmetric dimensions, and only uses whole millimetres.
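The arithmetic is easy to check in a couple of lines of plain Java (KeyWidths is just an illustrative helper of mine):

```java
class KeyWidths {
	// White-key bottom width measured in key-top widths, for a key group
	// spanning `tops` key tops over `whites` white keys.
	static double bottomWidth(int tops, int whites) {
		return (double) tops / whites;
	}
}
```

bottomWidth(5, 3) gives the C-to-E ratio and bottomWidth(7, 4) the F-to-B ratio; since they differ, equal tops and equal bottoms cannot both hold exactly.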

It did take some time to familiarize myself with how to put together the UI I wanted in libGDX; it would have been better if I could have done it through an editor (maybe I should have used one). Thankfully I found the debug() method in the source of Actor, which draws colored lines around my objects’ borders and helped a great deal.

Programming and MIDI:

What I really liked about libGDX is that the different platforms live in separate folders: platform-specific code can be written there and used in the core project through an interface. The same can be found on the wiki:
Let me show you how this is done:

For my app, I needed to play piano notes according to the pressed buttons, thus I needed to have a structure to send when a piano key is pressed/released:

public class NoteEvent {

	public enum Type {
		NOTE_ON,
		NOTE_OFF
	}

	public Type type;
	public int pitch;
	public int channel;

	public NoteEvent(Type type, int pitch, int channel) {
		this.type = type;
		this.pitch = pitch; = channel;
	}
}

These NoteEvents are then passed through the following interface to the platform specific implementations:

public interface SoundPlayer {
	void processNoteEvent(NoteEvent ne);
}

I decided to go this way because at first I tried to use real recorded sounds (mp3) for playing back piano notes, though that changed later. I only downloaded a couple of C notes, and for the notes in between I played back the existing C sounds at faster/slower playback rates to distort their pitch.
The pitch-distortion playback rate is given by

float rate = (float) Math.pow(2.0, pitchTransform / 12.0);

where pitchTransform is the relative difference in semitones between the tone you wish to play back and the tone you have the mp3 sound for. (E.g. if you have a sound for E3 and want to play an A3 sound, then pitchTransform is 5 – see the picture at the top.)

This is because there are 12 semitones in an octave, a note one octave higher has 2x the frequency, and the tones lie on a geometric scale. That means the ratio between adjacent semitones is 2^(1/12), so a difference of n semitones corresponds to a frequency (playback rate) multiplication of 2^(n/12).
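As a sanity check on the formula (the PitchMath wrapper is just for illustration): one octave up (12 semitones) should exactly double the rate, and 7 semitones should land near 1.498:

```java
class PitchMath {
	// playback rate for shifting a sample by n semitones: 2^(n/12)
	static float playbackRate(int semitones) {
		return (float) Math.pow(2.0, semitones / 12.0);
	}
}
```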

I implemented this in RealSoundPlayer, but it didn’t work on my phone, even though it worked fine on another. This kind of distortion comes at another cost too: when you play neighboring notes that are backed by different actual mp3 files, the difference can be noticed.
I found that dropping the distortion entirely fixed these problems, but it meant I had to have a sound file for every piano key. Fortunately I found the files I needed online; after some editing and converting they sounded good, and they didn’t take up much space (~1.5 MB).

What bothered me was that there were apps in the Play Store using MIDI to play back notes, and I really liked the idea of MIDI: when you release a note the sound fades out nicely, and when you hold a note the sound lasts longer. This was of course difficult or impossible to do with mp3 files, which were played as one-shot sounds unaffected by when you released the button.

This is when the interface came into play (this was actually when I made the interface, up until this point I didn’t need it).


Desktop MIDI:

I created a DesktopMidiPlayer that played notes using the javax.sound.midi library. The good thing about it is that it’s surprisingly easy to use. Here’s how you can create midi sounds on desktop java:


	Synthesizer syn;
	MidiChannel[] mc;

	try {
		syn = MidiSystem.getSynthesizer();;
	} catch (MidiUnavailableException e) {
		e.printStackTrace();
		return;
	}

	mc = syn.getChannels();
	Instrument[] instr = syn.getDefaultSoundbank().getInstruments();
	syn.loadInstrument(instr[0]);
The synthesizer can be used to get the channels and to load instruments. In the initialization I just get the synthesizer, open it, get the channels, and load the first instrument from those available, which will be a piano.

After the initialization, making a middle C sound (pitch = 60; pitch can go from 0 to 127) on channel 0 (0-15) with a velocity of 64 (0-127) is one simple line:

mc[0].noteOn(60, 64);

Silencing a note is similar:

mc[0].noteOff(60);
I only needed to add a couple of extra lines and modifications to make my SoundPlayer implementation able to play any pitch I needed. When it was done, I just went into the desktop part of the project and passed a new DesktopMidiPlayer() reference to my core libGDX application through the constructor; from then on, I could use it like any other SoundPlayer.


Android MIDI:

The more difficult part was Android. I don’t know why, but Android does not support the javax.sound.midi library. After some searching, I found out that Android supports MIDI from Marshmallow on, but my phone, and I guess many other people’s phones, does not run Marshmallow. There had to be another way, and so I found a GitHub project that accesses a MIDI driver through JNI (C++) and also has Java bindings:

I did not know how to use the project at first; I saw that I could send byte arrays, but I didn’t know what the magic numbers in the example Activity meant. Oddly, it was the new Marshmallow API documentation that shed light on it:

byte[] buffer = new byte[32];
int numBytes = 0;
int channel = 3; // MIDI channels 1-16 are encoded as 0-15.
buffer[numBytes++] = (byte)(0x90 + (channel - 1)); // note on
buffer[numBytes++] = (byte)60; // pitch is middle C
buffer[numBytes++] = (byte)127; // max velocity
int offset = 0;
// post is non-blocking
inputPort.send(buffer, offset, numBytes);

As it turns out, the first byte of a message is the message type, and the parameter bytes follow.
For a note on/off message the type byte is 0x90/0x80 + channel, and 2 more bytes carry the pitch (middle C = 60, same as in javax.sound.midi) and the velocity with which the note is pressed/released, so these messages take up 3 bytes.
For an instrument-change message only the instrument’s byte identifier is needed as a parameter, so it’s only 2 bytes.
Changing the instrument to piano is then done with this line (using billthefarmer’s sendMidi method, which creates the byte array and sends it through JNI):

sendMidi(MidiConstants.PROGRAM_CHANGE, GeneralMidiConstants.ACOUSTIC_GRAND_PIANO);

(where MidiConstants.PROGRAM_CHANGE = 0xC0 and GeneralMidiConstants.ACOUSTIC_GRAND_PIANO = 0; other instruments are listed in the GeneralMidiConstants class)
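To make the byte layout concrete, here is a small plain-Java sketch that builds these raw messages (MidiMessages and its method names are mine, not billthefarmer’s API):

```java
class MidiMessages {
	static final int NOTE_ON = 0x90;
	static final int NOTE_OFF = 0x80;
	static final int PROGRAM_CHANGE = 0xC0;

	// 3-byte message: status byte (type + channel), pitch, velocity
	static byte[] note(int type, int channel, int pitch, int velocity) {
		return new byte[] { (byte) (type + channel), (byte) pitch, (byte) velocity };
	}

	// 2-byte message: status byte, instrument identifier
	static byte[] programChange(int channel, int instrument) {
		return new byte[] { (byte) (PROGRAM_CHANGE + channel), (byte) instrument };
	}
}
```

These arrays are exactly what gets pushed to the driver, whether through inputPort.send or through the JNI binding.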

After implementing my AndroidMidiPlayer, all that was left was to pass a new AndroidMidiPlayer() reference in the AndroidLauncher class to my constructor and the core project could use it without any troubles.

After all this, and after generating the signed apk (only about 4 MB), the Play Console said my apk was not zipaligned (meaning the zipalign tool that’s supposed to run automatically from Android Studio somehow failed). After locating the zipalign tool, I had to zipalign my apk manually, then run zipalign again on the output of the first pass; only then was the apk accepted (probably a bug in the Android SDK).

What’s missing from this app is a MIDI player for the HTML version (GWT). I use RealSoundPlayer on the web, because I did not find any simple web MIDI library with which I could synthesize piano notes. The Web MIDI API is a relatively new standard for MIDI access in the browser, but I think it’s meant for external music devices (e.g. a digital piano), not for actually producing piano sound with your computer’s hardware.

In conclusion, libGDX is certainly a viable option for simple interactive apps/games, especially if you need platform-dependent code or libraries (like MIDI in my case); for more complex projects, though, I think Unity or Unreal Engine is the way to go.

Steering behaviors

I’m going to implement basic steering behaviors in Unreal Engine, but they shouldn’t be too hard to implement in any environment. I will stick to 2D movement, though these behaviors can be adapted to 3D with a few modifications.

Steering is the process of driving a vehicle-like autonomous character (boid) with steering forces. The boid is just a simple extruded triangle for now; it has a 2D velocity with a maximum magnitude, and you change its velocity by applying a force to it. The force (a) changes the velocity (v += a * dt), then the position is updated according to the velocity (pos += v * dt).

Simple vehicle model:

– For simplicity, the boid’s velocity (simple_velocity) and the applied force are truncated to the unit circle (max magnitude is 1), and the boid has a velocity_scale float variable so that the actual velocity is simple_velocity multiplied by velocity_scale.

– The force’s effect is symmetrical.

– The boid always faces its velocity. This lets the boid turn around quickly when its velocity is low, which might be unwanted behavior.

More realistic behavior can be implemented by using a more complex model:

– The force can be applied asymmetrically where braking, acceleration and lateral steering forces have different strengths

– For the quick turnaround: limit the lateral force at low speeds, or split the velocity into a forward speed and a rotational speed (like in the UE4 flying template)

So let’s see that in Unreal Engine:

After creating a new Blueprint project (from the flying template) I made an abstract class for the boid; that can’t be done in Blueprints, so it’s in code. You can add code to the project from within the editor via File / Add Code to Project…, then choose the parent class and the name of your class (I chose DefaultPawn with the name Boid; in code an ‘A’ gets prepended to Boid because Pawn is a subclass of Actor, and the ‘A’ prefix marks Actor subclasses). The editor then creates your files with some starter content, and you can edit them in Visual Studio:

Header file:

//includes ...

UCLASS(abstract)
class BOIDS_API ABoid : public ADefaultPawn
{
	GENERATED_BODY()

public:

	UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "Steering")
	void ApplySteeringForce(const FVector2D& Force);

	UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "Steering")
	void UpdateTransform();

	UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "Steering")
	FVector2D GetEstimatedFuturePosition(float DeltaSeconds);

	UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "Steering")
	FVector2D Seek(const FVector2D& Target);

	UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "Steering")
	FVector2D Pursue(ABoid* boid);

	virtual FVector2D Seek_Implementation(const FVector2D& Target);
};


Note the abstract specifier in UCLASS(): it prevents dragging & dropping this class from the content browser into the level.

BlueprintNativeEvent means you provide a c++ implementation but you can override it in blueprints. BlueprintCallable means you can call these functions from blueprints. Category is just to help organize your functions. In my case these functions will be seen from the editor under Steering.

ApplySteeringForce sets the internal speed (or rotational) variables. UpdateTransform updates the boid’s position and orientation based on those speed variables. GetEstimatedFuturePosition is a simple guess for where the boid is going to be after some time (this can be used in pursue, and evade).

You can see in Seek’s declaration, that it has a simple 2D target position and returns a force which points in the target’s direction. The exact same function declaration can be used for Flee and Arrival.

Pursue is different because it takes the target’s estimated future position into calculation so it takes a boid pointer as a parameter.

Since I declared these methods as BlueprintNativeEvents, they must be implemented in C++ and can be overridden in a Blueprint subclass. This means you have to provide a virtual <FunctionName>_Implementation(<parameters>) for each of these functions. You can see Seek’s declaration above.

CPP file:


FVector2D ABoid::Seek_Implementation(const FVector2D& Target)
{
	return FVector2D();
}


Seek’s definition can be seen above. Since I intend to override this in a blueprint, I just return a zero vector.

After the code is done, and compiled, you can create a blueprint subclass of Boid in the editor (right click in content browser, Blueprint Class, choose your superclass).

I set up the collision sphere with locked axis Z, and the following collision matrix:



This way, the boids will not collide with each other, but they will collide with the walls in the level.

I chose Shape_Wedge_A for the static mesh component to have an extruded triangle like shape, but you could choose anything you want. I added an arrow to know which way is forward. I set collision to the no collision preset for everything that can collide except the collision sphere.



Now let’s implement the basic steering behaviors:

In the editor, under Functions, click Override and select the method you want to override (except for void methods; those appear as events, and you create their logic in the event graph).

The motor of these behaviors lies in ApplySteeringForce and UpdateTransform:



ApplySteeringForce is responsible for setting the speed variables according to a force (and the time passed). Whatever force comes in is truncated to a magnitude of 1; the force then affects the velocity (v += a * dt), which gets truncated too. As a reminder: simple_velocity is the boid’s velocity with a max magnitude of 1, and it gets scaled by velocity_scale to obtain the actual velocity.



UpdateTransform sets the position (pos += v * dt), and orientation (the boid faces its velocity’s direction).

GetEstimatedFuturePosition estimates the future position with the exact same expression (future pos = pos + v * dt).


Seek steers in the direction of a target point. It calculates the desired velocity toward the target, and the force = desired velocity – current velocity.


The bottom boid has a velocity facing left, the desired velocity is up and left, so the resulting force is up.



The way this works: if the boid’s velocity is zero and the desired velocity is (1, 0), the force will be (1, 0), so the boid gets full throttle forward. When the velocity is backwards (-1, 0), the force is (2, 0), but it gets truncated along the way in ApplySteeringForce.

When the target is reached, the boid overshoots it, because it can’t stop immediately; it then turns back, and ends up oscillating around the target.
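The seek force can be sketched in plain Java (SeekBoid is an illustrative name of mine; no engine types involved):

```java
class SeekBoid {
	double px, py; // position
	double vx, vy; // simple_velocity (max magnitude 1)

	// Steering force for seek: desired velocity - current velocity,
	// where the desired velocity is the unit vector toward the target.
	double[] seek(double tx, double ty) {
		double dx = tx - px, dy = ty - py;
		double len = Math.sqrt(dx * dx + dy * dy);
		double desiredX = len > 0 ? dx / len : 0;
		double desiredY = len > 0 ? dy / len : 0;
		return new double[] { desiredX - vx, desiredY - vy };
	}
}
```

With zero velocity the force comes out as (1, 0) toward a target on the +X axis, and with velocity (-1, 0) it comes out as (2, 0), exactly the two cases discussed above.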


Flee is the same as seek, except the desired velocity is the opposite.





Pursue uses GetEstimatedFuturePosition’s result as a target and seeks that. In the picture above you can see an example of 8 boids pursuing each other.


Evade is the same as pursue but flees from the future position instead of seeking.


Wander is random movement. It uses a circle in front of the boid with a point on it that can move a little every frame, so the wander direction can’t change too quickly, which makes this ideal for a wandering movement. The point defines the direction of the steering force.

I split the image into two parts.



The first part updates the wander angle (in degrees): wander angle = (wander angle + 1500 * dt * RandomFloatInRange(-1, 1)) % 360

1500 is just a speed parameter for how fast the wander angle changes; you have to tweak it yourself to get decent results.


Second part is calculating the point for a given wander angle:

Point relative position = forward * radius * sqrt(2) + Rotate Vector Around (Vector(1, 0, 0), wander angle, Vector(0, 0, 1)) * radius

Keep in mind that the actual circle radius (100 in the picture) does not matter this way, since the same steering force will be applied with any radius. It will matter once the center of the circle is translated forward/backward or even left/right (to make an agent tend to steer more frequently in one direction), or if the point's relative position is not normalized.
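Both parts together, as a 2D Python sketch (in 2D, rotating Vector(1, 0, 0) around the Z axis reduces to cos/sin; the 1500 change speed and the radius are the tunable parameters mentioned above, and all names are my own):

```python
import math
import random

def wander_force(forward, wander_angle_deg, dt, change_speed=1500.0, radius=1.0):
    # Part one: jitter the wander angle a little each frame.
    wander_angle_deg = (wander_angle_deg
                        + change_speed * dt * random.uniform(-1.0, 1.0)) % 360.0
    # Part two: a point on a circle whose center is pushed
    # forward by radius * sqrt(2) in front of the boid.
    a = math.radians(wander_angle_deg)
    point = (forward[0] * radius * math.sqrt(2) + math.cos(a) * radius,
             forward[1] * radius * math.sqrt(2) + math.sin(a) * radius)
    # The point's position relative to the boid is the steering force.
    return point, wander_angle_deg
```

The caller keeps the returned angle and feeds it back in on the next frame, which is what keeps the direction changes smooth.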



Arrival is a lot like seek, but instead of constantly oscillating around the target, the boid gradually slows down once it gets within a certain distance of the target, which makes for a nice smooth movement.

Consider the vector from us to the target: ToTarget = Target – position

While the ToTarget vector's length is greater than the slowing radius, the desired speed is the same as in seek. Once it is smaller, the desired speed is slowed down, nominally linearly. I slowed it down quadratically instead, because my boid seemed to overshoot the target a little and I didn't want to increase the slowing radius. I also set the desired velocity to zero once the target was really close (within 5 units for now), where it would be unnecessary to continue moving towards it.

Slowing Radius = 800

ToTarget = Target – position

Distance = ToTarget.length()

Desired velocity = (Distance > 5) ? ToTarget.normalized() : Vector(0, 0, 0)

if ( Distance < Slowing Radius ) Desired velocity *= Distance^2 / Slowing Radius^2

Force = Desired velocity – simple_velocity //same as always
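The pseudocode above translates almost line for line into a 2D Python sketch (the 800 and 5 thresholds are the values from the post; the rest of the names are mine):

```python
import math

def arrival_force(pos, simple_velocity, target,
                  slowing_radius=800.0, stop_distance=5.0):
    # ToTarget = Target - position
    tx, ty = target[0] - pos[0], target[1] - pos[1]
    distance = math.hypot(tx, ty)
    if distance > stop_distance:
        desired = (tx / distance, ty / distance)
    else:
        desired = (0.0, 0.0)  # close enough: stop moving toward the target
    if distance < slowing_radius:
        # Quadratic slowdown; the linear version would use
        # distance / slowing_radius instead.
        scale = distance ** 2 / slowing_radius ** 2
        desired = (desired[0] * scale, desired[1] * scale)
    # Force = desired velocity - simple velocity, same as always.
    return (desired[0] - simple_velocity[0], desired[1] - simple_velocity[1])
```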


Before we move on to the following behaviors, we need the boids to know which other boids are near them (sensing). In Unreal Engine you can do this in an AIController. Create an AIController for the boid if you haven't already done so: I called it BoidAIController (with AIController as the parent class). Add a PawnSensing component to it, and set its parameters.



The important parameters for our purpose are Only Sense Players (should be false), Sight Radius, Peripheral Vision Angle, and Sensing Interval. After that's done, add an array for holding the pawns (Local Pawns), and override the OnSeePawn event. (Should_update_pawns is a boolean variable tracking whether the array should be passed to the controlled boid on the next tick event.)



This adds any pawn sensed by the component to the array. The problem is that there is no OnLostSightOfPawn event (yet), so I'm going to remove any pawn from the array that shouldn't be there. I made a custom event for this which gets called periodically (every 0.1 s).


The graph might look a little chaotic, because I had to squeeze it together to fit, but it's quite simple. Losing a pawn from our sight (or sense) can mean one or more of the following:

  • Pawn is further than the sight radius
  • Pawn is behind us where we can’t see it
  • Pawn can’t be seen – meaning there is probably an obstacle (eg. a wall) blocking our view
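The same pruning logic can be sketched outside Blueprints like this (Python, 2D; can_see stands in for whatever visibility check you use, e.g. a line trace, and all names here are my own):

```python
import math

def prune_lost_pawns(self_pos, self_forward, pawns,
                     sight_radius, half_fov_deg, can_see):
    # Keep only pawns that are in range, inside the vision cone,
    # and not blocked by an obstacle.
    kept = []
    for pawn_pos in pawns:
        to_pawn = (pawn_pos[0] - self_pos[0], pawn_pos[1] - self_pos[1])
        dist = math.hypot(to_pawn[0], to_pawn[1])
        if dist > sight_radius:
            continue  # further than the sight radius
        if dist > 0:
            cos_angle = (to_pawn[0] * self_forward[0]
                         + to_pawn[1] * self_forward[1]) / dist
            if cos_angle < math.cos(math.radians(half_fov_deg)):
                continue  # behind us / outside the peripheral vision angle
        if not can_see(self_pos, pawn_pos):
            continue  # line of sight blocked by an obstacle (e.g. a wall)
        kept.append(pawn_pos)
    return kept
```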

Now only the tick event remains:



(The Update Neighbour Boids function is actually one of the public methods in the ABoid abstract class – in your implementation you can just store the boid array parameter)

When this is done, we can move on to more sophisticated behaviors.


Separation tries to keep a boid away from the boids that are close to it (I'll call them local boids). This is achieved by summing repulsive forces that point from the local boids toward our position; each is normalized and weighted by 1 / distance (or something similar, like 1 / (distance + 1), to avoid division by zero). The resulting sum can then be scaled or normalized.
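A minimal sketch of this repulsion sum (Python, 2D, using the 1 / (distance + 1) weighting mentioned above):

```python
import math

def separation_force(pos, local_boids):
    # Sum of repulsive forces pointing from each local boid toward us,
    # weighted by 1 / (distance + 1) to avoid division by zero.
    fx = fy = 0.0
    for other in local_boids:
        away = (pos[0] - other[0], pos[1] - other[1])
        dist = math.hypot(away[0], away[1])
        if dist > 0:
            weight = 1.0 / (dist + 1.0)
            fx += away[0] / dist * weight  # normalized direction * weight
            fy += away[1] / dist * weight
    return (fx, fy)
```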



A boid needs to know the neighboring boids' velocities for the alignment behavior. If you haven't already created a way to get the velocity of a boid, now is as good a time as any; you will need it for more than alignment. There are several ways:

  • public function in ABoid abstract class within which you can return current_velocity
  • just cast the boid to the blueprint class to be able to reach your variables
  • use the built-in movement component’s velocity
  • create your own movement component

I chose the third option. The current_velocity variable can be replaced by the movement component's velocity variable. By doing that, the translation part of the Update Transform event becomes useless, because the movement component handles the translation for you (I noticed this when the boid became faster after setting the movement component's velocity), so you have to take that part out and then replace current_velocity everywhere with the movement component's velocity. This way you not only eliminate the need for a cast, you also improve your component's cooperation with other components/classes, and your boid's velocity becomes available through AActor's GetVelocity function. The downside is that you don't really know much about how the movement component uses this velocity variable, so for custom velocity handling you can stick to the other options; for bigger projects the best bet is to create your own custom pawn movement component.

If you're going with the third or fourth option, don't forget to replace all occurrences of the current_velocity variable with either the GetVelocity function or the movement component's velocity variable.



Alignment means a boid tries to steer so it faces the same direction as the neighboring boids. If you can access the velocities, the desired velocity will be the average of the neighboring boids' velocities. (Alternatively, you can average the forward directions instead of the velocities.) The force direction is the same as usual: desired velocity – velocity.
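As a sketch (Python, 2D, velocities as tuples; returning a zero force with no neighbors is my own choice):

```python
def alignment_force(simple_velocity, neighbor_velocities):
    # Desired velocity = average of the neighbors' velocities;
    # force = desired velocity - our velocity.
    if not neighbor_velocities:
        return (0.0, 0.0)  # no neighbors: nothing to align with
    n = len(neighbor_velocities)
    avg = (sum(v[0] for v in neighbor_velocities) / n,
           sum(v[1] for v in neighbor_velocities) / n)
    return (avg[0] - simple_velocity[0], avg[1] - simple_velocity[1])
```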


Cohesion is about the boids getting close to each other. The target is the average position of the nearby boids, then you can seek towards that target.
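Cohesion as a sketch (Python, 2D), reusing the seek idea with the neighbors' average position as the target:

```python
import math

def cohesion_force(pos, simple_velocity, neighbor_positions):
    # Seek toward the average position of the nearby boids.
    if not neighbor_positions:
        return (0.0, 0.0)
    n = len(neighbor_positions)
    center = (sum(p[0] for p in neighbor_positions) / n,
              sum(p[1] for p in neighbor_positions) / n)
    tx, ty = center[0] - pos[0], center[1] - pos[1]
    length = math.hypot(tx, ty) or 1.0
    desired = (tx / length, ty / length)
    return (desired[0] - simple_velocity[0], desired[1] - simple_velocity[1])
```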



Combined steering behaviors:

You can combine these to achieve a higher level steering behavior, like flocking:


Combining separation, alignment, and cohesion (I added wander too) as a weighted sum (in other words, a linear combination) results in flocking. After tweaking the weights, I had a decent-looking flocking behavior.
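The weighted sum itself is just a linear combination of the individual steering forces; the weights are the knobs you tweak (any example values would be made up, so none are baked in here):

```python
def flocking_force(forces, weights):
    # Linear combination of individual steering forces
    # (e.g. separation, alignment, cohesion, wander).
    fx = sum(f[0] * w for f, w in zip(forces, weights))
    fy = sum(f[1] * w for f, w in zip(forces, weights))
    return (fx, fy)
```

The result would then go through ApplySteeringForce like any single behavior's force.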



Related documentation:

You can learn a great deal more about steering behaviors from Craig Reynolds:

For me, this JavaScript-based tutorial helped a lot too:–gamedev-12732

Unreal Engine AI pawn sensing: