Game Design Using an Evolutionary Algorithm
When to Use
Consider using an evolutionary algorithm to assist you in the game design process if the following conditions hold:

- There is something to optimize. Your game features parameters that can take various values, and you are not sure which value is best for each of them. These can be any properties of units, weapons, items, maps, settings, etc. The parameters can even be the sequence of levels (from a set of predefined ones) to include in the game.
- You can perform trials of these parameters in the game relatively fast (e.g., under 5 minutes). This can be achieved, for instance, with the help of bots, by turning off the rendering, and by increasing the internal game clock, i.e., the rate at which the game is running. Based on these trials, you can devise a score indicating how well a given set of parameters has performed in the game. This procedure is referred to as the fitness function. You can read more about it here: Evolutionary Algorithm: Designing Fitness Function.
Individual (Encoding)
In evolutionary computation, the term Individual refers to a member of a population that undergoes evolution. An individual has a genotype, which is the genetic encoding, and a phenotype, which represents the observable traits in the environment stemming from that encoding.
In Grail, the encoding corresponds to the set of parameters that you want to optimize. These can be, for example:

- Parameters of units, such as health, movement speed, etc.
- Parameters of items, such as cost, attack value, etc.
- Parameters of maps, such as indices of locations of resources, tiles used, etc.
- And many more
To declare which parameters you want to optimize using the Evolutionary Algorithm (EA), inherit from the Individual class:

C++:

```cpp
// Individual.hh
namespace grail::evolution
{
    class Individual
    {
        ...
    };
}
```

C#:

```csharp
namespace Grail.Evolution
{
    public abstract class Individual : IComparable<Individual>, IComparer<Individual>, IEquatable<Individual>, IFitnessProvider
    {
        ...
    }
}
```
In the created class, define members (fields or properties) of the EvoParam<T> type.

EvoParams

EvoParam<T> is the type for evolutionary-optimizable parameters. The template/generic type T defines the type of the values the particular parameter may store. Do not inherit from it; instead, use it directly and provide the type for your parameter.

Because the optimization process is discrete, you must also set the domain of the parameter. This is done in the constructor of this class.
For example, let’s imagine that you want to create a parameter that encodes the health of an archer unit in your game. Let’s assume that you are not sure how much health it should have, but you know it should be somewhere between 50 and 80. This can be achieved using the following construct:
C++:

```cpp
using grail::evolution::EvoParam;

EvoParam<int> health({ 50, 55, 60, 65, 70, 75, 80 });
```

C#:

```csharp
EvoParam<int> health = new EvoParam<int>(50, 55, 60, 65, 70, 75, 80);
```
You can call Value() on the parameter to get the current value it holds.

The evolutionary algorithm will change the values of EvoParams when evolving the Individuals. In a typical use case, you do not have to change them yourself!

However, if you do want to change the current value held by an EvoParam, there are several methods for it: SetNext(), SetNextClamped(), SetPrev(), SetPrevClamped(), and SetPositionIndex(). Please refer to the API documentation for their descriptions.
Please note that any EvoParam is implicitly convertible to the template/generic type it encapsulates. So you can, for example, do this:

```cpp
if(health > 21) // valid even though health is of type EvoParam<int>
{
    // do something...
}
```
Back to Individual

To create a functional Individual, we recommend following this workflow:

- Create a class that inherits from grail::evolution::Individual (C++) or Grail.Evolution.Individual (C#)
- Create members of the EvoParam<T> type
- Set these members as optimizable in the constructor of your class
- Override the Instantiate() method from Individual

Example:
C++:

```cpp
struct OptimizableArcher : public Individual
{
    EvoParam<int> health = EvoParam<int>({ 50, 55, 60, 65, 70, 75, 80 });
    EvoParam<AbilityEnum> ability = EvoParam<AbilityEnum>({ AbilityEnum::FLAME_ARROW, AbilityEnum::MULTI_ARROW, AbilityEnum::CLOAK });

    OptimizableArcher()
    {
        SetMemberParameterOptimizable(health);
        SetMemberParameterOptimizable(ability);
    }

    std::unique_ptr<Individual> Instantiate() const override
    {
        return std::make_unique<OptimizableArcher>();
    }
};
```

C#:

```csharp
public class OptimizableArcher : Individual
{
    EvoParam<int> health = new EvoParam<int>(50, 55, 60, 65, 70, 75, 80);
    EvoParam<AbilityEnum> ability = new EvoParam<AbilityEnum>(AbilityEnum.FLAME_ARROW, AbilityEnum.MULTI_ARROW, AbilityEnum.CLOAK);

    public OptimizableArcher()
    {
        SetMemberParameterOptimizable(health);
        SetMemberParameterOptimizable(ability);
    }

    public override Individual Instantiate() => new OptimizableArcher();
}
```
And voilà! You have created the optimizable search space.
Setup of the Algorithm (EAOptimizer)

The interface class for the evolutionary algorithm is EAOptimizer.

First, construct the EAOptimizer object. The constructor looks as follows:

C++:

```cpp
// EAOptimizer.hh
// namespace grail::evolution
EAOptimizer(std::vector<std::unique_ptr<Individual>>& initialPopulation,
            size_t maxEpochCount,
            std::unique_ptr<Crossover> crossover = std::make_unique<Crossover>(),
            std::unique_ptr<Mutation> mutation = std::make_unique<Mutation>(),
            std::unique_ptr<Selection> selection = std::make_unique<Selection>(),
            std::mt19937_64::result_type seed = std::random_device{}());
```

C#:

```csharp
// namespace Grail.Evolution
public EAOptimizer(List<Individual> initialPopulation,
                   uint maxEpochCount,
                   Crossover crossover = null,
                   Mutation mutation = null,
                   Selection selection = null)
```
Below, we will comment on the arguments of the constructor:
Initial Population

These are the starting individuals, i.e., the ones with the starting sets of parameter values.

Unless you have specific knowledge about the optimized problem, it is a good idea to generate a random initial population. You can do this by calling the GetRandomRealizations(random, count) method on any Individual instance. So, you can create an arbitrary Individual and call this method.

Alternatively, you can prepare a collection of diverse individuals by hand.

Usually, the worst idea is to just clone one Individual without any randomization, which you can do by calling CloneMany(count). If you go this route, please consider setting a high probability of mutation, so the individuals will diverge.

In most project reports and scientific papers, the sizes of the populations varied between 20 and 300.
MaxEpochCount
This is the maximum number of full iterations after which the algorithm will stop making any further optimizations. In one full iteration, each individual is evaluated and has a chance to be affected by the crossover and mutation operators. The iteration ends with selecting the next population.
Crossover, Mutation, and Selection

The crossover, mutation, and selection objects are optional. If they are not provided, the default ones will be used. We have prepared the default objects in such a way that they should work well in most scenarios.

We have dedicated a separate page to these objects, covering how to set their various parameters as well as more general information about the evolutionary operators: Evolutionary Operators.
Running the Algorithm

Use the EAOptimizer method RunInteractive() to either start the evolutionary process or continue it. This method will run until an evaluation of an Individual is required or the algorithm comes to a natural end. You provide from 1 to 3 callback functions:

- void OnEvaluateNeeded(individual) - a typical implementation will apply the current set of parameter values held by the individual and set the state of the game to begin the simulation. When the simulation ends, simply set the fitness score of this individual based on the outcome of the simulation:
C++:

```cpp
void OnEvaluateNeeded(Individual& individualToEvaluate)
{
    // cache individualToEvaluate and start the simulation
}

void EvaluationReady()
{
    cachedIndividual.SetFitness(score);
}
```

C#:

```csharp
void OnEvaluateNeeded(Individual individualToEvaluate)
{
    // cache individualToEvaluate and start the simulation
}

void EvaluationReady()
{
    cachedIndividual.Fitness = score;
}
```
Once you set the fitness value, you can call RunInteractive() again and the whole process repeats.

- void OnNewEpoch() - this callback will be invoked after each whole iteration of the algorithm is completed.
- void OnFinished() - this callback will be invoked when the MaxEpochCount of iterations, provided in the constructor, has passed.

Please note that since you are manually advancing the progress of the evolutionary algorithm by calling the RunInteractive() method, you can decide at any point to terminate the algorithm before the maximum number of epochs is reached. For instance, you can call GetBestIndividual() after each step and check whether it already satisfies your needs.
Simple Example
The example below presents a complete implementation of a very simple evolutionary algorithm with fitness calculation performed directly based on the structure of the individual (without simulating a game). This rudimentary example serves only as an illustration of how to write a loop for executing the algorithm.
C++:

```cpp
struct CarEncoding : public Individual
{
    EvoParam<double> Acceleration = EvoParam<double>({ 3.5, 4.0, 5.6, 7.0, 10.0 });
    EvoParam<double> Handling = EvoParam<double>({ 0.4, 0.5, 0.6, 0.9, 1.0 });
    std::vector<EvoParam<double>> WeightsFrontBack = { { 550.0, 622.0, 744.0 },
                                                       { 320.0, 433.0, 665.0 } };

    CarEncoding()
    {
        SetMemberParameterOptimizable(Acceleration);
        SetMemberParameterOptimizable(Handling);
        SetMemberParameterOptimizable(WeightsFrontBack);
    }

    std::unique_ptr<Individual> Instantiate() const override
    {
        return std::make_unique<CarEncoding>();
    }
};

class Optimization
{
    bool forceTermination = false;
    static const int POPULATION_SIZE = 100;
    static const int EPOCH_COUNT = 50;
    std::unique_ptr<EAOptimizer> optimizer;

public:
    void Perform()
    {
        CarEncoding individual{};
        auto population = individual.CloneMany(POPULATION_SIZE);
        optimizer = std::make_unique<EAOptimizer>(population, EPOCH_COUNT);
        while(forceTermination == false)
        {
            optimizer->RunInteractive([this](Individual& i) { OnEvaluateNeeded(i); },
                                      [this]() { OnNewEpoch(); },
                                      [this]() { OnFinished(); });
        }
    }

private:
    void OnEvaluateNeeded(Individual& individualToEvaluate)
    {
        auto& car = static_cast<CarEncoding&>(individualToEvaluate);
        auto totalWeight = std::accumulate(car.WeightsFrontBack.begin(), car.WeightsFrontBack.end(), 0.0);
        car.SetFitness((car.Handling * car.Acceleration) / totalWeight);
    }

    void OnNewEpoch()
    {
        forceTermination = optimizer->CalculatePopulationDiversity() < 0.2;
    }

    void OnFinished()
    {
        forceTermination = true;
    }
};
```
C#:

```csharp
public class CarEncoding : Individual
{
    public EvoParam<double> Acceleration = new EvoParam<double>(3.5, 4.0, 5.6, 7.0, 10.0);
    public EvoParam<double> Handling = new EvoParam<double>(0.4, 0.5, 0.6, 0.9, 1.0);
    public List<EvoParam<double>> WeightsFrontBack = new() { new(550.0, 622.0, 744.0),
                                                             new(320.0, 433.0, 665.0) };

    public CarEncoding()
    {
        SetMemberParameterOptimizable(Acceleration);
        SetMemberParameterOptimizable(Handling);
        SetMemberParameterOptimizable(WeightsFrontBack);
    }

    public override Individual Instantiate() => new CarEncoding();
}

public class Optimization
{
    private bool forceTermination = false;
    private const int POPULATION_SIZE = 100;
    private const int EPOCH_COUNT = 50;
    private EAOptimizer optimizer;

    public void Perform()
    {
        optimizer = new(new CarEncoding().CloneMany(POPULATION_SIZE), EPOCH_COUNT);
        while(forceTermination == false)
        {
            optimizer.RunInteractive(OnEvaluateNeeded, OnNewEpoch, OnFinished);
        }
    }

    void OnEvaluateNeeded(Individual individualToEvaluate)
    {
        CarEncoding car = (CarEncoding)individualToEvaluate;
        var totalWeight = car.WeightsFrontBack.Sum(w => (double)w);
        car.Fitness = (car.Handling * car.Acceleration) / totalWeight;
    }

    void OnNewEpoch()
    {
        forceTermination = optimizer.CalculatePopulationDiversity() < 0.2;
    }

    void OnFinished()
    {
        forceTermination = true;
    }
}
```
Obtaining the Results

Once the algorithm has completed (or at any point, really), you can inspect the current results.

You can make sure the algorithm has completed by providing the onFinished callback to RunInteractive().

The following methods of EAOptimizer are helpful for various tasks:

- GetBestIndividual() - returns the Individual with the highest fitness in the population. If the algorithm has converged, it encodes the best set of parameters.
- GetPopulation() - returns a copy of the current population, sorted by fitness. You might want to use it both to check the results and for algorithm control flow or diagnostic purposes.
- CalculatePopulationDiversity() - returns the diversity ratio as a number in [0, 1]. The lower the value, the more homogeneous the population is. For example, 0 indicates a population of identical Individuals. This method is useful for testing and diagnosing whether the current setup of the algorithm works as intended.
- CalculatePopulationIdenticalFrontCount() - similar to diversity, but instead this method finds the most frequent Individuals (which are identical to each other) and returns their count.
- CalculatePopulationIdenticalFrontRatio() - equal to CalculatePopulationIdenticalFrontCount() divided by the population size.
In addition, you can also save the state of EAOptimizer to disk; the saved state contains the current population.
Saving and Loading

Saving the progress of the evolutionary algorithm is very simple. It only requires calling the SerializeState(filename) method on the EAOptimizer instance. If the provided file already exists, it will be overwritten.

For example, you can do it after each evaluation of an Individual, as saving is a relatively cheap operation compared to running a simulation of the game.
This is useful in many scenarios:

- To mitigate the risk of power failures, hardware errors, game crashes, and other unexpected shutdowns.
- When your computing resources are not available continuously.
- When your game simulations require human players, so you want to save progress after each game and resume whenever a new participant is ready.
- To have easy access to the current population, as it is serialized to disk.
To load the saved progress, simply call the DeserializeState(filename) method on the EAOptimizer instance.

Please note that to load progress, you have to first create an instance of EAOptimizer. Its initial population will be overwritten by the serialized one, but it is important that it contains at least one individual whose type matches the type of Individual used in the serialized population. This is because the EA creates new Individuals after deserialization using the Instantiate() method on the proper type.