Please contact Gusz Eiben for further information.
Evolutionary processes depend on many factors, and so do evolutionary algorithms: their performance depends on their parameters. Much past research has sought "optimal" settings for EAs, typically considering a limited set of parameters, often only a single one. The aim of this project is to invent and study mechanisms that "optimise" all parameters of an EA on the fly, while the EA is running. The project is divided into three sub-projects, each a full-size Masters Thesis project in its own right. The division is based on the problem context in which the evolutionary algorithm runs: optimisation, modelling (machine learning), and simulation. The three students are required to carry out their own research and to deliver their own theses, but in cooperation with one another.
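One classic way to control parameters on the fly is self-adaptation: a strategy parameter such as the mutation step size is stored in the genome and evolves together with the solution. The sketch below illustrates this idea; the log-normal update rule is standard in evolution strategies, but the constants and names are illustrative, not part of the project specification.

```python
import random

def mutate(individual):
    """Self-adaptive mutation: the individual carries its own step size."""
    solution, sigma = individual
    # First perturb the strategy parameter (log-normal rule), then use the
    # new step size to perturb the solution values.
    sigma = max(1e-4, sigma * random.lognormvariate(0, 0.2))
    solution = [x + random.gauss(0, sigma) for x in solution]
    return (solution, sigma)

# An individual is a pair: (solution vector, mutation step size sigma).
parent = ([0.5, -1.2, 3.0], 0.1)
child = mutate(parent)
```

Because well-adapted step sizes tend to produce fitter offspring, selection implicitly tunes sigma without any external schedule.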
Optimisation is a core task that EAs perform. Many applications (e.g. constraint satisfaction, machine learning) are formulated as optimisation problems in order to make them suitable for EAs. This project will study EA behaviour on "typical" optimisation problems or fitness landscapes. A test suite is still to be chosen, but it should be either a carefully selected collection of objective functions from the literature or a test-problem generator. Several mechanisms for on-line parameter control are to be implemented and tested on the suite, with the explicit intention of generating generalisable knowledge. That is, we want to discover patterns of EA behaviour that apply to a category of problems rather than to single fitness functions.
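As one concrete example of on-line parameter control on a standard objective function, the sketch below runs a (1+1) evolution strategy on the sphere function and adapts the mutation step size with Rechenberg's 1/5 success rule: if more than a fifth of recent mutations improve the parent, the step size grows, otherwise it shrinks. The adaptation interval and multipliers are illustrative choices.

```python
import random

def sphere(x):
    """Sphere benchmark: minimum 0 at the origin."""
    return sum(v * v for v in x)

def one_plus_one_es(dim=5, generations=500, sigma=1.0):
    parent = [random.uniform(-5, 5) for _ in range(dim)]
    successes = 0
    for g in range(1, generations + 1):
        child = [x + random.gauss(0, sigma) for x in parent]
        if sphere(child) < sphere(parent):   # keep the child only if better
            parent = child
            successes += 1
        if g % 20 == 0:                      # adapt sigma every 20 trials
            rate = successes / 20
            sigma *= 1.5 if rate > 0.2 else 0.82
            successes = 0
    return parent, sigma

best, final_sigma = one_plus_one_es()
```

Running the same control mechanism over a whole family of such functions, rather than one instance, is exactly the kind of generalisable experiment this sub-project is after.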
In the machine-learning context a fundamental issue is feature selection. For classification problems, one has to find a classifier and a subset of the attributes such that the classifier has a small error on unseen examples. Genetic algorithms have been successfully applied to the feature-selection problem. A GA for feature selection may use a simple bit-string representation in which each attribute is represented by one bit: a 1 indicates that the attribute is selected, a 0 that it is not. The GA evolves a population of candidate solutions (each representing a subset of the attributes) using suitable genetic operators. The fitness of a chromosome is computed by training a classifier (for instance, a k-nearest-neighbour classifier or a neural network) using only the features selected by that chromosome (that is, only the attributes whose bit equals 1) and evaluating the classifier on a test set. The GA parameters are generally chosen after a small number of experiments. In this Masters project, you will investigate the effect of self-adapting parameters on the performance of the GA.
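The fitness evaluation described above can be sketched as follows: a bit string masks the attributes, and fitness is the accuracy of a simple 1-nearest-neighbour classifier restricted to the selected attributes. The data set here is a synthetic toy (one informative attribute, two noise attributes) used only to make the sketch self-contained.

```python
import random

def one_nn_accuracy(train, test, mask):
    """Accuracy of 1-NN using only the attributes where mask bit == 1."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y, m in zip(a, b, mask) if m)
    correct = 0
    for features, label in test:
        nearest = min(train, key=lambda t: dist(t[0], features))
        correct += (nearest[1] == label)
    return correct / len(test)

def fitness(chromosome, train, test):
    if not any(chromosome):          # an empty attribute subset is useless
        return 0.0
    return one_nn_accuracy(train, test, chromosome)

def sample(n):
    """Toy data: the class is carried by attribute 0; attributes 1-2 are noise."""
    data = []
    for _ in range(n):
        label = random.choice([0, 1])
        data.append(([label + random.gauss(0, 0.1),
                      random.gauss(0, 1), random.gauss(0, 1)], label))
    return data

train, test = sample(60), sample(30)
acc_good = fitness([1, 0, 0], train, test)   # informative attribute only
acc_noise = fitness([0, 1, 1], train, test)  # noise attributes only
```

In the actual project this evaluation would sit inside the GA loop, with the GA's own parameters (mutation rate, crossover rate, and so on) being the objects of self-adaptation.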
When speaking of evolutionary algorithms in the context of artificial societies, the parameters and operators of these algorithms carry some societal meaning. For example, mutation, crossover, and population size each have their counterpart in human society as we know it. Self-calibrating versions of these algorithms have the ability to adjust and tune the parameters appropriately as prevailing circumstances demand. In artificial societies, self-calibration may mean that the society has some way of controlling these parameters autonomously, without intervention by the person running the experimental society or the person who designed it. We may, for example, think of a technique to control the size of the population in the society so as to prevent it from growing indefinitely. In this Masters project, you will research the effects of self-calibrating parameters in evolving artificial societies.
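The population-size example can be sketched as a simple feedback rule: agents reproduce with a probability that falls as the population approaches a carrying capacity, so the size regulates itself without outside intervention. The logistic-style rule and all constants below are illustrative assumptions, not a prescribed design for the project.

```python
import random

def step(population, capacity=200, birth_rate=0.3, death_rate=0.1):
    """One time step: random deaths, then crowding-limited reproduction."""
    survivors = [a for a in population if random.random() > death_rate]
    pressure = len(survivors) / capacity       # crowding feedback in [0, ...)
    offspring = [a for a in survivors
                 if random.random() < birth_rate * max(0.0, 1 - pressure)]
    return survivors + offspring

society = list(range(20))                      # 20 initial agents
for _ in range(100):
    society = step(society)
```

With these constants the population settles where births balance deaths, i.e. where birth_rate * (1 - pressure) equals death_rate, well below runaway growth; the open question for the project is how such feedback rules interact with the evolutionary dynamics of the society itself.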