The theory of evolution has helped biologists and ecologists to understand more about the natural world in general, and psychologists and social scientists to understand more about ourselves in particular. However, it has also taught a thing or two to computer scientists.

### Hard problems

In computer science, there are many problems that are known to be “hard”. What this means is that no efficient method is known to solve such problems exactly. In practice, these problems can only be solved approximately, and there is no guarantee that such an approximate solution is even close to the (unknown) optimal solution.

An example of such a hard problem is the *subset sum problem*. For this problem you are given a list of *N* values (*a*_{1}, *a*_{2}, …, *a*_{N}) and a target value *T*. The question is whether there is a *subset* of these *N* values whose sum is exactly equal to *T*. For example, for the list of *N*=5 values (5, 1, 10, 8, 2) and a target value *T*=17, the answer is “yes”, since the sum of the subset (*a*_{1}, *a*_{3}, *a*_{5}) is 5+10+2=17. However, for a target value *T*=22 the answer is “no”.

Of course, this example is easy to solve, as the list of values is short enough that you can try out all possible subsets and see if there is at least one that adds up to the target value *T*. Since each of the five values is either in a given subset or not, there are 2×2×2×2×2=2^{5}=32 possible subsets. Calculating the sum of each of these 32 subsets quickly shows that there is exactly one that adds up to 17 (the subset given above), but none that add up to 22 (we’ll leave this as an exercise for the motivated reader).
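This exhaustive check is easy to sketch in code. The following minimal brute-force sketch (the function name `subset_sum_exists` is our own) tries every possible subset of the example list:

```python
from itertools import combinations

def subset_sum_exists(values, target):
    """Exhaustively check all 2^N subsets for one summing to target."""
    n = len(values)
    for size in range(n + 1):
        for subset in combinations(values, size):
            if sum(subset) == target:
                return True
    return False

values = [5, 1, 10, 8, 2]
print(subset_sum_exists(values, 17))  # True: 5 + 10 + 2 = 17
print(subset_sum_exists(values, 22))  # False: no subset sums to 22
```

For *N*=5 this runs instantly, but the number of subsets doubles with every added value.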

However, what if we have *N*=100 values? Now there are a total of 2^{100}≈1.27×10^{30} possible subsets. Given that the age of the universe is estimated to be about 4.4×10^{17} seconds, even if we could calculate the sum of thousands of subsets each second, the lifetime of the universe would still not have been nearly enough to evaluate *all* possible subsets! Yet, hard problems like subset sum, which has practical applications in data security and cryptography, are something that computer scientists have to deal with on a daily basis.

### Evolving solutions

Consider an optimization version of the subset sum problem where, rather than a “yes” or “no” answer, we want to find a subset of which the sum is *as close as possible* to the target value *T*, but not larger. In the example given above, for the target value *T*=22, the closest possible subset sum not larger than *T* would be 21, given by the subset (*a*_{2},*a*_{3},*a*_{4},*a*_{5}). This optimization variant of the subset sum problem is, in general, still hard for large *N*, as there is no efficient way to find the optimal solution. However, we can use a clever computer algorithm to *evolve* an approximate solution, using ideas from real biological evolution.

**Genetic representation**

First, we need a convenient way to represent candidate solutions (i.e., subsets) to the given problem. In our example, we can use bit strings as a representation. A *bit string* is simply a string of 0s and 1s, for example 10101. Such a bit string can be “translated” into a possible subset as follows: if the bit in position *i* is 1, then the value *a*_{i} is in the subset; otherwise it is not. So, the bit string 10101 represents the subset (*a*_{1}, *a*_{3}, *a*_{5}). This way, there is a one-to-one correspondence between all possible bit strings and all possible subsets. This bit string representation can be interpreted as a *genotype*, with the corresponding subset as the *phenotype*.
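As a small illustration, this genotype-to-phenotype translation might look as follows in Python (the helper name `decode` is our own):

```python
def decode(bitstring, values):
    """Translate a genotype (bit string) into its phenotype (subset)."""
    return [a for bit, a in zip(bitstring, values) if bit == "1"]

values = [5, 1, 10, 8, 2]
print(decode("10101", values))  # [5, 10, 2], i.e. the subset (a1, a3, a5)
```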

**Fitness function**

Next, we need a *fitness function* that assigns a *fitness value* to a given bit string (genotype) indicating how well the candidate solution (phenotype) represented by that bit string solves the given problem. In our example, this fitness value is simply the subset sum, or 0 if that sum is larger than the target value *T*. So, the bit string 10101 will have a fitness value of 17 (the sum of the subset it represents), whereas the bit string 11111 will have a fitness value of 0, as the sum of the subset it represents (26) is larger than the target value *T*=22. These ideas of a genetic representation and a fitness function are illustrated in the table below.
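A minimal sketch of such a fitness function, using the example values and *T*=22 (the function name `fitness` is our own choice):

```python
def fitness(bitstring, values, target):
    """Subset sum of the represented subset, or 0 if it overshoots the target."""
    s = sum(a for bit, a in zip(bitstring, values) if bit == "1")
    return s if s <= target else 0

values = [5, 1, 10, 8, 2]
print(fitness("10101", values, 22))  # 17 (sum of the subset 5 + 10 + 2)
print(fitness("11111", values, 22))  # 0 (sum 26 exceeds T = 22)
```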

**An example of a genetic representation and fitness function.** Bit strings (genotype) represent subsets (phenotype) which are assigned a fitness value according to the given fitness function.

**Population**

Finally, we need to construct an *initial population* of candidate solutions (subsets), as represented by their corresponding bit strings. For this, we can simply choose a certain number (say 100) of random bit strings. The individuals in the population are then assigned a fitness value using the given fitness function. For problems with *N*=100 or even more, a population of size 100 represents only a tiny fraction of all 2^{N} possible subsets.
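Creating such a random initial population is short to write down; a Python sketch (the function name is our own):

```python
import random

def random_population(pop_size, n):
    """Create pop_size random bit strings of length n."""
    return ["".join(random.choice("01") for _ in range(n))
            for _ in range(pop_size)]

population = random_population(100, 5)  # 100 random candidate subsets
```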

**Selection**

With our (random) initial population constructed, we can now let this population evolve as follows. At discrete time steps, or *generations*, a new population is constructed by selecting *parents* from the current population, and creating *offspring* by (randomly) recombining and mutating these parents’ genotypes. Parents are selected from the current population with probabilities proportional to their fitness values, and with replacement. In other words, individuals with higher fitness values (relative to the population average) have a higher chance of being selected (and possibly more than once), while individuals with low fitness values may not be selected at all.
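Fitness-proportional selection with replacement can be sketched using Python’s standard library: `random.choices` draws with replacement according to per-individual weights. The tiny offset below is our own guard so that the weights stay valid even if every fitness in the population happens to be 0:

```python
import random

def select_parents(population, fitnesses, k):
    """Fitness-proportional (roulette-wheel) selection, with replacement.
    The tiny offset keeps the weights valid even if every fitness is 0."""
    weights = [f + 1e-9 for f in fitnesses]
    return random.choices(population, weights=weights, k=k)

# Individuals with higher fitness are (much) more likely to be picked:
parents = select_parents(["00000", "11111"], [0, 10], k=20)
```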

**Crossover**

For each pair of selected parents, two offspring individuals are created in a two-step procedure. The first step is *crossover*, in which the genotypes of the selected parents are randomly recombined. Suppose the current pair of parents are 00000 and 11111. After choosing a random *crossover point*, the parts before and after this crossover point in these two genotypes are exchanged to create two offspring individuals. For example, if the crossover point happens to be (randomly) chosen between the second and third bits, the two offspring individuals will be 00111 and 11000.
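One-point crossover is only a few lines of code; a sketch (the function name is our own):

```python
import random

def crossover(parent1, parent2):
    """One-point crossover: swap the tails after a random cut point."""
    point = random.randint(1, len(parent1) - 1)
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

# If the cut point happens to fall between the second and third bits
# (point == 2), crossover of 00000 and 11111 yields 00111 and 11000.
children = crossover("00000", "11111")
```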

**Mutation**

The second step is *mutation*. Before the newly created offspring are placed in the new population, forming the next generation, occasionally (i.e., with low probability) a random bit is flipped. For example, the offspring individual 00111 may be mutated to 001**0**1. In this case, the fourth bit (in bold) has been mutated from a 1 to a 0. These ideas of crossover and mutation are illustrated in the figure below.
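Mutation is equally simple to sketch: flip each bit independently with some small probability (the function name and default rate are our own choices):

```python
import random

def mutate(bitstring, rate=0.01):
    """Flip each bit independently with a small probability."""
    return "".join(
        ("1" if bit == "0" else "0") if random.random() < rate else bit
        for bit in bitstring
    )
```

With `rate=0` nothing changes and with `rate=1` every bit flips, which makes the function easy to sanity-check.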

**An example of crossover and mutation on bit string genotypes.** The crossover point is (randomly) chosen between the second and third bits (dashed vertical line), and by chance the fourth bit of the first offspring individual is mutated (in bold).

Once the new population has been filled up with the offspring individuals, fitness values are again assigned, and the process of selection, crossover, and mutation is repeated to create subsequent generations. This way, better and better (approximate) solutions to the given problem can evolve over time. Therefore, this procedure is known as an *evolutionary algorithm*, and can be easily implemented on a computer.
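Putting the pieces together, a complete (if minimal) version of the evolutionary algorithm described above might look like this in Python. All names and parameter choices are our own; this is a sketch under the assumptions stated in the comments, not a tuned implementation:

```python
import random

def evolve(values, target, pop_size=100, generations=50, mutation_rate=0.01):
    """A minimal genetic algorithm for the subset sum optimization problem."""
    n = len(values)

    def fitness(bits):
        s = sum(a for b, a in zip(bits, values) if b == "1")
        return s if s <= target else 0

    # Random initial population of bit strings of length n.
    population = ["".join(random.choice("01") for _ in range(n))
                  for _ in range(pop_size)]
    best = max(population, key=fitness)

    for _ in range(generations):
        # Small offset so selection still works if every fitness is 0.
        weights = [fitness(ind) + 1e-9 for ind in population]
        next_gen = []
        while len(next_gen) < pop_size:
            # Fitness-proportional selection of two parents, with replacement.
            p1, p2 = random.choices(population, weights=weights, k=2)
            # One-point crossover at a random cut point.
            point = random.randint(1, n - 1)
            for child in (p1[:point] + p2[point:], p2[:point] + p1[point:]):
                # Mutation: flip each bit with low probability.
                child = "".join(("1" if b == "0" else "0")
                                if random.random() < mutation_rate else b
                                for b in child)
                next_gen.append(child)
        population = next_gen[:pop_size]
        # Keep track of the best candidate seen so far.
        best = max(population + [best], key=fitness)

    return best, fitness(best)

# Example run with our own (arbitrary) random seed for reproducibility.
random.seed(0)
values = [random.randint(1, 1000) for _ in range(100)]
best, best_fit = evolve(values, target=45000)
```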

### An example

Consider again the optimization version of the subset sum problem described above, with *N*=100, the *a*_{i} values being random numbers between 1 and 1000 (drawn uniformly and independently), and a target value *T*=45000. So, the problem consists of finding a subset among the *N*=100 random values of which the sum is as close as possible to *T*=45000, but not larger.

Implementing an evolutionary algorithm for this problem, and running it for 50 generations on a population consisting of 100 bit strings, each of length *N*=100, gives a result as shown in the figure below. The main (larger) plot shows how the best fitness value in the population changes over time, as indicated by the blue line. Starting from a best fitness (subset sum) of around 30000 in the random initial population, it quickly increases (largely due to crossover), eventually resulting in a fitness that is just short of the target value *T*=45000 (indicated by the dotted line in the zoomed-in plot in the inset).

**Evolving an approximate solution to the subset sum problem.** The main (larger) plot shows the best fitness (blue line) in the population over time, starting from a random initial population of size 100, for a total of 50 generations. The (smaller) inset shows a zoomed-in plot of the last 20 generations.

Note that the fact that the final fitness does not quite equal the target value *T*=45000 does not necessarily mean that there are no subsets that have a sum exactly equal to this target value. For example, as the inset in the figure shows, small improvements (mostly due to mutation) were still found in the later generations. Perhaps if the evolutionary algorithm were run for even longer, a fitness value equal to *T* might have been found. But of course we cannot be sure, and it is up to us to decide for how long to run the algorithm.

Finally, to show that this approximate (but very good!) solution is indeed due to the evolutionary process, the dotted line in the main plot (at a fitness value around 37600) represents the best fitness found in a random subset sample of size 5000. This sample size is equal to the total number of fitness function evaluations performed during the run of the evolutionary algorithm (100 individuals in the population times 50 generations). The best solution from the random sample is far below the best solution found by the evolutionary algorithm, even though both methods were allowed the same number of fitness function evaluations. This clearly shows the power of the evolutionary approach in solving hard optimization problems.
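The random-sample baseline is easy to reproduce in a sketch: draw as many random bit strings as the algorithm had fitness evaluations and keep the best. The seed and names below are our own, so the resulting number will differ from the figure, since the random values differ:

```python
import random

random.seed(0)  # our own arbitrary seed, for reproducibility
values = [random.randint(1, 1000) for _ in range(100)]
target = 45000

def fitness(bits):
    s = sum(a for b, a in zip(bits, values) if b == "1")
    return s if s <= target else 0

# Draw 5000 random bit strings -- the same budget of 100 * 50 = 5000
# fitness evaluations the evolutionary algorithm used -- and keep the best.
sample = ["".join(random.choice("01") for _ in range(100)) for _ in range(5000)]
best_random = max(fitness(s) for s in sample)
```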

### Evolutionary algorithms in practice

Several variants of evolutionary algorithms were introduced independently as early as the 1960s and 1970s. They went by similar but slightly different names, such as genetic algorithms, genetic programming, evolutionary programming, or evolution strategies. The main differences between these variants are in the type of genetic representation they use, and in whether they use both crossover and mutation, or mutation only. However, what they all have in common is that they evolve a population of candidate solutions subject to fitness-proportional selection and some form of replication with mutation. Today they are commonly known under the umbrella term *evolutionary algorithms*.

Their use and popularity increased drastically during the 1980s, when computers became fast and accessible enough to apply these algorithms to large-scale real-world problems. Early successes were obtained in areas such as gas-pipeline optimization and many problems in logistics and combinatorial optimization. They were also successfully applied to engineering problems, such as the design of airplane wings or antenna shapes. In fact, for many of these problems the evolutionary algorithms found (approximate) solutions that outperformed those designed by human engineers! These days, evolutionary algorithms are a standard item in any computer scientist’s toolkit.

**An evolved antenna.** An oddly shaped antenna for NASA’s ST-5 mission, evolved by an evolutionary algorithm. Image source: lunar.org.

Many variations on the basic operators of selection, crossover, and mutation have been proposed in the literature over the years, some of which increase the performance of the algorithms even more, depending on the problem at hand. Some of these proposed variations are explicitly non-biological and are purely aimed at making the computer algorithms more efficient, faster, or able to find even better solutions. Of course, this is perfectly fine (and even desirable) when these algorithms are strictly used as problem solvers. However, the basic evolution-inspired algorithm is also used by scientists as a computational model of real biological evolution to study some of its properties and consequences. An example of this will be presented in a future article.

Of course, this cross-fertilization between different scientific fields (in this case biology and computer science) is very interesting and encouraging, but can also lead to some confusion. For example, some readers may have noticed the somewhat unusual use of the term “fitness” in evolutionary algorithms. In evolutionary biology, fitness is a measure of reproductive success (measured, e.g., in number of offspring). In contrast, in evolutionary algorithms fitness is a measure of how well a candidate solution solves the given problem, which then (stochastically) *determines* reproductive success through the selection operator. This happens more often when researchers from one scientific field start using terminology from another field, perhaps not always fully understanding the correct meaning. But by now this alternative use of the term “fitness” in evolutionary algorithms has stuck, and most likely will not change anymore. Biologists will simply have to live with it!

For a detailed introduction and overview of evolutionary algorithms, the book *An Introduction to Genetic Algorithms* by computer scientist Melanie Mitchell is highly recommended. It provides an easy-to-read and non-technical overview of the basic ideas and application areas of genetic algorithms, the best-known variant of an evolutionary algorithm (which is also the variant described and used here).

Finally, I will leave you with what I still consider one of the most original and beautiful examples of the usefulness of evolution as a problem solver. In the early 1990s, computer scientist Karl Sims used one of the most powerful supercomputers available at that time, a Connection Machine CM-5 of the Thinking Machines Corporation, and programmed an evolutionary algorithm to evolve simple creatures that could walk, swim, or jump in a virtual world with realistic physical properties (such as gravity, friction, or viscosity). His most interesting results were collected in a short movie with spoken commentary and explanations. Behold the power of evolution…