Chapter 4

Evolutionary algorithms

When you look around at life on Earth, nothing popped into existence fully formed. Everything alive today is the result of a long chain of tiny changes that accumulated over millions of years, which means the physical and cognitive characteristics of every living organism are the product of adaptation to its environment. We often make the mistake of thinking of evolution as a neat line from “primitive ancestor” to “modern form”. In reality, evolution is messy and branching. Offspring are not perfect clones of their parents; they inherit a mix of genes with small random changes (mutations). At any moment, a species is actually a cloud of variants, not one single clean shape, and the big, obvious differences appear only when you zoom far out in time and compare averages across thousands of generations. So evolution is both simple and wild: variation is constantly produced, most variants disappear, and some variants thrive. In doing so, nature essentially “searches” enormous spaces of possibility for the best fit. Evolutionary algorithms copy exactly this logic to search huge solution spaces in computing: generate diverse candidates, select the better ones, and iterate over generations.

Evolution encompasses the idea that in a population of a species, pairs of organisms reproduce. The offspring are a combination of the parents’ genes, but small changes are made in that offspring through a process called mutation. Then the offspring become part of the population. Not all members of a population live on, however. As we know, disease, injury, and predation cause individuals to die. Individuals that are better adapted to the environment around them are more likely to live on, a situation that gave rise to the term survival of the fittest. Based on Darwinian evolution theory, a population has the following attributes:

  • Variety — Individuals in the population have different genetic traits.
  • Heredity — A child inherits genetic properties from its parents.
  • Selection — A mechanism that measures the fitness of individuals.

Stronger individuals have the highest likelihood of survival (survival of the fittest).

The Knapsack Problem

Consider the Knapsack Problem, a classic problem used in computer science to explore how algorithms work and how efficient they are. In the Knapsack Problem, a knapsack has a specific maximum weight that it can hold. Several items are available to be stored in the knapsack, and each item has a different weight and value. The goal is to fit as many items into the knapsack as possible so that the total value is maximized and the total weight does not exceed the knapsack’s limit. The physical size and dimensions of the items are ignored in the simplest variation of the problem.

One way to solve this problem is to use a brute-force approach: enumerate every possible combination of items, determine the value of each combination that satisfies the knapsack’s weight constraint, and pick the best solution.

That sounds reasonable until the search space grows. With just 26 items, there are 2^26 (more than 67 million) possible item combinations. For larger optimization problems, brute force stops being a practical option very quickly, which is exactly why heuristic search strategies such as genetic algorithms are useful.
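To see why brute force explodes, the following sketch enumerates every subset of a small, made-up item list and keeps the best combination that fits. The item names, weights, values, and the capacity are illustrative assumptions, not the chapter’s dataset:

```python
from itertools import combinations

# Hypothetical items: (name, weight, value). The capacity is also illustrative.
ITEMS = [("axe", 32252, 68674), ("bronze coin", 225790, 471010),
         ("crown", 468164, 944620), ("diamond statue", 489494, 962094)]
CAPACITY = 1_000_000

def brute_force_knapsack(items, capacity):
    """Enumerate all 2^n subsets and keep the best one that fits."""
    best_value, best_subset = 0, ()
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            weight = sum(w for _, w, _ in subset)
            value = sum(v for _, _, v in subset)
            if weight <= capacity and value > best_value:
                best_value, best_subset = value, subset
    return best_value, best_subset
```

With 4 items this loop checks only 16 subsets, but the count doubles with every item added, which is the growth that makes brute force impractical.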

Genetic Algorithm

The genetic algorithm is a specific algorithm in the family of evolutionary algorithms. Each algorithm in the family works on the same premise of evolution but makes small tweaks to different parts of the life cycle to suit different problems. Genetic algorithms are used to explore large search spaces for good solutions. It is important to note that a genetic algorithm is not guaranteed to find the absolute best solution; it attempts to avoid getting stuck in local best solutions, but may still settle on a good solution that is not the global best. A global best is the best possible solution, and a local best is a solution that is less optimal. If the goal is to maximize a solution, the larger the value, the better. Optimization algorithms like genetic algorithms incrementally find local best solutions in search of the global best solution.

The general life cycle of a genetic algorithm is:

  • Creating a population — Creating a random population of potential solutions.
  • Measuring fitness of individuals in the population — Determining how good a specific solution is. This task is accomplished with a fitness function that scores each solution.
  • Selecting parents based on their fitness — Selecting pairs of parents that will reproduce offspring.
  • Reproducing individuals from parents — Creating offspring from their parents by mixing genetic information and applying slight mutations to the offspring.
  • Populating the next generation — Selecting individuals and offspring from the population that will survive to the next generation.
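The life cycle above can be sketched in a few lines of Python. Everything here is a minimal illustration built on assumptions: the item weights and values, the capacity, tournament selection, single-point crossover, and the 5% mutation rate are choices made for the example, not the chapter’s exact configuration:

```python
import random

# Illustrative (weight, value) pairs and capacity; assumed data for the sketch.
ITEMS = [(32252, 68674), (225790, 471010), (468164, 944620),
         (489494, 962094), (35384, 78344), (265590, 579152)]
CAPACITY = 1_000_000

def fitness(chromosome):
    """Total value of the selected items, or 0 when the weight limit is broken."""
    weight = sum(w for bit, (w, _) in zip(chromosome, ITEMS) if bit)
    value = sum(v for bit, (_, v) in zip(chromosome, ITEMS) if bit)
    return value if weight <= CAPACITY else 0

def select_parent(population):
    """Tournament selection: the fitter of two random individuals wins."""
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(a, b):
    """Single-point crossover: splice a prefix of one parent onto the other."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(chromosome, rate=0.05):
    """Flip each gene with a small probability."""
    return [1 - bit if random.random() < rate else bit for bit in chromosome]

def evolve(generations=50, pop_size=20):
    """Run the life cycle and return the best chromosome seen in any generation."""
    population = [[random.randint(0, 1) for _ in ITEMS] for _ in range(pop_size)]
    best = max(population, key=fitness)
    for _ in range(generations):
        population = [mutate(crossover(select_parent(population),
                                       select_parent(population)))
                      for _ in range(pop_size)]
        best = max(population + [best], key=fitness)
    return best
```

Each chromosome is a list of bits, one per item, marking whether that item is packed. Tracking the best-ever chromosome separately mirrors the “best value so far” tracker in the demo.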

Solve with a Genetic Algorithm

Let’s play with a simple example to understand how a genetic algorithm works. To fill the knapsack with the maximum value, we can use a genetic algorithm. Start by trying the manual knapsack yourself: pick items that maximize value while keeping the total weight below the limit. Then start the algorithm and compare your attempt with the evolving population. As generations pass, watch the best-ever tracker, the chart, and the chromosomes in the population to see how selection, crossover, and mutation gradually improve the candidate solutions.

Try changing the population size, crossover rate, and mutation rate to see how the search behaves. High crossover mixes solutions more aggressively, while mutation injects fresh variation. Tapping a chromosome lets you inspect exactly which items make up that candidate, which is often the easiest way to see evolution working rather than just reading a score.

This toy keeps the problem small enough to understand visually, but the real strength of genetic algorithms appears when the search space is too large, too messy, or too constrained for a clean analytical solution.

[Interactive knapsack demo: capacity 3,000,000. The panel tracks the current generation, the best value and its weight, the best value found so far, the average fitness, and the count of valid candidates.]

Evolutionary Algorithms Frequently Asked Questions (FAQ)

What is a genetic algorithm?

A genetic algorithm is an optimization method inspired by evolution. It improves a population of candidate solutions over time using selection, crossover, and mutation.

What is the knapsack problem in this chapter?

The knapsack problem asks you to choose a set of items with the highest value without exceeding a weight or capacity limit. It is a classic optimization problem because the number of possible combinations grows very quickly.

Why is the knapsack problem hard?

It is difficult because every item choice affects the remaining capacity and the value of the final solution. As the number of items grows, brute-forcing all combinations quickly becomes impractical.

What does a population mean in a genetic algorithm?

A population is a collection of candidate solutions being evaluated at the same time. Instead of improving one answer directly, the algorithm evolves many answers together.

What is a fitness function?

A fitness function scores how good a candidate solution is for the problem you care about. In the knapsack demo, it rewards high value while punishing overweight solutions.
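As a minimal sketch of such a function, assuming a binary chromosome where each bit marks whether an item is packed, and using a score of zero as the overweight penalty (a graded penalty is another common choice):

```python
def knapsack_fitness(weights, values, chromosome, capacity):
    """Reward total value; score 0 when the candidate is overweight."""
    total_weight = sum(w for w, bit in zip(weights, chromosome) if bit)
    total_value = sum(v for v, bit in zip(values, chromosome) if bit)
    return total_value if total_weight <= capacity else 0
```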

What is crossover in a genetic algorithm?

Crossover combines parts of two parent solutions to create offspring. The idea is that useful traits from strong candidates can be recombined into even better candidates.
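A minimal sketch of single-point crossover, one common variant; the function name and the optional fixed cut point are just for illustration:

```python
import random

def single_point_crossover(parent_a, parent_b, point=None):
    """Cut both parents at one index and swap the tails."""
    if point is None:
        point = random.randrange(1, len(parent_a))
    return (parent_a[:point] + parent_b[point:],
            parent_b[:point] + parent_a[point:])
```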

What is mutation in a genetic algorithm?

Mutation introduces small random changes into solutions. This helps preserve diversity and gives the algorithm a way to explore options that crossover alone might never create.
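A minimal bit-flip mutation sketch for binary chromosomes; the 5% default rate is an arbitrary illustrative choice:

```python
import random

def bit_flip_mutation(chromosome, rate=0.05):
    """Flip each gene independently with probability `rate`."""
    return [1 - bit if random.random() < rate else bit for bit in chromosome]
```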

Why do genetic algorithms use randomness?

Randomness helps them search broadly instead of getting stuck too early. It supports both initial diversity and ongoing exploration during evolution.

When are genetic algorithms useful?

Genetic algorithms are useful when the search space is large, messy, or hard to solve exactly with brute force. They are often used when a good solution is valuable even if the perfect answer is expensive to compute.

Do genetic algorithms guarantee the optimal solution?

No, not in general. They are heuristic optimization methods, which means they often find strong answers efficiently but do not guarantee the absolute best solution every time.

What does the chapter simulation help you understand?

It turns the abstract loop of population, selection, crossover, and mutation into something visible. You can watch how better candidates emerge over generations and compare them to your own manual choices.