Warning: This page is located in the archive.

The goal of this lab exercise is to implement an evolutionary algorithm (EA) and apply it to the optimization problems defined in the previous lab exercises. The specification of the individual EA parts is rather conceptual: each can be implemented in many different ways. Use the following description as a specification of the abilities the implementation should possess, rather than as a direct specification of the implementation.

**If you do not complete all parts during the lab time, finish the work at home.**

The main difference between an EA and local search is that the algorithm state is represented by a whole set of candidate solutions rather than by a single candidate solution. Think about the representation of the population and evaluate the pros and cons of the following options:

- list of individuals,
- 2D array (e.g., in MATLAB, numpy, etc.),
- specialized class representing a population.

Implement a function that creates an initialized population of candidate solutions. The output, of course, depends on the chosen representation. The function may (and should) take advantage of the single-solution initialization functions implemented in the previous lab exercises.

Input | Population size
---|---
Input | Function that creates a single initialized candidate solution
Output | Initialized population of candidate solutions

If you do not manage to create a universal initialization function, that's OK; you will just have to create several more specialized ones.
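As an illustration, here is a minimal Python sketch, assuming the population is represented as a plain list of individuals and that `create_individual` is any zero-argument initializer. The `random_bitstring` helper below is a hypothetical stand-in for an initializer from a previous exercise:

```python
import random

def init_population(pop_size, create_individual):
    """Create an initialized population as a list of individuals.

    `create_individual` is any zero-argument callable that returns one
    initialized candidate solution (e.g. from a previous lab exercise).
    """
    return [create_individual() for _ in range(pop_size)]

# Example: random 10-bit binary individuals (hypothetical initializer).
def random_bitstring():
    return [random.randint(0, 1) for _ in range(10)]

pop = init_population(20, random_bitstring)
```

Passing the initializer as an argument keeps the function universal: the same `init_population` works for binary, real, or any other representation.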

Implement a function for parent selection.

Input | Number of parents to be selected
---|---
Input | Fitness values of the population members
Input | Parameters of the particular selection type (if any)
Output | Indices of the population members that shall become parents (some of them may be present more than once, some of them may be missing)

Choose one selection type and implement it first. The others can wait. You can try:

- Tournament selection.
- Rank selection.
- Roulette-wheel selection (preferably using stochastic universal sampling).
- Random selection.
- …
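Tournament selection, for instance, could be sketched as follows (a Python sketch under the interface above; the function name and the `k`/`maximize` parameters are illustrative choices, not part of the specification):

```python
import random

def tournament_selection(num_parents, fitness, k=2, maximize=False):
    """Select `num_parents` indices by k-way tournaments.

    `fitness` is a list of fitness values of the population members.
    Each tournament draws k random competitors (with replacement) and
    the best of them wins; winners may repeat, losers may never appear.
    """
    best = max if maximize else min
    parents = []
    for _ in range(num_parents):
        competitors = [random.randrange(len(fitness)) for _ in range(k)]
        parents.append(best(competitors, key=lambda i: fitness[i]))
    return parents
```

Increasing `k` increases the selection pressure; with `k=1` the function degenerates into random selection.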

Implement a function for crossover. The crossover type will depend on the representation. Think about inputs and outputs (how many parents will enter the function, how many offspring will be generated).

Input | List of parents
---|---
Output | One or more offspring individuals

Again, choose one type of crossover and implement it first. The others can wait. You can try:

- Single-point crossover (should work for both binary and real representation).
- Two-point crossover (should work for both binary and real representation).
- Uniform crossover (should work for both binary and real representation).
- Arithmetic crossover (for real representation).
- “Blend” crossover (for real representation).
- …
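A single-point crossover sketch in Python, assuming two equal-length parents represented as lists (so it works for both binary and real vectors) and producing two offspring:

```python
import random

def single_point_crossover(parents):
    """One-point crossover for two parents of equal length.

    Returns two offspring as new lists; the split point is chosen so
    that each child really mixes genes from both parents.
    """
    p1, p2 = parents
    point = random.randrange(1, len(p1))  # never split at the very ends
    child1 = p1[:point] + p2[point:]
    child2 = p2[:point] + p1[point:]
    return [child1, child2]
```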

Do not bother with mutation. Perturbation functions we implemented in previous exercises can be directly used as mutation operators.

Implement a function for replacement strategy.

Input | Old evaluated population (or old candidate solutions and their fitness values)
---|---
Input | Evaluated population of new offspring (or new offspring and their fitness values)
Output | Evaluated population of surviving individuals (or surviving individuals and their fitness values)

Choose one type of replacement strategy and implement it first. The others can be implemented as needed. You can try:

- Generational replacement (old population is thrown away, all offspring survive).
- Steady-state replacement:
  - Random (join the old and new populations and choose survivors randomly).
  - Truncation (join the old and new populations and throw away the worst individuals).
  - Tournament.
  - Roulette.
  - Rank-based.

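Truncation replacement, for example, could look like this (a Python sketch assuming list-based populations and fitness minimization by default; the name and signature are illustrative):

```python
def truncation_replacement(old_pop, old_fitness, offspring, off_fitness,
                           maximize=False):
    """Join the old population and offspring, keep the best |old_pop|.

    Returns the surviving individuals and their fitness values; because
    the old population competes with the offspring, the best-so-far
    solution can never be lost (elitism).
    """
    joined = list(zip(old_pop + offspring, old_fitness + off_fitness))
    joined.sort(key=lambda pair: pair[1], reverse=maximize)
    survivors = joined[:len(old_pop)]
    pop, fit = zip(*survivors)
    return list(pop), list(fit)
```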
When choosing the parent selection and replacement strategy for your EA, at least one of these two functions should provide some selection pressure, i.e., prefer better individuals. Otherwise, your EA will behave like random search.

Implement a function/class that will represent the evolutionary algorithm.

Input | Objective function to be minimized
---|---
Input | Initialization function
Input | Selection function
Input | Crossover function
Input | Probability of crossover
Input | Mutation function
Input | Probability of mutation
Input | Replacement strategy
Input | Termination conditions
Output | The best solution found and its function value
Output | Statistics from the optimization run

Consider whether it is better to provide many input parameters, or to use a data structure representing the EA configuration that is passed to the solver as a single argument (almost).
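The whole solver can then be sketched as one function driven by a single configuration structure (here a plain dict). Everything below — the operator implementations, the dictionary keys, the OneMax-style objective — is illustrative, not part of the course specification; plug in your own functions from the previous steps:

```python
import random

# --- Minimal illustrative operators (assumed names, not from the course) ---
def init_bits(n=20):
    return [random.randint(0, 1) for _ in range(n)]

def onemax_min(x):
    return len(x) - sum(x)  # number of zeros; the optimum value is 0

def tournament(num, fitness, k=2):
    return [min((random.randrange(len(fitness)) for _ in range(k)),
                key=lambda i: fitness[i])
            for _ in range(num)]

def one_point(parents):
    p1, p2 = parents
    c = random.randrange(1, len(p1))
    return [p1[:c] + p2[c:], p2[:c] + p1[c:]]

def bitflip(x, p):
    return [1 - b if random.random() < p else b for b in x]

def truncation(pop, fit, off, off_fit):
    joined = sorted(zip(pop + off, fit + off_fit), key=lambda t: t[1])
    keep = joined[:len(pop)]
    return [p for p, _ in keep], [f for _, f in keep]

def evolutionary_algorithm(config):
    """Minimal EA loop driven by a single configuration structure."""
    pop = [config["init"]() for _ in range(config["pop_size"])]
    fitness = [config["objective"](x) for x in pop]
    stats = []  # best fitness after each generation
    for _ in range(config["max_generations"]):
        idx = config["select"](len(pop), fitness)  # parent indices
        offspring = []
        for i in range(0, len(idx) - 1, 2):
            a, b = pop[idx[i]][:], pop[idx[i + 1]][:]
            if random.random() < config["p_crossover"]:
                a, b = config["crossover"]([a, b])
            offspring.append(config["mutate"](a, config["p_mutation"]))
            offspring.append(config["mutate"](b, config["p_mutation"]))
        off_fitness = [config["objective"](x) for x in offspring]
        pop, fitness = config["replace"](pop, fitness, offspring, off_fitness)
        stats.append(min(fitness))
    best = min(range(len(pop)), key=lambda i: fitness[i])
    return pop[best], fitness[best], stats

config = {
    "pop_size": 20, "max_generations": 30,
    "init": init_bits, "objective": onemax_min,
    "select": tournament, "crossover": one_point, "p_crossover": 0.9,
    "mutate": bitflip, "p_mutation": 0.05,
    "replace": truncation,
}
best, best_f, stats = evolutionary_algorithm(config)
```

With this layout, swapping a selection or replacement strategy is a one-line change in the configuration, and the termination condition (here a fixed generation count) can be generalized in the same way.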

Try to apply the visualization tools from the previous exercises to the outputs of local search (LS) and EA, and compare both types of algorithms when applied to the same optimization problem.

Try to compare both types of algorithms on various binary and real problems.

- What results would you expect on unimodal problems? Do the observed results meet your expectations?
- What results would you expect on more complex, multimodal, or deceptive problems? Do the observed results meet your expectations?

courses/a0m33eoa/labs/week_04.txt · Last modified: 2021/10/08 16:52 by xposik