
Multi-objective EAs

The goal of this lab exercise is to implement a multi-objective evolutionary algorithm (MOEA), in particular the NSGA-II one.

As you will see, it amounts to modifying just a few parts of the standard evolutionary algorithm that you implemented in the previous lab exercise.

If you do not complete all parts during the lab time, finish the work at home.

Non-dominated fronts computation

The first modification concerns the way two candidate solutions are compared with respect to their quality. In single-objective optimization, the comparison of two solutions is based on the single fitness value assigned to each solution: for a minimization problem, the smaller the fitness value, the better the solution; for a maximization problem, the larger, the better.

It is not that straightforward in multi-objective optimization because several quality measures are assigned to each solution. Different MOEAs use different strategies for comparing two solutions. The NSGA-II algorithm compares solutions based on two performance indicators derived from the original objective values [please refer to the lecture slides for details]:

  • non-dominated front number,
  • uniqueness of the solution within its non-dominated front (i.e., the crowding distance).
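Both indicators build on the Pareto dominance relation. As a quick reference, here is a minimal dominance check, a sketch assuming each solution's evaluation is a plain sequence of objective values and all objectives are minimized:

```python
def dominates(a, b):
    """Return True if objective vector a dominates b (all objectives minimized):
    a is no worse than b in every objective and strictly better in at least one."""
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))
```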

Your first task is to implement a function/method that calculates the non-dominated front number and the crowding distance for all individuals in the population.

Input Evaluation of the whole population, i.e., a vector of objective values for each solution.
Output The non-dominated front number and crowding distance for each solution.
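One possible sketch of such a function, following the fast non-dominated sort and the crowding-distance assignment described in the lecture; it reuses the `dominates` helper above and assumes the population evaluation is passed as a list of objective vectors, all minimized (front numbers are 0-based here; names and data layout are illustrative):

```python
def fronts_and_crowding(objs):
    """objs: list of objective vectors (all minimized).
    Returns (front_no, crowding): two lists aligned with objs."""
    n = len(objs)
    dominated_by = [[] for _ in range(n)]   # indices of solutions that i dominates
    dom_count = [0] * n                     # number of solutions dominating i
    front_no = [0] * n
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objs[i], objs[j]):
                dominated_by[i].append(j)
            elif dominates(objs[j], objs[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            front_no[i] = 0
            fronts[0].append(i)
    f = 0
    while fronts[f]:
        next_front = []
        for i in fronts[f]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    front_no[j] = f + 1
                    next_front.append(j)
        f += 1
        fronts.append(next_front)
    fronts.pop()  # drop the trailing empty front

    crowding = [0.0] * n
    n_obj = len(objs[0])
    for front in fronts:
        for m in range(n_obj):
            order = sorted(front, key=lambda i: objs[i][m])
            fmin, fmax = objs[order[0]][m], objs[order[-1]][m]
            crowding[order[0]] = crowding[order[-1]] = float('inf')
            if fmax == fmin:
                continue
            for k in range(1, len(order) - 1):
                crowding[order[k]] += (objs[order[k + 1]][m] - objs[order[k - 1]][m]) / (fmax - fmin)
    return front_no, crowding
```

Note that the boundary solutions of each front receive an infinite crowding distance, so they are always preferred to interior solutions of the same front.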

The original test data were wrong. Now we are at attempt two!

Test data (thanks to Jan Pikman): small, large. All objectives are minimized.

If your results differ, email Petr Pošík.

Binary tournament operator

NSGA-II uses a binary tournament operator $t(x_i, x_j)$ that returns true, meaning that $x_i$ is better than $x_j$, if one of the following conditions holds:

  • $x_i$ is from a better front than $x_j$, i.e., $front(x_i) < front(x_j)$,
  • both $x_i$ and $x_j$ are from the same front, but $x_i$ lies in a less crowded region of the front, i.e., $front(x_i) == front(x_j)$ AND $crowding(x_i) > crowding(x_j)$.

Your second task is to implement this binary tournament operator.

Input Two solutions $x_i$ and $x_j$
Output True if $x_i$ is better than $x_j$, False otherwise.
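A direct transcription of these two conditions might look as follows, assuming each individual stores its front number and crowding distance in attributes named `front` and `crowding` (the attribute names are illustrative):

```python
def crowded_tournament(x_i, x_j):
    """Return True if x_i is better than x_j under the crowded-comparison rule."""
    if x_i.front < x_j.front:
        return True
    return x_i.front == x_j.front and x_i.crowding > x_j.crowding
```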

Replacement strategy

The last task is to implement the replacement strategy used in NSGA-II. It takes two populations

  • $P_t$ … the current population at time $t$,
  • $Q_t$ … the population of solutions newly created from the population $P_t$,

and selects from $P_t \cup Q_t$ the best solutions to form the new population $P_{t+1}$; please see the lecture slides for details.

Input Populations $P_t$ and $Q_t$
Output Population $P_{t+1}$
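A sketch of this environmental selection, assuming the helpers above and a function `objectives(ind)` that returns the (minimized) objective vector of an individual (all names are illustrative):

```python
def nsga2_replacement(P, Q, pop_size, objectives):
    """Select the new population P_{t+1} of size pop_size from the union of P and Q."""
    R = P + Q                                   # combined population P_t u Q_t
    objs = [objectives(ind) for ind in R]
    front_no, crowding = fronts_and_crowding(objs)
    # Primary key: front number (ascending); secondary key: crowding distance (descending).
    order = sorted(range(len(R)), key=lambda i: (front_no[i], -crowding[i]))
    return [R[i] for i in order[:pop_size]]
```

Sorting the whole union by (front number, negative crowding distance) and truncating selects the same set as adding whole fronts until the next one would overflow and then breaking the tie on that last front by crowding distance.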

Now, you have all ingredients needed to complete the NSGA-II algorithm!

Experimental evaluation

To test your NSGA-II implementation you can use the Multi-objective 0/1 Knapsack Problem. This problem suits our purposes well, as you can use the simple binary representation with the mutation and crossover operators that you have already implemented.

For the definition of the single-objective 0/1 knapsack problem, please consult Wikipedia.

The multi-objective version is well described in this article. In the MO variant with a D-dimensional knapsack, the weights and values of each item $i$ are given by D-dimensional vectors $w_i=(w_{i1},\dots, w_{iD})$ and $v_i=(v_{i1},\dots, v_{iD})$, where $w_{id}$ and $v_{id}$ refer to knapsack (dimension) $d$. Similarly, the knapsack has a D-dimensional capacity vector $W = (W_1, \dots, W_D)$. Assuming the solution is represented by a binary vector $x = (x_1, \ldots, x_m)$, the goal is

$$\text{maximize } f(x) = (f_1(x), \dots, f_D(x)), $$

where

$$f_d(x) = \sum_{i=1}^m v_{id}x_i,$$

such that the sum of weights in each dimension $d$ does not exceed $W_d$, i.e.

$$\forall d \in \{1,\ldots,D\}: \sum_{i=1}^m w_{id}x_i \leq W_d.$$
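To plug the problem into your algorithm you will also need an evaluation function. A minimal sketch, assuming `values[i][d]` and `weights[i][d]` hold $v_{id}$ and $w_{id}$ and `capacity[d]` holds $W_d$; how you handle infeasible solutions (repair, penalty, constrained dominance) is up to you, so the sketch only reports feasibility:

```python
def evaluate_knapsack(x, values, weights, capacity):
    """x: binary vector of length m. Returns (profit vector, feasibility flag)."""
    D = len(capacity)
    profit = [sum(values[i][d] for i, xi in enumerate(x) if xi) for d in range(D)]
    load = [sum(weights[i][d] for i, xi in enumerate(x) if xi) for d in range(D)]
    # The knapsack objectives are maximized; if your non-dominated sort assumes
    # minimization, negate the profits before passing them to it.
    return profit, all(load[d] <= capacity[d] for d in range(D))
```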

You can use the Test Problem Suite, which provides various data sets and the best known solutions for each of them.

In case of any issues downloading the test problems from the site above, you can download the problem definitions and Pareto fronts here.

Data sets in the Test Problem Suite use the following notation:

  • “n-th knapsack” stands for the n-th dimension, $n \in \{1,\dots,D\}$.
  • “Profit” stands for the value.