The goal of this lab exercise is to implement a multi-objective evolutionary algorithm (MOEA), in particular NSGA-II.
As you will see, it amounts to modifying just a few parts of the standard evolutionary algorithm that you implemented in the previous lab exercise.
If you do not complete all parts during the lab time, finish the work at home.
The first modification concerns the way two candidate solutions are compared w.r.t. their quality. In single-objective optimization, the comparison of two solutions is based on a single fitness value assigned to each solution: for a minimization problem, the smaller the fitness value, the better the solution; for a maximization problem, the relation is reversed.
It is not as straightforward in the case of multi-objective optimization, because multiple quality measures are assigned to each solution. Different MOEAs use different strategies for comparing two solutions. The NSGA-II algorithm compares solutions based on two performance indicators derived from the original objective values: the non-dominated front number and the crowding distance [please refer to the lecture slides for details].
Your first task is to implement a function/method that calculates the non-dominated front number and the crowding distance for all individuals in the population.
The original test data were wrong. Now we are at attempt two!
Test data (thanks to Jan Pikman): small, large. All objectives are minimized.
If your results differ, email Petr Pošík.
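One possible way to compute both indicators is sketched below. It follows the fast non-dominated sorting and crowding-distance procedures from the NSGA-II paper (Deb et al., 2002); the function name, the NumPy-based interface, and the assumption that the objective values are passed as a single matrix (all objectives minimized, as in the test data above) are illustrative choices, not requirements of the assignment.

```python
import numpy as np

def fronts_and_crowding(F):
    """Assign a non-dominated front number and a crowding distance to each
    individual. F is an (N, M) array of objective values, all minimized.
    Returns (front, crowd) as 1-D arrays of length N."""
    N, M = F.shape
    front = np.zeros(N, dtype=int)
    crowd = np.zeros(N)

    # --- fast non-dominated sorting ---
    dominates = lambda a, b: np.all(F[a] <= F[b]) and np.any(F[a] < F[b])
    S = [[] for _ in range(N)]      # S[p]: solutions dominated by p
    n = np.zeros(N, dtype=int)      # n[p]: number of solutions dominating p
    current = []
    for p in range(N):
        for q in range(N):
            if p == q:
                continue
            if dominates(p, q):
                S[p].append(q)
            elif dominates(q, p):
                n[p] += 1
        if n[p] == 0:
            front[p] = 1            # first (best) front
            current.append(p)
    f = 1
    while current:
        nxt = []
        for p in current:
            for q in S[p]:
                n[q] -= 1
                if n[q] == 0:
                    front[q] = f + 1
                    nxt.append(q)
        f += 1
        current = nxt

    # --- crowding distance, computed within each front ---
    for f in range(1, front.max() + 1):
        idx = np.where(front == f)[0]
        if len(idx) <= 2:
            crowd[idx] = np.inf
            continue
        for m in range(M):
            order = idx[np.argsort(F[idx, m])]
            fmin, fmax = F[order[0], m], F[order[-1], m]
            crowd[order[0]] = crowd[order[-1]] = np.inf   # boundary solutions
            if fmax > fmin:
                crowd[order[1:-1]] += (F[order[2:], m] - F[order[:-2], m]) / (fmax - fmin)
    return front, crowd
```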
NSGA-II uses a binary tournament operator $t(x_i, x_j)$ that returns true, meaning that $x_i$ is better than $x_j$, if one of the following conditions holds:
- $x_i$ has a lower (better) non-dominated front number than $x_j$, or
- $x_i$ and $x_j$ lie on the same front and $x_i$ has a larger crowding distance.
Your second task is to implement this binary tournament operator.
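A minimal sketch of such an operator, reusing the front numbers and crowding distances computed above (the function name and argument layout are only illustrative):

```python
def tournament_better(front_i, crowd_i, front_j, crowd_j):
    """Crowded-comparison operator t(x_i, x_j): True iff x_i wins the tournament."""
    if front_i < front_j:                         # x_i lies on a better (lower-numbered) front
        return True
    if front_i == front_j and crowd_i > crowd_j:  # same front, x_i is less crowded
        return True
    return False
```

In the parent-selection step you would then draw two random individuals and keep the one for which the operator returns true.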
The last task is to implement the replacement strategy used in NSGA-II. It takes two populations, the parent population $P_t$ and the offspring population $Q_t$, and selects from $P_t \cup Q_t$ a new set of the best solutions to form the next population $P_{t+1}$; please see the lecture slides for details.
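One way to realize this replacement is sketched below, assuming populations are plain Python lists and that an assumed helper `objectives_of` can recompute the objective matrix of the merged population. Sorting lexicographically by front number and by decreasing crowding distance and then truncating gives the same result as filling whole fronts and breaking the tie on the last, partially fitting front.

```python
def nsga2_replacement(P, Q, objectives_of):
    """Merge parents P and offspring Q and keep the best |P| individuals.
    `objectives_of(pop)` returns the (N, M) objective matrix of a population;
    `fronts_and_crowding` is the sketch given above."""
    R = P + Q                                   # R_t = P_t union Q_t
    front, crowd = fronts_and_crowding(objectives_of(R))
    # best front first; within a front, larger crowding distance first
    order = sorted(range(len(R)), key=lambda i: (front[i], -crowd[i]))
    return [R[i] for i in order[:len(P)]]
```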
Now you have all the ingredients needed to complete the NSGA-II algorithm!
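For orientation, the overall loop could be assembled from the pieces above roughly as follows; `init_population`, `evaluate`, `crossover`, and `mutate` stand for the operators from the previous lab and are only placeholders here (`evaluate(pop)` is assumed to return the (N, M) matrix of objective values to be minimized).

```python
import random

def nsga2(init_population, evaluate, crossover, mutate, n_generations):
    """Minimal NSGA-II loop built from the sketches above (illustrative only)."""
    def select(P, front, crowd):
        # binary tournament between two distinct random individuals
        i, j = random.sample(range(len(P)), 2)
        return P[i] if tournament_better(front[i], crowd[i],
                                         front[j], crowd[j]) else P[j]

    P = init_population()
    for _ in range(n_generations):
        front, crowd = fronts_and_crowding(evaluate(P))
        Q = []
        while len(Q) < len(P):
            c1, c2 = crossover(select(P, front, crowd), select(P, front, crowd))
            Q.extend([mutate(c1), mutate(c2)])
        P = nsga2_replacement(P, Q[:len(P)], evaluate)
    return P
```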
To test your NSGA-II implementation, you can use the Multi-objective 0/1 Knapsack Problem. This problem suits our purposes well because you can use the simple binary representation together with the mutation and crossover operators that you have already implemented.
For the definition of a Single-objective 0/1 Knapsack Problem please consult Wikipedia.
The multi-objective version is well described in this article. In the multi-objective variant with a D-dimensional knapsack, each item $i$ has a weight and a value with respect to each knapsack $d$, given by the D-dimensional vectors $w_i=(w_{i1},\dots, w_{iD})$ and $v_i=(v_{i1},\dots, v_{iD})$. Similarly, the knapsack has a D-dimensional capacity vector $W = (W_1, \dots, W_D)$. Assuming a solution is represented by a binary vector $x = (x_1, \ldots, x_m)$, the goal is to
$$\text{maximize } f(x) = (f_1(x), \dots, f_D(x)), $$
where
$$f_d(x) = \sum_{i=1}^m v_{id}x_i,$$
such that the sum of weights in each dimension $d$ does not exceed $W_d$, i.e.
$$\forall d \in \{1,\ldots,D\}: \sum_{i=1}^m w_{id}x_i \leq W_d.$$
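Written as code, evaluating one binary solution could look like the sketch below; the matrices `v` and `w`, the penalty-based handling of the capacity constraints, and all names are assumptions of this sketch (the assignment does not prescribe a constraint-handling technique, and a repair operator is an equally valid choice).

```python
import numpy as np

def knapsack_objectives(x, v, w, W, penalty=1e6):
    """Objective values of a binary solution x for the multi-objective 0/1 knapsack.
    v, w: (m, D) value and weight matrices; W: (D,) capacity vector.
    Returns the D objective values to be maximized, penalized if infeasible."""
    x = np.asarray(x, dtype=float)
    values = x @ v                        # f_d(x) = sum_i v_{id} x_i
    weights = x @ w                       # total weight per dimension d
    violation = np.maximum(weights - W, 0.0).sum()
    return values - penalty * violation   # feasible solutions are unaffected
```

Since the sorting sketch above assumes minimization, you would negate these values (or flip the comparisons) before passing them to it.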
You can use the Test Problem Suite, which provides various data sets together with the best known solutions for each of them.
In case of any issues downloading the test problems from the site above, you can download the problem definitions and Pareto fronts here.
Data sets in the Test Problem Suite use the following notation: