
Tuning the weights of neural networks

Neural networks, especially deep ones, are very popular these days. To train them, we usually use the gradient descent (GD) algorithm, where the gradient w.r.t. the weights is computed by the error backpropagation (BP) algorithm. Recently, there have been attempts to train the NN weights using Evolution Strategies (ES). Some of these works claim that training with ES can be as efficient as training with GD+BP (see e.g. https://openai.com/blog/evolution-strategies/). The goal of this topic is to try tuning NN weights using local search (LS) and evolutionary algorithms (EA) on an interesting task.
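To make the idea concrete, below is a minimal sketch of an OpenAI-style ES update applied to a flat vector of network weights. It is only an illustration under stated assumptions, not a prescribed implementation: the `fitness` function (e.g. the negative loss of a small network on a batch), the hyper-parameter values, and all names are assumptions of this sketch.

```python
import numpy as np

def es_tune(fitness, dim, sigma=0.1, alpha=0.03, pop_size=50, iters=200, seed=0):
    """Tune a flat weight vector with a simple (OpenAI-style) evolution strategy.

    fitness(w) -> float to be maximized; dim is the number of NN weights.
    Hyper-parameter values are illustrative, not tuned.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=dim)          # initial weights
    for _ in range(iters):
        eps = rng.normal(size=(pop_size, dim))   # weight perturbations
        f = np.array([fitness(w + sigma * e) for e in eps])
        f_std = f.std()
        if f_std < 1e-8:                         # skip update on a flat fitness landscape
            continue
        f_norm = (f - f.mean()) / f_std          # fitness shaping (standardization)
        # Monte-Carlo estimate of the gradient of E[fitness] w.r.t. w, then an ascent step
        w = w + alpha / (pop_size * sigma) * eps.T @ f_norm
    return w
```

In practice, `fitness` would unflatten `w` into the network's weight matrices, run a forward pass, and return the negative loss; a local search variant (e.g. a simple (1+1) hill climber perturbing `w`) can reuse the same interface.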

This topic does not have a precise specification. It is part of the task to come up with a specification in the style of the other topics and have it approved by a teacher!

We are aware that, in the article mentioned above, OpenAI used substantial computational power. Although backpropagation is not needed when tuning the parameters with LS or an EA, the sheer number of weights to be tuned may be a significant obstacle to their application. However, the task does not have to be solved in full generality; you can e.g.

  • choose a less demanding problem where a smaller network should be sufficient, or
  • use one of the standard pre-trained neural networks and fine-tune only the last layers for the task at hand (a possible realization of this option is sketched after this list).
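For the second option, one possible (not prescribed) way to expose only the last layer of a pre-trained network to an LS/EA is sketched below. It assumes PyTorch and torchvision are available; the choice of ResNet-18, the 10-class head, and the `weights="IMAGENET1K_V1"` argument (older torchvision versions use `pretrained=True` instead) are assumptions of this sketch.

```python
import numpy as np
import torch
import torchvision

# Load a pre-trained backbone and freeze it; only the new final layer will be tuned.
model = torchvision.models.resnet18(weights="IMAGENET1K_V1")
for p in model.parameters():
    p.requires_grad = False
model.fc = torch.nn.Linear(model.fc.in_features, 10)   # new head, e.g. for a 10-class task
model.eval()

def get_flat_weights():
    """Return the tunable (last-layer) parameters as one flat numpy vector."""
    return np.concatenate([p.detach().numpy().ravel() for p in model.fc.parameters()])

def set_flat_weights(w):
    """Write a flat vector back into the last-layer parameters."""
    i = 0
    with torch.no_grad():
        for p in model.fc.parameters():
            n = p.numel()
            p.copy_(torch.from_numpy(w[i:i + n]).reshape(p.shape).float())
            i += n

def fitness(w, inputs, targets):
    """Negative cross-entropy on one batch; higher is better for an LS/EA."""
    set_flat_weights(w)
    with torch.no_grad():
        loss = torch.nn.functional.cross_entropy(model(inputs), targets)
    return -loss.item()
```

With the last layer exposed as a flat vector, the search space shrinks to a few thousand weights, which is much more tractable for LS and EAs than tuning the whole network.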

Ideally, the results should contain a comparison of tuning by GD+BP, by local search, and (if possible) by an evolutionary algorithm.
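For such a comparison it is usually fairer to plot progress against the number of network evaluations (forward passes) rather than iterations, since one EA generation costs many evaluations while one GD+BP step costs a single forward and backward pass. A small illustrative wrapper (all names are hypothetical) might look like this:

```python
class EvalCounter:
    """Wrap a fitness function and record (evaluation count, best-so-far fitness) pairs,
    so that GD+BP, local search and an EA can be compared on the same x-axis."""

    def __init__(self, fitness):
        self.fitness = fitness
        self.evals = 0
        self.best = float("-inf")
        self.history = []

    def __call__(self, w):
        f = self.fitness(w)
        self.evals += 1
        self.best = max(self.best, f)
        self.history.append((self.evals, self.best))
        return f
```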
