Warning: This page is archived.

Assignment: Neural Network

📅 Deadline: 11.12.2024 21:59

🏦 Points: 8

Note: If you downloaded the template before November 7th at 11:50, please download the updated version. The previous version erroneously evaluated the Softmax layer tests instead of the Cross-entropy tests. If you find any mistakes, please contact Jakub Paplhám at paplhjak@fel.cvut.cz.

Task Description

In this assignment, you are tasked with implementing the forward, backward, and parameter messages required to train a simple neural network. You can find the complete description of the assignment in the Assignment PDF.

You are provided with a template containing the following files that you do not need to modify:

  • datasets.py: Loads dummy datasets. Not used in this homework.
  • main.py: Runs your code, compares it against the reference, and saves the results.
  • requirements.txt: PIP requirements for the homework; most existing Python environments should already be sufficient.
  • run.py: Runs train.py to train the neural network on dummy datasets. Not used in this homework.
  • train.py: Trains the neural network. Not used in this homework.
  • test-cases: A folder containing public test cases to help you verify your implementation before submitting to BRUTE.
  • utils.py: Contains helper functions for loading and saving data.

The template also includes the following files which you do need to modify:

  • linear_layer.py: Implements the linear layer.
  • losses.py: Implements the cross entropy loss layer.
  • mlp.py: Implements a multi-layer perceptron.
  • relu_layer.py: Implements the rectified linear unit layer.
  • softmax_layer.py: Implements the softmax layer.

Your objective is to implement the forward, backward, and parameter messages in the previously mentioned files. You can find the relevant places by searching for “TODO” in the Python files.
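
To make the three kinds of messages concrete, below is a minimal NumPy sketch of a linear layer y = XW + b. The class and method names (LinearLayerSketch, forward, delta, grad) are illustrative assumptions only; the template's own files and their TODOs define the interface you actually have to fill in.

import numpy as np

class LinearLayerSketch:
    """Illustrative sketch only, not the template's actual interface."""

    def __init__(self, n_in, n_out, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W = 0.01 * rng.standard_normal((n_in, n_out))  # layer parameters
        self.b = np.zeros(n_out)

    def forward(self, X):
        # Forward message: maps (batch, n_in) inputs to (batch, n_out) outputs.
        return X @ self.W + self.b

    def delta(self, X, dY):
        # Backward message: gradient of the loss w.r.t. the layer input X.
        return dY @ self.W.T

    def grad(self, X, dY):
        # Parameter messages: gradients of the loss w.r.t. W and b.
        grad_W = X.T @ dY
        grad_b = dY.sum(axis=0)
        return grad_W, grad_b

The ReLU, softmax, and cross-entropy layers follow the same forward/backward pattern but carry no parameters, so they need no parameter messages.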

All Python files must be stored in the root of the .zip sent for submission.

How to Test

After completing your implementation, test your solution locally as follows before submitting it to BRUTE:


Public Test Cases

You can validate your code by running:

python main.py test-cases/public/instances/instance_1.json
python main.py test-cases/public/instances/instance_2.json
python main.py test-cases/public/instances/instance_3.json

For all of these instances, the expected output is:

Linear:
  forward: Test OK
  delta: Test OK
  grad_W: Test OK
  grad_b: Test OK
ReLU:
  forward: Test OK
  delta: Test OK
Softmax:
  forward: Test OK
  delta: Test OK
Cross Entropy:
  forward: Test OK
  delta: Test OK
Cross Entropy For Softmax Logits:
  forward: Test OK
  delta: Test OK
MLP:
  Linear_3 gradient: Test OK
  Linear_2 gradient: Test OK
  Linear_1 gradient: Test OK
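
If one of the gradient tests fails, a central-difference check is a quick way to localize the error. The helper below is a self-contained sketch (numerical_grad and the toy loss are illustrative, not part of the template); comparing its output with your analytic delta, grad_W, and grad_b on small random inputs usually pinpoints the faulty layer.

import numpy as np

def numerical_grad(f, X, eps=1e-6):
    # Central-difference estimate of d f(X) / d X for a scalar-valued f.
    grad = np.zeros_like(X, dtype=float)
    for idx in np.ndindex(X.shape):
        orig = X[idx]
        X[idx] = orig + eps
        f_plus = f(X)
        X[idx] = orig - eps
        f_minus = f(X)
        X[idx] = orig  # restore the original entry
        grad[idx] = (f_plus - f_minus) / (2 * eps)
    return grad

# Sanity check on a toy loss with a known gradient: d/dX mean(X**2) = 2*X / X.size.
X = np.random.default_rng(0).standard_normal((4, 3))
assert np.allclose(numerical_grad(lambda A: np.mean(A ** 2), X), 2 * X / X.size, atol=1e-5)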

Submission Guidelines

  • Submit the completed code as a .zip via BRUTE.
  • All Python files must be stored in the root of the .zip sent for submission (see the packaging sketch after this list).
  • Make sure your implementation passes the test cases provided above. Good luck! 😊
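
As a convenience, one way to package the submission so that every Python file ends up at the root of the archive is a short script along these lines; the archive name submission.zip is only an example, not a requirement of the assignment:

import glob
import zipfile

# Write every *.py file from the current directory into the archive root
# (arcname carries no directory prefix), matching the submission rules.
with zipfile.ZipFile("submission.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for path in sorted(glob.glob("*.py")):
        zf.write(path, arcname=path)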