Summary

This course introduces statistical decision theory and surveys canonical and advanced classifiers such as perceptrons, AdaBoost, support vector machines, and neural nets.

Basic info

Winter semester 2018/2019

Where and when: lecture hall KN:G-205, Building G, Karlovo namesti; Mondays 12:45-14:15

Teaching: Jiří Matas (JM) matas@cmp.felk.cvut.cz, Ondřej Drbohlav (OD) drbohlav@cmp.felk.cvut.cz, Vojtěch Franc (VF) xfrancv@cmp.felk.cvut.cz, Boris Flach (BF) flachbor@cmp.felk.cvut.cz.

Lecture plan 2018/2019

Week | Date | Lect. | Slides | Topic | Wiki | Additional material
1 | 1.10. | JM | pdf | Introduction. Basic notions. The Bayesian recognition problem | Machine_learning, Naive_Bayes_classifier | some simple problems
2 | 8.10. | JM | pdf | Non-Bayesian tasks | Minimax |
3 | 15.10. | JM | pdf | Parameter estimation of probabilistic models. Maximum likelihood method | Maximum_likelihood |
4 | 22.10. | OD | pdf | Nearest neighbour method. Non-parametric density estimation | K-nearest_neighbor_algorithm |
5 | 29.10. | JM | pdf | Logistic regression | Logistic_regression |
6 | 5.11. | JM | pdf | Classifier training. Linear classifier. Perceptron | Linear_classifier, Perceptron |
7 | 12.11. | JM | pdf | SVM classifier | Support_vector_machine |
8 | 19.11. | OD | pdf | AdaBoost learning | Adaboost |
9 | 26.11. | JM | pdf, pdf | Neural networks. Backpropagation | Artificial_neural_network |
10 | 3.12. | JM | pdf | Cluster analysis, k-means method | K-means_clustering, K-means++ |
11 | 10.12. | JM | pdf | Unsupervised learning. EM (Expectation Maximization) algorithm | Expectation_maximization_algorithm | Hoffmann, Bishop, Flach
12 | 17.12. | JM | pdf | Feature selection and extraction. PCA, LDA | Principal_component_analysis, Linear_discriminant_analysis | Veksler, Franc, ver1
13 | 31.12. | | | (holiday, no lecture) | |
14 | 7.1. | JM | pdf | Decision trees | Decision_tree, Decision_tree_learning | Rudin@MIT

Exam