This course introduces statistical decision theory and surveys canonical and advanced classifiers such as perceptrons, AdaBoost, support vector machines, and neural nets.
Winter semester 2021/2022
Where and when: room KN:E-126, Building G, Karlovo namesti; Mondays 12:45-14:15
Teaching: Jiří Matas (JM) matas@cmp.felk.cvut.cz, Ondřej Drbohlav (OD) drbohlav@cmp.felk.cvut.cz
Week | Date | Lect. | Slides | Topic | Wiki | Additional material
---|---|---|---|---|---|---
1 | 20.9. | JM | | Introduction. Basic notions. The Bayesian recognition problem | Machine_learning Naive_Bayes_classifier | some simple problems
2 | 27.9. | JM | | Non-Bayesian tasks | Minimax |
3 | 4.10. | JM | | Parameter estimation of probabilistic models. Maximum likelihood method | Maximum_likelihood |
4 | 11.10. | JM | | Nearest neighbour method. Non-parametric density estimation. | K-nearest_neighbor_algorithm |
5 | 18.10. | JM | | Logistic regression | Logistic_regression |
6 | 25.10. | JM | | Classifier training. Linear classifier. Perceptron. | Linear_classifier Perceptron |
7 | 1.11. | JM | | SVM classifier | Support_vector_machine |
8 | 8.11. | JM | | AdaBoost learning | Adaboost |
9 | 15.11. | JM | | Neural networks. Backpropagation | Artificial_neural_network |
10 | 22.11. | JM | | Cluster analysis, k-means method | K-means_clustering K-means++ |
11 | 29.11. | JM | | EM (Expectation Maximization) algorithm | Expectation_maximization_algorithm | Hoffmann, Bishop, Flach
12 | 6.12. | JM | | Feature selection and extraction. PCA, LDA. | Principal_component_analysis Linear_discriminant_analysis | Optimalizace (CZ): PCA slides, script 7.2
13 | 13.12. | JM | | Decision trees | Decision_tree Decision_tree_learning | Rudin@MIT
14 | 3.1. | JM | | Basic notions recapitulation, links between methods, answers to exam questions | |
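To give a flavour of the material, here is a minimal sketch of the perceptron learning algorithm covered in week 6. It is not course-provided code, only an illustrative toy example on a small, linearly separable data set chosen for this sketch.

```python
def train_perceptron(samples, labels, epochs=100):
    """Learn weights w and bias b such that sign(w.x + b) matches the label.

    labels are in {-1, +1}; on separable data the perceptron convergence
    theorem guarantees the loop terminates with zero training errors.
    """
    dim = len(samples[0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, y in zip(samples, labels):
            # A sample is misclassified when y * (w.x + b) <= 0.
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                # Update rule: shift the hyperplane toward the sample.
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
                errors += 1
        if errors == 0:  # converged: every sample correctly classified
            break
    return w, b

# Toy data (hypothetical, for illustration): class +1 lies above x0 + x1 = 3.
X = [(0.0, 1.0), (1.0, 0.5), (3.0, 3.0), (2.5, 2.0)]
y = [-1, -1, 1, 1]
w, b = train_perceptron(X, y)
# All training samples end up on the correct side of the hyperplane.
assert all(yi * (w[0] * xi[0] + w[1] * xi[1] + b) > 0 for xi, yi in zip(X, y))
```

The update rule `w += y * x, b += y` is the classical fixed-increment perceptron step; the SVM lecture (week 7) revisits the same linear decision function with a maximum-margin criterion instead.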
Conditions for assessment are in the lab section.