RPZ Schedule

Summary

This course introduces statistical decision theory and surveys canonical and advanced classifiers such as perceptrons, AdaBoost, support vector machines, and neural nets.
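As a flavour of the canonical classifiers surveyed in the course, here is a minimal sketch of the perceptron learning rule. This is an illustrative example only, not course material; the toy data and function names are invented for the demonstration.

```python
import numpy as np

def perceptron_train(X, y, epochs=100):
    """Train a linear classifier with the perceptron rule.
    X: (n, d) feature matrix, y: labels in {-1, +1}.
    Returns the weight vector w (last entry is the bias)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append constant bias feature
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:  # misclassified (or on the boundary)
                w += yi * xi        # perceptron update: move w toward the sample
                errors += 1
        if errors == 0:             # converged: training data separated
            break
    return w

# Tiny linearly separable example: the class is the sign of the first coordinate.
X = np.array([[2.0, 1.0], [1.5, -1.0], [-2.0, 0.5], [-1.0, -1.5]])
y = np.array([1, 1, -1, -1])
w = perceptron_train(X, y)
preds = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)  # -> [1, 1, -1, -1]
```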

Basic info

Winter semester 2020/2021

Due to the COVID-19 situation, the lectures will be given online via Zoom. All students enrolled in KOS will be sent a link at 12:30.

Where and when: KN:G-205, Building G, Karlovo náměstí; Mondays 12:45-14:15

Teaching: Jiří Matas (JM) matas@cmp.felk.cvut.cz, Ondřej Drbohlav (OD) drbohlav@cmp.felk.cvut.cz

Lecture plan 2020/2021

| Week | Date   | Lect. | Slides         | Topic                                                                    | Wiki                                                        | Additional material      |
|------|--------|-------|----------------|--------------------------------------------------------------------------|-------------------------------------------------------------|--------------------------|
| 1    | 21.9.  | JM    | pdf, recording | Introduction. Basic notions. The Bayesian recognition problem            | Machine_learning, Naive_Bayes_classifier                    | some simple problems     |
| 2    | 28.9.  |       |                | (holiday, no lecture)                                                    |                                                             |                          |
| 3    | 5.10.  | JM    | pdf, recording | Non-Bayesian tasks                                                       | Minimax                                                     |                          |
| 4    | 12.10. | JM    | pdf, recording | Parameter estimation of probabilistic models. Maximum likelihood method  | Maximum_likelihood                                          |                          |
| 5    | 19.10. | JM    | pdf, recording | Nearest neighbour method. Non-parametric density estimation              | K-nearest_neighbor_algorithm                                |                          |
| 6    | 26.10. | JM    | pdf, recording | Logistic regression                                                      | Logistic_regression                                         |                          |
| 7    | 2.11.  | JM    | pdf, recording | Classifier training. Linear classifier. Perceptron                       | Linear_classifier, Perceptron                               |                          |
| 8    | 9.11.  | JM    | pdf, recording | SVM classifier                                                           | Support_vector_machine                                      |                          |
| 9    | 16.11. | JM    | pdf, recording | AdaBoost learning                                                        | Adaboost                                                    |                          |
| 10   | 23.11. | JM    | pdf            | Neural networks. Backpropagation                                         | Artificial_neural_network                                   |                          |
| 11   | 30.11. | JM    | pdf            | Cluster analysis, k-means method                                         | K-means_clustering                                          | K-means++                |
| 12   | 7.12.  | JM    | pdf            | Unsupervised learning. EM (Expectation Maximization) algorithm           | Expectation_maximization_algorithm                          | Hoffmann, Bishop, Flach  |
| 13   | 14.12. | JM    | pdf            | Feature selection and extraction. PCA, LDA                               | Principal_component_analysis, Linear_discriminant_analysis  | Veksler, Franc, ver1     |
| 14   | 4.1.   | JM    | pdf            | Decision trees                                                           | Decision_tree, Decision_tree_learning                       | Rudin@MIT                |

Recommended literature
  • Duda, R.O., Hart, P.E., Stork, D.G.: Pattern Classification, 2nd edition, John Wiley and Sons, New York, 2001
  • Schlesinger, M.I., Hlaváč, V.: Ten Lectures on Statistical and Structural Pattern Recognition, Springer, 2002
  • Bishop, C.: Pattern Recognition and Machine Learning, Springer, 2011
  • Goodfellow, I., Bengio, Y., Courville, A.: Deep Learning, MIT Press, 2016. www

Exam

  • Only students who have received all credits from the lab work and have been awarded the assessment (“zápočet”) can be examined. The labs contribute 50% to your final evaluation. Any extra credits beyond 50% will be considered at the final evaluation and may improve your mark.
  • The exam consists of two parts: a written test and an oral exam.
  • The written test lasts 60-90 minutes and contributes 40% to the final evaluation.
  • The questions used in the test are available here; if you can solve them, you will likely do well on the exam.
  • The oral part starts approximately two hours after the end of the written test (the interim time is used to correct the tests). It contributes 10% to the final evaluation.
  • To receive grade “A” for the course, an “A” or “B” result in the final written test is required.
  • Oral exam questions are available here.
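
To make the stated weights concrete, here is a hypothetical illustration of how labs (50%), written test (40%), and oral exam (10%) combine into an overall percentage. The actual grading formula, extra-credit handling, and grade boundaries are set by the course, not by this sketch.

```python
def final_score(lab_pct, written_pct, oral_pct):
    """Combine component results (each in 0-100) using the stated weights:
    labs 50%, written test 40%, oral exam 10%."""
    return 0.5 * lab_pct + 0.4 * written_pct + 0.1 * oral_pct

# E.g. 80% in labs, 90% on the written test, a full-mark oral:
score = final_score(80, 90, 100)  # -> 86.0
```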
courses/be5b33rpz/lectures/start.txt · Last modified: 2020/11/24 19:39 by matas