[[https://fel.cvut.cz/cz/education/rozvrhy-ng.B211/public/html/predmety/43/58/p4358506.html|RPZ Schedule]] [[https://cw.felk.cvut.cz/forum/forum-1728.html|Discussion forum]]

===== Summary =====

This course introduces statistical decision theory and surveys canonical and advanced classifiers such as perceptrons, AdaBoost, support vector machines, and neural networks.

===== Basic info =====

** Winter semester 2021/2022 **

** Where and when: ** KN:E-126 at [[http://cyber.felk.cvut.cz/contact/#maps|Building G, Karlovo namesti]], Monday 12:45-14:15

** Teaching: ** [[http://cmp.felk.cvut.cz/~matas|Jiří Matas]] (JM), [[http://cmp.felk.cvut.cz/~drbohlav|Ondřej Drbohlav]] (OD)

===== Lecture plan 2021/2022 =====

^ Week ^ Date ^ Lect. ^ Slides ^ Topic ^ Wiki ^ Additional material ^
| 1 | 20.9. | JM | {{pr_01_intro_and_bayes_2021.pdf|pdf}} | Introduction. Basic notions. The Bayesian recognition problem | [[https://en.wikipedia.org/wiki/Machine_learning|Machine_learning]] [[https://en.wikipedia.org/wiki/Naive_Bayes_classifier|Naive_Bayes_classifier]] | {{:courses:b4b33rpz:pr_01_extra.pdf|some simple problems}} |
| 2 | 27.9. | JM | {{.pr_02_non_bayes_2021.pdf|pdf}} | Non-Bayesian tasks | [[https://en.wikipedia.org/wiki/Minimax|Minimax]] | |
| 3 | 4.10. | JM | {{:courses:b4b33rpz:pr_03_parameter_estimation_2020_10.pdf|pdf}} | Parameter estimation of probabilistic models. Maximum likelihood method | [[http://en.wikipedia.org/wiki/Maximum_likelihood|Maximum_likelihood]] | |
| 4 | 11.10. | JM | {{.pr_04_non_parametric_knn_2020.pdf|pdf}} | Nearest neighbour method. Non-parametric density estimation. | [[http://en.wikipedia.org/wiki/K-nearest_neighbor_algorithm|K-nearest_neighbor_algorithm]] | |
| 5 | 18.10. | JM | {{.pr_05_logistic_regression_2021.pdf|pdf}} | Logistic regression | [[https://en.wikipedia.org/wiki/Logistic_regression|Logistic_regression]] | |
| 6 | 25.10. | JM | {{.pr_06_perceptron_2020.pdf|pdf}} | Classifier training. Linear classifier. Perceptron. | [[https://en.wikipedia.org/wiki/Linear_classifier|Linear_classifier]] [[https://en.wikipedia.org/wiki/Perceptron|Perceptron]] | |
| 7 | 1.11. | JM | {{:courses:b4b33rpz:pr_07_svm_2018.pdf|pdf}} | SVM classifier | [[https://en.wikipedia.org/wiki/Support_vector_machine|Support_vector_machine]] | |
| 8 | 8.11. | JM | {{.pr_08_adaboost_2017.pdf|pdf}} | AdaBoost learning | [[https://en.wikipedia.org/wiki/AdaBoost|AdaBoost]] | |
| 9 | 15.11. | JM | {{:courses:b4b33rpz:neural_networks_2020.pdf|pdf}} | Neural networks. Backpropagation | [[https://en.wikipedia.org/wiki/Artificial_neural_network|Artificial_neural_network]] | |
| 10 | 22.11. | JM | {{:courses:b4b33rpz:pr_10_k_means_2015_12_04.pdf|pdf}} | Cluster analysis, k-means method | [[https://en.wikipedia.org/wiki/K-means_clustering|K-means_clustering]] [[https://en.wikipedia.org/wiki/K-means%2B%2B|K-means++]] | |
| 11 | 29.11. | JM | {{:courses:be5b33rpz:lectures:em_2020.pdf|pdf}} | EM (Expectation Maximization) algorithm | [[https://en.wikipedia.org/wiki/Expectation%E2%80%93maximization_algorithm|Expectation_maximization_algorithm]] | {{:courses:b4b33rpz:2010.12.10-em-hoffmann.pdf|Hoffmann}}, {{:courses:b4b33rpz:2010.12.10-em-bishop.pdf|Bishop}}, {{.flach-2013.12.02-em_algorithm.pdf|Flach}} |
| 12 | 6.12. | JM | {{.pr_12_pca_2017_01_02.pdf|pdf}} | Feature selection and extraction. PCA, LDA. | [[https://en.wikipedia.org/wiki/Principal_component_analysis|Principal_component_analysis]] [[https://en.wikipedia.org/wiki/Linear_discriminant_analysis|Linear_discriminant_analysis]] | Optimization (CZ): [[https://cw.fel.cvut.cz/wiki/_media/courses/b0b33opt/05aplikace.pdf|PCA slides]], [[https://cw.fel.cvut.cz/wiki/_media/courses/b0b33opt/opt.pdf|script 7.2]] |
| 13 | 13.12. | JM | {{.pr_13_dec_trees_2017_01_09.pdf|pdf}} | Decision trees | [[https://en.wikipedia.org/wiki/Decision_tree|Decision_tree]] [[https://en.wikipedia.org/wiki/Decision_tree_learning|Decision_tree_learning]] | [[http://ocw.mit.edu/courses/sloan-school-of-management/15-097-prediction-machine-learning-and-statistics-spring-2012/lecture-notes/MIT15_097S12_lec08.pdf|Rudin@MIT]] |
| 14 | 3.1. | JM | | Recapitulation of basic notions, links between methods, answers to exam questions | | |

===== Recommended literature =====

  * Duda, R.O., Hart, P.E., Stork, D.G.: Pattern Classification, John Wiley and Sons, 2nd edition, New York, 2001
  * Schlesinger, M.I., Hlaváč, V.: Ten Lectures on Statistical and Structural Pattern Recognition, Springer, 2002
  * Bishop, C.: Pattern Recognition and Machine Learning, Springer, 2011
  * Goodfellow, I., Bengio, Y. and Courville, A.: Deep Learning, MIT Press, 2016. [[http://www.deeplearningbook.org/|www]]

===== Assessment (zápočet) =====

Conditions for the assessment are given in the [[../labs/|lab section]].

===== Exam =====

  * Only students who have received all credits from the lab work and have been granted the assessment ("zápočet") may take the exam.
  * The labs contribute 50% to the final evaluation, the written part of the exam contributes 40%, and the oral part contributes 10%.
  * A threshold for passing the written exam is set, usually between 5 and 10 points (out of 40), depending on the difficulty of the test.
  * Your grade chances after the written test and before the oral exam are illustrated in the image below. Beware: the scheme is **approximate only**, but it illustrates how the oral exam influences the final grade.
  * The questions used in the test are available [[http://cmp.felk.cvut.cz/cmp/courses/recognition/Exam-questions|here]]; a student who can solve these questions will likely do well on the exam.
  * The oral part starts approximately 2 hours after the end of the written test (the interim time is used to grade the tests) or, if the number of students is large, on the following day.
  * The oral part is compulsory!
  * Example oral exam questions are available [[http://cmp.felk.cvut.cz/cmp/courses/recognition/Exam-questions/exam-questions-eng.pdf|here]].

{{ :courses:be5b33rpz:lectures:rpz_oral_chances.png?nolink&400 |}}
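The weighting described above (labs 50%, written exam 40%, oral 10%) can be sketched as a small computation. This is only an illustration of the stated weights: the 0-100 scale, the function name, and the example numbers are assumptions, not the course's official grading script.

```python
def final_score(labs: float, written: float, oral: float) -> float:
    """Combine the three graded parts using the course weights
    (labs 50%, written exam 40%, oral exam 10%).

    All inputs are assumed to be normalized to a 0-100 scale;
    this scale is an illustrative assumption, not the official one.
    """
    return 0.50 * labs + 0.40 * written + 0.10 * oral

# Hypothetical example: strong lab work, average written test, good oral answer.
print(final_score(labs=90, written=60, oral=80))  # -> 77.0
```

Note that the passing threshold on the written part (usually 5-10 points out of 40) applies before any such weighting: failing the written test means failing the exam regardless of the lab score.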