===== Syllabus =====

^Week ^Date ^Topic ^Lecturer ^PDF ^Notes ^
|1.|23. 9.| **Introduction** | VF | {{ :courses:be4m33ssu:intro_ws2025.pdf | }} | |
|2.|30. 9.| **Predictor evaluation. Empirical Risk Minimization** | VF | {{ :courses:be4m33ssu:predeval_erm_ws2025.pdf | }} | [1] Chap 2, [2] Chap 7 |
|3.|7. 10.| **Probably Approximately Correct Learning** | VF | {{ :courses:be4m33ssu:pac_ws2025.pdf | }} | [1] Chap 2, [2] Chap 7 |
|4.|14. 10.| **Vapnik-Chervonenkis dimension** | VF | {{ :courses:be4m33ssu:vc_lecture_ws2025.pdf | }} | [1] Chap 4, [2] Chap 12 |
|5.|21. 10.| **Supervised learning for deep networks** | JD | {{ :courses:be4m33ssu:anns_ws2025.pdf | }} | |
|6.|28. 10.| [[https://en.wikipedia.org/wiki/Czechoslovak_declaration_of_independence | state holiday]] | | | |
|7.|4. 11.| **SGD, Deep (convolutional) networks** | JD | {{ :courses:be4m33ssu:sgd_ws2025.pdf |SGD}} {{ :courses:be4m33ssu:deep_anns_ws2025.pdf |deep nets}} | |
|8.|11. 11.| **Support Vector Machines** | VF | {{ :courses:be4m33ssu:svm_lecture_ws2025.pdf | }} | |
|9.|18. 11.| **Ensembling I** | JD | {{ :courses:be4m33ssu:ensembling_ws2025.pdf | }} | **moved to KN:A-312**, [4] |
|10.|25. 11.| **Ensembling II** | JD | | [2] Chap 10 |
|11.|2. 12.| **Generative learning, Maximum Likelihood estimator** | VF | {{ :courses:be4m33ssu:gener_lecture_ws2025.pdf | }} | |
|12.|9. 12.| **EM algorithm, Bayesian learning** | VF | {{ :courses:be4m33ssu:em_bayesian_lecture_ws2025.pdf | }} | |
|13.|16. 12.| **Hidden Markov Models I** | JD | {{ :courses:be4m33ssu:hmms_ws2025.pdf | }} | [5] Chap 17 |
|14.|6. 1.| **Hidden Markov Models II** | JD | | |