===== Syllabus =====

^Lecture ^Date ^Topic ^Lecturer ^Pdf ^Notes ^
|1.|22. 9.| Introduction | BF | {{:courses:be4m33ssu:stat_mach_learn_l01.pdf| }} | |
|2.|29. 9.| Empirical risk | VF | {{ :courses:be4m33ssu:er_zs2020.pdf | }} (printable {{ :courses:be4m33ssu:er_zs2020_print.pdf | }}) | [1] Chap 2, [2] Chap 7 |
|3.|6. 10.| Empirical risk minimization | VF | {{ :courses:be4m33ssu:erm_zs2020.pdf | }} (printable {{ :courses:be4m33ssu:erm_zs2020_print.pdf | }}) | [1] Chap 2, [2] Chap 7 |
|4.|13. 10.| Support Vector Machines I | VF | {{ :courses:be4m33ssu:svm1_ws2020.pdf | }} (printable {{ :courses:be4m33ssu:svm1_ws2020_print.pdf | }}) | [1] Chap 4, [2] Chap 12 |
|5.|20. 10.| Support Vector Machines II | VF | {{ :courses:be4m33ssu:svm2_ws2020.pdf | }} (printable {{ :courses:be4m33ssu:svm2_ws2020_print.pdf | }}) | [1] Chap 5, [2] Chap 12 |
|6.|27. 10.| Supervised learning for deep networks | JD | {{ :courses:be4m33ssu:anns_w2020.pdf | }} | |
|7.|3. 11.| SGD, Deep (convolutional) networks | JD | {{ :courses:be4m33ssu:sgd_w2020.pdf | }} {{ :courses:be4m33ssu:deep_anns_w2020.pdf | }} | |
|8.|10. 11.| Generative learning, EM algorithm | BF | {{:courses:be4m33ssu:ml-em.pdf| }} | |
|9.|17. 11.| National holiday | | | |
|10.|24. 11.| Bayesian learning | BF | {{:courses:be4m33ssu:bayes-learn-ws2020.pdf| }} | |
|11.|1. 12.| Hidden Markov Models | BF | {{:courses:be4m33ssu:hmms-ws2020.pdf| }} | |
|12.|8. 12.| Markov Random Fields | BF | {{ :courses:be4m33ssu:mrfs-ws2020.pdf | }} | |
|13.|15. 12.| Ensembling I | JD | {{ :courses:be4m33ssu:ensembling_w2020.pdf | }} | |
|14.|5. 1.| Ensembling II | JD | | |