Lectures

Lectures are given in person. Videos will be provided, but not necessarily directly from the lectures (the form is at the discretion of the individual lecturers).

Literature

In the lecture descriptions below, we refer to this supplementary course material:

Relevant

  • RL: R. S. Sutton, A. G. Barto: Reinforcement learning: An introduction. MIT press, 2018.
  • NLP: D. Jurafsky & J. H. Martin: Speech and Language Processing - 3rd edition draft
  • COLT: M. J. Kearns, U. Vazirani: An Introduction to Computational Learning Theory, MIT Press 1994

RL & NLP are available online.

You are strongly discouraged from using this course's materials from previous years, as doing so would only lead to confusion.

The RL part of the course is heavily based on the RL course of Prof. Emma Brunskill. The relevant lectures from Prof. Brunskill's course are: Lecture 1, Lecture 2, Lecture 3, Lecture 4, Lecture 5, Lecture 6, Lecture 11.

There are nice materials by Volodymyr Kuleshov and Stefano Ermon on probabilistic graphical models (for the Bayesian networks part of the course): https://ermongroup.github.io/cs228-notes/. The relevant chapters are: https://ermongroup.github.io/cs228-notes/representation/directed/, https://ermongroup.github.io/cs228-notes/inference/ve/, https://ermongroup.github.io/cs228-notes/inference/sampling/.
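As a quick illustration of the sampling chapter linked above, here is a minimal, self-contained sketch of ancestral (forward) sampling with rejection-based conditioning for a toy Bayesian network. The network structure and all probabilities below are invented for the example and are not taken from the linked notes.

```python
import random

# Toy network: Rain -> WetGrass <- Sprinkler (structure and numbers invented for illustration).
P_RAIN = 0.2
P_SPRINKLER = 0.4
P_WET = {  # P(WetGrass = 1 | Sprinkler, Rain)
    (0, 0): 0.01, (0, 1): 0.80, (1, 0): 0.90, (1, 1): 0.99,
}

def ancestral_sample():
    """Sample each node given its already-sampled parents (topological order)."""
    rain = int(random.random() < P_RAIN)
    sprinkler = int(random.random() < P_SPRINKLER)
    wet = int(random.random() < P_WET[(sprinkler, rain)])
    return {"Rain": rain, "Sprinkler": sprinkler, "WetGrass": wet}

# Crude rejection-sampling estimate of P(Rain = 1 | WetGrass = 1).
samples = [ancestral_sample() for _ in range(100_000)]
accepted = [s for s in samples if s["WetGrass"] == 1]
print(sum(s["Rain"] for s in accepted) / len(accepted))
```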

The NLP part of the course is heavily based on the NLP course(s) by Dan Jurafsky (Stanford), following his book Speech and Language Processing (see NLP above) - particularly its 3rd edition draft (the 2nd edition is insufficient!). The relevant chapters for us are 3, 6, 7 and 9. There are also some nice related materials and videos.

For the COLT part: besides the monograph by Kearns and Vazirani linked above, the Wikipedia page has pointers to two COLT survey papers (Angluin, Haussler) which are relevant to the PAC part. There are also external courses with lecture material available; for example, 8803 Machine Learning Theory at Georgia Tech covers all COLT topics of SMU (with subtle differences in the algorithms and proofs). Video footage of its lectures is available here.
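As a one-formula orientation for the PAC part (this is the standard sample-complexity bound for finite hypothesis classes, not specific to any of the courses above): in the realizable setting, if a learner draws at least

\[
m \;\ge\; \frac{1}{\epsilon}\left(\ln |H| + \ln \frac{1}{\delta}\right)
\]

i.i.d. training examples and outputs any hypothesis from the finite class \(H\) consistent with all of them, then with probability at least \(1-\delta\) that hypothesis has true error at most \(\epsilon\).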


Lecture 1 - Reinforcement Learning 1

Lecture 2 - Reinforcement Learning 2

Slides: lecture_2.pdf

Videos: Introduction, Statistical Properties of Estimators, Monte Carlo Value Evaluation. Coming soon: Temporal Difference Learning
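To accompany the Temporal Difference Learning topic listed above, here is a minimal sketch of tabular TD(0) policy evaluation. It is not taken from the course slides; the environment interface (reset() / step() returning (next_state, reward, done)) is a hypothetical stand-in for whatever environment the exercises use.

```python
def td0_value_estimation(env, policy, num_episodes=1000, alpha=0.1, gamma=0.99):
    """Estimate the state-value function V of a fixed policy with one-step TD updates."""
    V = {}  # state -> current value estimate (missing states default to 0.0)
    for _ in range(num_episodes):
        state = env.reset()
        done = False
        while not done:
            action = policy(state)
            next_state, reward, done = env.step(action)
            # Bootstrapped TD target: r + gamma * V(s'); terminal states contribute 0.
            target = reward + (0.0 if done else gamma * V.get(next_state, 0.0))
            # Move V(s) a small step (alpha) toward the target.
            V[state] = V.get(state, 0.0) + alpha * (target - V.get(state, 0.0))
            state = next_state
    return V
```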


Lecture 3 - Reinforcement Learning 3

Lecture 4 - Reinforcement Learning 4

Slides: lecture_4.pdf


Lecture 5 - Reinforcement Learning 5

Slides: lecture_5.pdf


Lecture 6 - Bayesian Networks 1

Slides: slides

Video: Video


Lecture 7 - Bayesian Networks 2

Lecture 8 - Natural Language Processing 1

Probabilistic models: slides

video: google-drive dropbox


Lecture 9 - Natural Language Processing 2

Vector models: slides

video: google-drive dropbox


Lecture 10 - Natural Language Processing 3

Neural models: slides

video: google-drive dropbox


Lecture 11 - Computational Learning Theory 1

Lecture 12 - Computational Learning Theory 2

Lecture 13 - Computational Learning Theory 3
