

CMPT 727: Statistical Machine Learning. Spring 2024.

Course logistics

Schedule:

  • Mondays 2:30-4:20pm: In-person session, AQ4130.
  • Wednesdays 2:30-3:20pm: Remote lecture-watching session.

Video lectures. Lectures will be added to this YouTube playlist. Last year's lectures. Lecture slides (may not match the videos exactly).

Zoom link: https://sfu.zoom.us/j/81554886732?pwd=cz7AKL227CENstsHgibCCvjmxwnKWO.1. Passcode: 448387.

Discord discussion server.

Instructor: Maxwell Libbrecht

  • Web page
  • Office: TASC 1 9219
  • Email: maxwl at sfu dot ca

TA:

  • Neda Shokraneh Kenary: nshokran at sfu dot ca
  • Office hours: TBD.

Textbooks:

  1. PMLAI: "Probabilistic Machine Learning: An Introduction" by Kevin Patrick Murphy. Please use the version linked here, not the link on the textbook's webpage, so that the section numbers match those used below.
  2. PMLAT: "Probabilistic Machine Learning: Advanced Topics" by Kevin Patrick Murphy.
  3. Previous textbook, no longer used: MLPP: "Machine Learning: a Probabilistic Perspective" by Kevin Patrick Murphy.

Group members for group assignments.

Schedule

Week 1 (Jan 8)

  • Reading: PMLAI Ch1.
  • Lectures: Recorded lecture 1 (see link at top of page), in-person introductory lecture (see recording if you missed it).
  • Assignment 1 (Due Wednesday Jan 10, 11:59pm): This is a qualifying assignment; based on your performance, we will recommend whether you have the prerequisites for the course. (For the purposes of the course grade, everyone will get full credit, so please represent your knowledge accurately.) You should expect to consult reference materials (e.g., textbooks, Wikipedia), but if you find you need to re-read whole textbook chapters, that is a sign that you are missing the prerequisite knowledge.

Week 2 (Jan 15)

Week 3 (Jan 22):

  • Reading: PMLAI Ch3.1-3 (Univariate probabilistic models). PMLAI Ch3.5-8 (Multivariate and linear Gaussian, probabilistic graphical models).
  • Lectures: 5-8.
  • Assignment 3 (Due Wed Jan 24).

Week 4 (Jan 29):

  • Reading: Review PMLAI Appendix B. PMLAI 4.1-4. PMLAI 8.1.
  • Lectures: 9.1, 9, 10, 18
  • Assignment 4 (Due Wed Jan 31)

Week 5 (Feb 5):

  • Reading: PMLAI 5.1-5.
  • Lectures: 11-14.
  • Assignment 5 (Due Wed Feb 7)

Week 6 (Feb 12):

  • Reading: PMLAI 5.7-8.
  • Lectures: 15.1-17.
  • Assignment 6 (Due Wed Feb 14)

(Week of Feb 19: off for Reading Break)

Week 7 (Feb 26):

Week 8 (Mar 4):

  • Reading: PMLAI Ch11.1-5.
  • Lectures: 24-28.
  • (No written assignment this week.)
  • Midterm exam: During class, March 4. The midterm will be similar to the assignments: there will be 2-4 conceptual questions (not multiple choice). The midterm will be open-book, open-internet, no collaboration. Solutions can be in any legible format. The scope of the midterm is any content of the course so far, including this week's reading/lectures.

Week 9 (Mar 11):

  • Reading: PMLAT Ch4.1-3,4.5 (Bayesian networks, Markov random fields). Note the switch to the second textbook (see the link above).
  • Lectures: 29-32.
  • Assignment 7 (Due Wed Mar 13)

Week 10 (Mar 18):

  • Reading: PMLAT Ch7 (Inference overview), Ch9.1-3 (Exact inference, belief propagation), Ch9.5 (Variable elimination).
  • Lectures: 33-35.
  • Assignment 8 (Due Wed Mar 20)
  • Programming assignment 2 (Due Wed Apr 24)

Week 11 (Mar 25):

  • Reading: PMLAT 9.4 (Loopy BP). 9.6 (Junction tree algorithm). 11.1-4 (Rejection sampling).
  • Lectures: 35.1-36.
  • Assignment 9 (Due Wed Mar 27)

Week 12 (Apr 1): SFU closed Monday for Easter. No assignment. We will have Wednesday remote session as usual.

Week 13 (Apr 8): Last week of classes. Wednesday class will be a wrap-up lecture and Q&A.

  • Reading: PMLAT 12.1-3 (Gibbs sampling, MCMC). 12.6 (Practical MCMC).
  • Lectures: 37-40, summary+review.
  • Assignment 10 (Due Wed Apr 10)
  • Please complete your course experience survey.

Final exam: Wednesday April 17, 12pm-3pm in AQ 5005. The format will be the same as the midterm. There will be 3-8 conceptual questions (not multiple choice). The content will be from the whole semester. The exam will be open-book, open-internet, no collaboration. Solutions can be in any legible format.

Final programming assignment. Due Wed Apr 24.

Course information

Unofficial prerequisites: There are no official prerequisites. However, the course assumes knowledge of machine learning (CMPT 726) and of probability and linear algebra (MATH 240). Assignment 1 is a qualifying assignment that shows whether you have the necessary background.

The course is open to advanced undergraduates with permission. To enroll as an undergraduate, please follow these steps: (1) Come to the first lecture, where I will explain the expected background. (2) Email me stating that you have taken a course in probability/statistics and a course in machine learning (CMPT 726). If it is appropriate, I will respond with my approval. (3) Fill out the prerequisite waiver form and attach a screenshot of our email exchange.

Assignments. There will be one written assignment per week, due on Wednesday at 11:59pm in the time zone of your choice. Each assignment is submitted in two parts. Your first submission is individual, worth 2/3 of the grade. The second submission, due one week later, is a joint group submission, worth 1/3 of the grade. This gives you a chance to fix any mistakes as a group. (The group portion's grade is the maximum of your individual and group submission scores.) No late submissions will be accepted.

Grading breakdown: 45% written assignments; 10% midterm; 15% final exam; 20% programming assignments; 10% participation (2 free missed participation days).

Overleaf. A good free web-based LaTeX editor.

Collaboration policy: You may freely discuss the problem sets and coding assignments with other students. All writing must be your own; it is not acceptable to copy/paste or verbatim transcribe others' text, code, or LaTeX source.

Other good resources:

  • link. Bishop "Pattern Recognition and ML".
  • link. Trevor Hastie, Robert Tibshirani, Jerome Friedman. "Elements of Statistical Learning".
  • link. Wasserman "All of Statistics".
  • link. Daphne Koller and Nir Friedman. "Probabilistic Graphical Models: Principles and Techniques".

Last year's web page

Updated Wed April 17 2024, 11:57 by maxwl.