
CMPT 727 G1

CMPT 727: Statistical Machine Learning. Spring 2023.

Course logistics

Schedule:

  • Optional remote session Wednesdays 1:30PM - 2:20PM.
  • In-person session Fridays 12:30PM - 2:20PM, WMC 2200.

Video lectures. Lectures will be added to this YouTube playlist. Last year's lectures. Lecture slides (may not match the videos exactly).

Zoom link: https://sfu.zoom.us/j/88516093511?pwd=NjBHYVdiTXZnTnpYQWFTSjlvNFpJQT09. Passcode: 927339.

Discord discussion server: https://discord.gg/byTGwQZs

Instructor: Maxwell Libbrecht

  • Web page
  • Office: TASC 1 9219
  • Email: maxwl at sfu dot ca

TA:

  • Neda Shokraneh Kenary: nshokran at sfu dot ca
  • Office hours: Mondays 3-4pm, TASC1 9010.

Group assignments.

Textbooks:

  1. PMLAI: "Probabilistic Machine Learning: An Introduction" by Kevin Patrick Murphy. Please use the version linked here, rather than the link on the textbook's webpage, so that the section numbers match those used below.
  2. PMLAT: "Probabilistic Machine Learning: Advanced Topics" by Kevin Patrick Murphy.
  3. Previous textbook, no longer used: MLPP: "Machine Learning: a Probabilistic Perspective" by Kevin Patrick Murphy.

Schedule

Week 1 (Jan 2)

  • Reading: PMLAI Ch1.
  • Group assignment survey: link
  • Assignment 1 pdf and tex (due Monday, Jan 9, 11:59pm): This is a qualifying assignment; based on your performance, we will recommend whether you have the prerequisites for the course. (For the purposes of the course grade, everyone will get full credit, so please represent your knowledge accurately.) You should expect to consult reference materials (e.g. textbooks or Wikipedia), but if you find you need to re-read whole textbook chapters, that is a sign that you are missing the prerequisite knowledge.

Week 2 (Jan 9)

Week 3 (Jan 16):

  • Reading: PMLAI Ch3.1-3 (Univariate probabilistic models). PMLAI Ch3.5-8 (Multivariate and linear Gaussian, probabilistic graphical models).
  • Lectures: Recorded lectures 5-8.
  • Assignment 3 pdf (due Jan 23)

Week 4 (Jan 23):

  • Reading: PMLAI Ch A.3, B.1-3 (Matrix calculus review). PMLAI Ch4.1-4 (Parameter estimation, regularization).
  • Lectures: Recorded lectures 9-10.
  • Assignment 4 pdf (due Jan 30)

Week 5 (Jan 30):

  • Reading: PMLAI Ch 5.1-5 (Optimization).
  • Lectures: 11-14.
  • Assignment 5 pdf (due Feb 6).

Week 6 (Feb 6):

  • Reading: PMLAI Ch 5.7-8 (Latent variables and EM). PMLAI Ch 8.1 (Decision theory).
  • Lectures: 15-18.
  • Assignment 6 pdf (due Feb 13).

Week 7 (Feb 13):

  • Reading: PMLAI 9.1-4, 10.1-2, 10.4-5 (Linear models: Naive Bayes, Gaussian discriminant, logistic regression.)
  • Lectures: 19-23.
  • Programming assignment 1 pdf (due Feb 27)
  • (No written assignment this week).

(Week of Feb 20: Off for reading break.)

Week 8 (Feb 27):

  • Reading: PMLAI 11.1-5 (Linear regression, LASSO).
  • Lectures: 24-28.
  • Assignment 7 pdf (due Mar 6).

Week 9 (Mar 6):

  • Reading: PMLAT Ch4.1-3,4.5 (Bayesian networks, Markov random fields). Note the switch to the second textbook (see the link above).
  • Lectures: 29-32.
  • Midterm exam: During class, March 10. The midterm will be similar to the assignments: there will be 2-4 conceptual questions (not multiple choice). The midterm will be open-book, open-internet, no collaboration. Solutions can be in any legible format.
  • (No written assignment this week.)

Week 10 (Mar 13):

  • Reading: PMLAT Ch7 (Inference overview), Ch9.1-3 (Exact inference, belief propagation), Ch9.5 (Variable elimination).
  • Lectures: 33-35.
  • Assignment 8 pdf (due Mar 20).

Week 11 (Mar 20):

  • Reading: PMLAT 9.4 (Loopy BP), 9.6 (Junction tree algorithm), 11.1-4 (Rejection sampling).
  • Lectures: 35.1-36.
  • Assignment 9 pdf (due Mar 27).

Week 12 (Mar 27):

  • Reading: PMLAT 12.1-3 (Gibbs sampling, MCMC). 12.6 (Practical MCMC).
  • Lectures: 37-40.
  • In-class exercise (nothing to turn in): pdf.

Week 13 (Apr 3): Last day of class is Wed Apr 5. This will be a summary and review session over Zoom. It will be recorded for those who cannot attend in person.

Final exam: Monday, April 17, 12:30-3:30pm in AQ 3003. The format will be the same as the midterm. There will be 3-8 conceptual questions (not multiple choice). The exam will be open-book, open-internet, no collaboration. Solutions can be in any legible format.

Final programming project: pdf (due Apr 24).

Course information

Unofficial prerequisites: There are no official prerequisites. However, the course assumes knowledge of machine learning (CMPT 726) and of probability and linear algebra (MATH 240). Assignment 1 is a qualifying assignment that shows whether you have the necessary background.

The course is open to advanced undergraduates with permission. To enroll as an undergraduate, follow these steps: (1) Come to the first lecture, where I will explain the expected background. (2) Email me stating that you have taken a course in probability/statistics and a course in machine learning (CMPT 726). If appropriate, I will respond with my approval. (3) Fill out the prerequisite waiver form and attach a screenshot of our email exchange.

Assignments. There will be one written assignment per week, due Monday at midnight in the time zone of your choice. Each assignment is due in two parts. Your first submission is individual and worth 2/3 of the grade. The second submission, due one week later, is a joint group submission worth 1/3 of the grade; this gives you a chance to fix any mistakes as a group. (The group portion is graded as the maximum of the individual and group submissions.) No late submissions will be accepted.
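As a concrete sketch of the rule above (the 2/3–1/3 weighting and the max rule are as stated; the function name and the 0–1 score scale are assumptions for illustration):

```python
def assignment_grade(individual, group):
    """Combine the two submissions for one written assignment.

    individual, group: scores in [0, 1] for the individual and the
    later group submission. The individual part is worth 2/3 of the
    assignment grade; the group part, worth 1/3, counts the better of
    the two submissions, so fixing mistakes as a group can only help.
    """
    return (2 / 3) * individual + (1 / 3) * max(individual, group)
```

For example, scoring 0.8 individually and 1.0 as a group yields 2/3 × 0.8 + 1/3 × 1.0 ≈ 0.87, while a group submission worse than your individual one leaves your grade unchanged.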

Grading breakdown (out of 105): 50 written assignments; 10 midterm; 15 final exam; 20 programming assignments; 10 participation (2 free missed participation days).
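The breakdown above can be tallied as follows (the point weights are as listed; the component names and fraction-earned representation are illustrative assumptions):

```python
# Point weights from the grading breakdown above (total: 105).
WEIGHTS = {
    "written_assignments": 50,
    "midterm": 10,
    "final_exam": 15,
    "programming_assignments": 20,
    "participation": 10,
}

def course_total(fractions):
    """fractions maps each component to the fraction earned in [0, 1];
    returns the weighted point total out of 105."""
    return sum(points * fractions[name] for name, points in WEIGHTS.items())
```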

Overleaf. A good free web-based LaTeX editor.

Collaboration policy: You may freely discuss the problem sets and coding assignments with other students. All writing must be your own; it is not acceptable to copy/paste or verbatim transcribe others' text, code or LaTeX source.

Other good resources:

  • link. Bishop "Pattern Recognition and ML".
  • link. Trevor Hastie, Robert Tibshirani, Jerome Friedman. "Elements of Statistical Learning".
  • link. Wasserman "All of Statistics".
  • link. Daphne Koller and Nir Friedman. "Probabilistic Graphical Models: Principles and Techniques".

Last year's web page

Updated Thu March 30 2023, 15:35 by nshokran.