

CMPT 829 G2: Seminar on Applications of Deep Neural Networks.

Simon Fraser University, Burnaby Campus. Spring 2018.

MWF 12:30-1:20pm. Room: TASC2 8500

Course logistics

Instructor: Maxwell Libbrecht

  • Web page
  • Office: TASC 1 9219
  • Email: maxwl at sfu dot ca
  • Office hours: Wednesdays 1:30-2:30pm.

Announcements

  • 2018-01-03: Welcome to CMPT 829 G2!

Course schedule

Schedule

The schedule is editable. If you would like to be on someone's committee, please add yourself. If you want to either (1) drop the class and cancel your talk or (2) swap your talk time with someone else (or an empty day), please email me and edit the doc appropriately.

Evaluation form

Talk evaluation form.

Course information

  • This is a seminar-style course focused on deep neural networks. The primary class activity will be students giving talks on recent papers.
  • A primary focus of the class is on building presentation skills.
  • Grading: 50% talk. 20% committee participation. 30% in-class participation.
  • Peer system: Each speaker will get a "committee" of three other students, who will help the speaker improve their talk. The committee will have one "primary" member who will read the paper and attend the practice talk and two "secondary" members who will just attend the practice talk.
  • Deliverables (upload all deliverables to Google Drive, as described below):
    • [Due 21 days before talk, by speaker] Choose paper. Write 1 page summarizing paper and why it is relevant for the class.
    • [Due 14 days before talk, by speaker] Make draft slides.
    • [Due 7 days before talk, by primary committee member] Primary reads paper. Speaker and primary meet 7-14 days before talk to discuss paper and slides. Primary writes 1 page summarizing discussion and giving feedback to the speaker.
    • [Due 2 days before talk, by all committee members] Speaker gives practice talk to all three committee members (it's up to the speaker to reserve a room). Each committee member fills out an evaluation form.
  • How to choose a paper:
    • A paper should (1) be related to deep learning, either an application or novel method; (2) be published in the last three years; and (3) be of interest to the class, meaning that it is understandable to a computer scientist without knowledge of the application area and the deep learning methods are a primary focus of the paper.
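The deliverable deadlines above are fixed offsets from the talk date; as a quick sketch, a hypothetical helper (the deliverable labels are shorthand, not official names) can compute them:

```python
from datetime import date, timedelta

# Days-before-talk offsets from the course rules above.
DEADLINES = {
    "Choose paper + 1-page summary (speaker)": 21,
    "Draft slides (speaker)": 14,
    "Paper read + discussion summary (primary)": 7,
    "Practice talk + evaluation forms (committee)": 2,
}

def deliverable_due_dates(talk_date):
    """Map each deliverable to its due date, N days before the talk."""
    return {name: talk_date - timedelta(days=n) for name, n in DEADLINES.items()}

# Example: deadlines for a talk on March 12, 2018.
for name, due in deliverable_due_dates(date(2018, 3, 12)).items():
    print(f"{due}: {name}")
```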

How to submit deliverables

Google Drive submission directory

  • Open the Google Drive submission directory, linked above.
  • Log in to your Google account. Any account will do; if you don't have one already, you can create one for free.
  • Click on the folder corresponding to the *speaker* this deliverable corresponds to. (If you are the speaker, go to your own directory; if you are a committee member, go to the speaker's directory.) If the directory does not exist yet, create it using the New button in the top left.
  • Upload your deliverable with the following name: "<your_name> - <deliverable_name>.<extension>". For text deliverables, you can use a googledoc rather than a file from your computer. Deliverable names: "Speaker paper summary"; "Speaker draft slides"; "Primary draft slide feedback"; "Practice talk feedback".
  • For example, if my name is Adam and I am Bob's primary, and I am submitting the feedback from the one-on-one draft slides meeting, I would name my file "Adam - Primary draft slide feedback" and put it in the directory named "Bob".
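The naming convention above is mechanical, so a small hypothetical helper can build (and sanity-check) a filename — the function name and error handling are my own, not part of the course rules:

```python
# The four deliverable names listed in the submission instructions above.
DELIVERABLE_NAMES = [
    "Speaker paper summary",
    "Speaker draft slides",
    "Primary draft slide feedback",
    "Practice talk feedback",
]

def deliverable_filename(your_name, deliverable_name, extension=None):
    """Build '<your_name> - <deliverable_name>.<extension>'.

    Omit the extension for a Google Doc, which has no file extension.
    """
    if deliverable_name not in DELIVERABLE_NAMES:
        raise ValueError(f"Unknown deliverable: {deliverable_name!r}")
    base = f"{your_name} - {deliverable_name}"
    return f"{base}.{extension}" if extension else base

# Adam, Bob's primary, submitting slide-meeting feedback as a Google Doc:
print(deliverable_filename("Adam", "Primary draft slide feedback"))
```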

How to choose a paper

Criteria:

  • Published in the last several years (three or so). Papers published only on arXiv are fine.
  • Not from SFU.
  • Deep learning is a central focus of the paper.
  • At least part of the paper is interesting to the audience of this seminar. The best way to make a paper interesting to this audience is to analyze why the authors' problem was amenable to deep learning, and why the authors' architecture choices were the right ones.

Good places to look for papers:

  • Search your field's journals for "deep learning", "neural network" and the like.
  • Search machine learning venues, like NIPS and ICML.
  • Look at the recently published papers from groups that work on deep learning.
  • Ask your advisor for ideas.

Talk study questions

Committee members: Use these questions to help give feedback on a speaker's practice talk and outline.

  • Slides:
    • Is each slide headed by a full sentence expressing the take-home message of that slide?
    • Are the graphics and font (including figure axes and legends) on each slide large enough to be legible?
    • Has all extraneous information been removed from each slide?
    • Does each element of a slide appear only when the speaker needs it?
  • Organization:
    • Does the talk have a home slide/image that helps the audience reorient themselves?
    • Is the talk separated into distinct episodes or "data dives" that are independently understandable?
  • Introduction:
    • Is the talk of interest to a general CS/ML audience?
    • Does the introduction gain attention and interest?
    • Does the introduction give sufficient background for a nonspecialist audience?
  • Content:
    • Is the level of detail appropriate?
    • Is the amount of content appropriate for a 40 minute talk?
    • Notation: Is mathematical notation defined clearly, and redefined with each use?
    • Does the talk focus on the key points of the paper?
  • Analysis:
    • Does the talk put the paper in context of previous work?
    • Does the talk investigate not just *what* the authors did but *why* they did it?
    • Does the speaker critically analyze the correctness and impact of the paper?
    • Does the speaker explain the paper's implications, caveats and future work?
  • Speech:
    • Is the speaker in control of his/her voice? Consider enunciation, projection (loud/soft), rate of speech, use of pauses and presence of verbal tics ("um", "like").
    • Does the speaker have appropriate rapport with the audience? Consider eye contact, body language.

Deep learning background materials

Here are some materials if you want to brush up on your deep learning background:

Paper suggestions

To get you started, I listed some suggestions of my own below. Because of my background, they are heavily biased towards computational biology; you will have to ask others to find deep learning papers in other fields.

  • Semi and Weakly Supervised Semantic Segmentation Using Generative Adversarial Network. https://arxiv.org/pdf/1703.09695.pdf

Computational biology:
  • Basset: Learning the regulatory code of the accessible genome with deep convolutional neural networks. David R Kelley, Jasper Snoek and John Rinn.
  • Nucleotide sequence and DNaseI sensitivity are predictive of 3D chromatin architecture. Jacob Schreiber, Maxwell Libbrecht, Jeffrey Bilmes, William Noble. https://doi.org/10.1101/103614
  • Reverse-complement parameter sharing improves deep learning models for genomics. Shrikumar A, Greenside P, Kundaje A.
  • Not Just a Black Box: Learning Important Features Through Propagating Activation Differences. Shrikumar A, Greenside P, Shcherbina A, Kundaje A.
  • DFIM: Deep Feature Interaction Maps uncover latent dependence structure encoded in deep learning models of regulatory DNA sequences. Greenside PG, Shimko T, Fordyce P, Kundaje A.

Groups that work on deep learning:

  • Geoffrey Hinton (methods)
  • DeepMind
  • Google Brain
  • Allen Institute for Artificial Intelligence (NLP)
  • Brendan Frey (computational biology)
  • Anshul Kundaje (computational biology)

Conferences/journals:

  • Neural Information Processing Systems (NIPS)
  • International Conference on Machine Learning (ICML)
  • International Conference on Learning Representations (ICLR)
  • Journal of Machine Learning Research (JMLR)
Updated Thu Feb. 15 2018, 14:44 by maxwl.