
CMPT 880 - Deep Learning: Course Projects

Grading Criteria and Submission

Marking Scheme:

  • Presentation. 30%. Clarity, conciseness, and quality of exposition.
  • Originality. 40%. To what extent were you creative in developing your own ideas?
  • Evaluation and Methodology. 30%.

I will add further clarifications as students request them.

Submission Items:

  • The main item is the final presentation on the designated final presentation day. I will accept two formats; please choose one:
  1. In-class presentation, similar to the topic/outline presentation.
  2. Poster at the final poster session.
  • In addition to the group's final presentation, each student should submit a reflection on their group work experience (around half a page). This will count toward their final presentation grade.
  • Optionally, a group or a subset of its students can submit a written report (at most 8 pages single-column, or 4 pages double-column).

Topic Suggestions

Here are some suggestions for course projects; feel free to suggest your own (see the section on Additional Readings). In any case, you should discuss the topic with me before you start.

The typical project would evaluate a deep learning method on an interesting problem of your choice. "Interesting" means some mix of real-world importance and learning complexity. I assume that many students will already have a learning problem that they are working on, so it is natural to try deep learning methods. I'm also open to other types of projects, for instance surveys or theoretical analysis. Below I list example topics.

Compare with other models for learning latent features

  1. PCA, kernelized PCA (non-linear).
  2. Neural nets.
  3. Hierarchical Clustering, Multiple Clustering.
  4. SVMs: can we think of deep learning as learning a kernel? How do deep models compare to SVMs with latent variables?
    • Consider the experiments and arguments in "Scaling Learning Algorithms towards AI".
      • How does a Deep Belief net perform in learning parity? The sine function?
      • The paper reports a dataset where a convolutional neural net performs much better than an SVM with a local kernel. How about a DBN vs. SVM? Can you find a dataset with the reverse performance, where an SVM outperforms a DBN? If not, why not? What would such a dataset be like?

Compare alternative training methods for deep neural nets

  1. EM-style methods?
  2. Gradient descent, e.g. the conjugate-gradient version.
  3. Second-order methods, e.g. derived from Newton's method.
  4. A version of Iteratively Reweighted Least Squares.
  5. Max-margin as a loss function.
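The contrast between first- and second-order methods, and the IRLS connection, can be sketched on the simplest relevant model: logistic regression, where each Newton step coincides with one round of Iteratively Reweighted Least Squares. The data, sizes, and step counts below are illustrative, not prescribed by the course:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy logistic-regression problem drawn from a true logistic model.
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -1.0, 0.5])
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nll(w):
    """Average negative log-likelihood (the training loss)."""
    p = sigmoid(X @ w)
    return -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

# First-order: plain gradient descent, many small steps.
w_gd = np.zeros(d)
for _ in range(100):
    w_gd -= 0.5 * X.T @ (sigmoid(X @ w_gd) - y) / n

# Second-order: Newton's method.  For logistic regression each Newton
# step is exactly one round of IRLS.
w_nt = np.zeros(d)
for _ in range(10):
    p = sigmoid(X @ w_nt)
    grad = X.T @ (p - y) / n
    W = p * (1 - p)                        # IRLS weights
    H = (X * W[:, None]).T @ X / n         # Hessian of the loss
    w_nt -= np.linalg.solve(H + 1e-8 * np.eye(d), grad)

print(f"gradient descent, 100 steps: loss {nll(w_gd):.4f}")
print(f"Newton/IRLS, 10 steps:       loss {nll(w_nt):.4f}")
```

For a deep net the full Hessian is impractical, which is why the second-order methods in the list are usually approximate (e.g. diagonal or Krylov-based); this convex case just isolates the first- vs second-order contrast.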

Compare different deep learning models

  1. Number of layers.
  2. Different types of models, e.g.
    1. Logistic Regression
    2. Multilayer perceptron
    3. Deep Convolutional Network
    4. Autoencoders, Denoising Autoencoders
    5. Stacked Denoising Autoencoders
    6. Restricted Boltzmann Machines
    7. Deep Belief Networks
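For the model-type comparison, the smallest instructive contrast is the first two entries: logistic regression versus a one-hidden-layer perceptron on XOR, which no linear model can fit. A minimal numpy sketch, with arbitrary (not course-mandated) hidden width, learning rate, and step counts:

```python
import numpy as np

rng = np.random.default_rng(2)

# XOR: the classic dataset a linear model cannot fit.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 0.0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Model 1: logistic regression (no hidden layer), full-batch GD.
w, b = 0.1 * rng.normal(size=2), 0.0
for _ in range(2000):
    p = sigmoid(X @ w + b)
    w -= 0.5 * X.T @ (p - y) / 4
    b -= 0.5 * np.mean(p - y)
acc_lr = np.mean((sigmoid(X @ w + b) > 0.5) == y)

# Model 2: multilayer perceptron, one tanh hidden layer of 8 units,
# trained with manual backpropagation.
W1 = 0.5 * rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.normal(size=8);      b2 = 0.0
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    d_out = (p - y) / 4                      # dLoss/dlogit (cross-entropy)
    d_h = np.outer(d_out, W2) * (1 - h**2)   # backprop through tanh
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum()
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(axis=0)
h = np.tanh(X @ W1 + b1)
acc_mlp = np.mean((sigmoid(h @ W2 + b2) > 0.5) == y)

print(f"logistic regression accuracy on XOR:  {acc_lr:.2f}")
print(f"one-hidden-layer MLP accuracy on XOR: {acc_mlp:.2f}")
```

Varying the number of hidden layers (item 1 above) amounts to stacking more tanh layers in the second model and extending the backward pass accordingly.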

Apply Deep Learning

  1. NLP
  2. Vision
  3. Multimodal (see tutorials)
  4. Transfer Learning.
    • Do DBNs achieve the invariance and feature abstraction discussed in the paper? Consider this type of experiment: first learn a DBN that represents the sine function. Then use the weight settings as initial settings for learning the cosine function.
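The sine-to-cosine experiment suggested above can be prototyped with an ordinary regression MLP standing in for the DBN; since cos(x) = sin(x + pi/2), features learned for sine are plausibly reusable. The sketch below (all hyperparameters illustrative) pretrains on sine, then compares warm-started and cold-started fine-tuning on cosine:

```python
import numpy as np

rng = np.random.default_rng(3)

# Regression targets: sin on a grid, then cos (a phase-shifted copy).
X = np.linspace(-np.pi, np.pi, 64)[:, None]
y_sin = np.sin(X[:, 0])
y_cos = np.cos(X[:, 0])

def init():
    """Random 1-16-1 tanh network: [W1, b1, W2, b2]."""
    return [0.5 * rng.normal(size=(1, 16)), np.zeros(16),
            0.5 * rng.normal(size=16), 0.0]

def mse(params, y):
    W1, b1, W2, b2 = params
    h = np.tanh(X @ W1 + b1)
    return np.mean((h @ W2 + b2 - y) ** 2)

def train(params, y, steps, lr=0.05):
    """Full-batch gradient descent on MSE; leaves `params` untouched."""
    W1, b1, W2, b2 = [p.copy() if hasattr(p, "copy") else p for p in params]
    n = len(y)
    for _ in range(steps):
        h = np.tanh(X @ W1 + b1)
        d_out = 2 * (h @ W2 + b2 - y) / n
        d_h = np.outer(d_out, W2) * (1 - h**2)   # backprop through tanh
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum()
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
    return [W1, b1, W2, b2]

# Pretrain on sine, then fine-tune on cosine; compare with a cold start.
pre = train(init(), y_sin, 5000)
warm = train(pre, y_cos, 500)
cold = train(init(), y_cos, 500)

print(f"cosine loss after 500 steps, warm start: {mse(warm, y_cos):.4f}")
print(f"cosine loss after 500 steps, cold start: {mse(cold, y_cos):.4f}")
```

The actual project would replace this MLP with a DBN (pretrained layer by layer) and ask whether the transferred features show the invariance the paper discusses; this sketch only fixes the experimental protocol.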

Surveys

  • Literature Review
  • Theoretical Analysis
    • We discussed the equivalence between regularized neural net learning with one hidden layer and support vector machine learning. Is it possible to extend this to deep neural nets with several hidden layers?
Updated Mon April 08 2019, 13:44 by oschulte.