
Graph Representation Learning - CMPT 983 G1

Fall Semester 2021

Simon Fraser University

Instructor: Oliver Schulte

Breadth Area III

Course Logistics

Office Location: TASC 1 9021.

Office Phone: 778-782-3390.

Office Hours: Friday 1 pm-1:30 pm in person, 1:30 pm-2 pm on zoom.

E-mail Office Hour: Friday 9:30 am - 10:30 am.

TA: Kiarash Zahirnia firstname_lastname@sfudotca

TA Office Hour: Monday 4:00 pm - 5:00 pm. ZOOM: https://us04web.zoom.us/j/6395434556?pwd=VFRjS3Z5L0EyVkRJZ2VNb3pmd21ZQT09 Meeting ID: 639 543 4556 Passcode: n3FLLu

Email: myfirstname_mylastname@sfudotca

Announcements

  • Please remember to wear a mask to class

Course Information

Plan for the Course

The course has three parts.

  1. Background. Getting on the same page about general machine learning and deep learning. This should take 1-2 weeks at most.
  2. Introduction to Graph Learning. Mainly based on the textbook. About 6 weeks.
  3. Discussion of Advanced Topics and Projects. About 6 weeks.

Information for Topic Presentations

This is a seminar course, so students are expected to cover part of the course material in presentations and discussions. Every student must contribute one presentation about a graph learning topic. Depending on the course progress, there may be an option to give two presentations. More details about the topic presentation.

Information for Projects

You have the option of completing a project as a group, and I encourage you to do so, for the following reasons.

  1. Discussing topics with other students will help you understand them.
  2. Having help with the project will lead to a better grade.
  3. You can tackle a more ambitious project.

A group should have at most 3 members. Each project group will contribute two presentations.

  1. A project outline presentation for your course project: introduce the problem and the dataset, and describe your plan of attack. More details here.
  2. A final project presentation about the results of your course project. More details here.

Tentative Course Schedule

  • Week 1. September 10. Introduction. Slides
  • Week 2. September 14 and 16. Background.
  • Week 3. September 21. Introduction. Textbook Chapter 1.
    • Definition of Graph Data
    • Graph Learning Tasks
  • Week 4 Sep 28. Traditional Methods. Chapter 2. Discussion Slides (not a substitute for reading).
    • Graph Statistics
    • Spectral Methods
  • Week 5 Oct 5. Traditional Methods Ctd.
  • Week 6 Oct 12. Node Embeddings.
  • Week 7 Oct 19. Node Embeddings ctd. Graph Neural Networks. Chapter 5.
    • Random Walk Methods (node2vec). Kumar.
    • Graph Neural Networks. Erfaneh
  • Week 8. Oct 26. Ch.5 ctd. Graph Neural Networks in Practice. Chapter 6.
    • Graph learning packages (Kiarash). Zoom Recording. Passcode: $686Cj2C
    • Graph Neural Networks in Practice. Chapter 6. Shuman
    • Graph Convolutional Networks (BCNet). Sebastian
  • Week 9. Nov 2. Generative Graph Models.
    • "Traditional" Generative Models. Chapter 8
    • Graph VAEs. Chapter 9
    • Deep Generative Models for Graphs. Chapter 9. Qiushi Friday Nov 5.
    • Graph Transformer Networks - Sachini Tuesday Nov 9.
  • Week 10. Nov 9. Miscellaneous
    • Overflow.
    • Optional Topic Presentations.
    • Some students may want to do a project outline presentation.
    • Representation Learning for Dynamic Graphs. Abdolreza Nov 12
  • Week 11. Nov 16. Project Presentations. Overflow/Optional Topics.
    • Expressive Power of Graph Neural Networks. Neda.
    • Graph Attention Networks. Akash Sindhu. Nov 12.
  • Week 12. Nov 23. Project Presentations continued. Overflow/Optional Topics.
    • Project outline presentation. Neda & Shuman. Nov 23.
  • Week 13. Nov 23, 26. No class (project presentations next week)
  • Week 14.
    • Dec 10. 9:30 am-12 pm (Session I) and 12:30-2:30 pm (Session II). Final Project Presentations. ASB 9921.

Further Topics

Once we have acquired the basic background, we can move on to more specific topics. I will seek class input on further topics. Sample options include the following.

  • Inductive Graph Learning (generalizing from observed nodes to new nodes)
  • Applications, e.g. drug discovery, natural language processing
  • Tools and Packages
  • Current research papers from conferences (e.g. NeurIPS, IJCAI, ICLR).
  • Competitions involving graph data (e.g. Kaggle, Open Graph Benchmark)
  • Theoretical Motivations for graph neural nets (Chapter 7).
  • Graph Attention Networks
  • Graph Representation Learning for Dynamic Graphs.

Project Outline Presentations

  1. Student Student. Nov 16. Inductive Graph Learning.
  2. Neda & Shuman. Nov 23. Evaluating Pre-trained Graph Models.

Resources

  • Recent book on deep learning, by Goodfellow, Bengio, and Courville. Covers many topics; a good reference for quickly getting an idea of what a deep learning approach to a machine learning problem would look like.