Structures are pervasive in science and engineering. Some structures are conveniently observable, e.g., 3D point clouds, molecules, phylogenetic trees, and social networks, whereas others are latent or hard to measure, e.g., parse trees for languages/images, causal graphs, and latent interactions among actors in multi-agent systems. Advanced deep learning techniques have emerged recently to effectively process data in these scenarios.
This course will teach cutting-edge deep learning models and methods for structured data. For observable structures, we will introduce popular models, e.g., Transformers and Graph Neural Networks, with an emphasis on motivating applications, design principles, practical and/or theoretical limitations, and future directions. For latent structures, we will introduce motivating applications, latent variable models (e.g., variational auto-encoders), inference methods (e.g., amortization and search), and learning methods (e.g., REINFORCE and relaxation).
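To give a flavor of the relaxation techniques covered later in the course, here is a minimal NumPy sketch of the Gumbel-Softmax trick, which draws approximately one-hot samples from a categorical distribution in a way that admits gradients through the logits. The function name and the temperature value are illustrative choices, not part of the course material:

```python
import numpy as np

def gumbel_softmax_sample(logits, tau=0.5, rng=None):
    """Draw a relaxed (approximately one-hot) categorical sample.

    logits: unnormalized log-probabilities, shape (K,)
    tau:    temperature; smaller values give samples closer to one-hot
    """
    rng = rng or np.random.default_rng()
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1)
    g = -np.log(-np.log(rng.uniform(size=np.shape(logits))))
    y = (np.asarray(logits) + g) / tau
    y = np.exp(y - y.max())  # numerically stable softmax
    return y / y.sum()

sample = gumbel_softmax_sample(np.log([0.7, 0.2, 0.1]), tau=0.1)
# entries are non-negative and sum to 1; at low temperature one entry dominates
```

As the temperature tau goes to zero, the sample approaches a one-hot vector at the argmax of logits plus Gumbel noise, recovering exact categorical sampling; larger tau trades bias for lower-variance gradients.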
Instructor | Renjie Liao |
---|---|
TA | Muchen Li |
Section 1 | 12:00pm to 1:30pm, Monday |
Section 2 | 12:00pm to 1:30pm, Wednesday |
Piazza | https://piazza.com/ubc.ca/winterterm22022/eece571f2072021w2 |
Office Hour | 3:00pm to 4:00pm, Tuesday |
Location | Wesbrook 201 |
Email | rjliao@ece.ubc.ca |
The instructor will present lectures every week, except in the last two weeks, when students will present their projects.
Students should ask all course-related questions on Piazza.
Students can work on projects individually or in groups of up to four (groups should be formed as early as possible). Students are strongly encouraged to form groups, e.g., by discussing on Piazza. However, a larger group is expected to accomplish more than a smaller group or an individual. All students in a group will receive the same grade. Students are allowed to undertake a research project related to their thesis or other external projects, but the work done for this course must represent substantial additional work and cannot be submitted for grades in another course.
The grade will depend on the quality of your research ideas, how well you present them in the report, how clearly you position your work relative to prior literature, how illuminating and/or convincing your experiments are, and how well-supported your conclusions are. Full marks will require a novel contribution.
Each group will write a short (at least 2 pages) research project proposal, which ideally will be structured like a standard paper. You don't have to do exactly what your proposal claims; the point of the proposal is mainly to give you a plan and to allow me to give you feedback. Students will give a short presentation (roughly 5 minutes for individuals, 10 to 15 minutes for larger groups) of their projects towards the end of the course. At the end of the course, every group needs to submit a project report (6 to 8 pages).
Grades will be based on:
All reports (i.e., the paper reading report, proposal, peer-review report, and final project report) must be written in the NeurIPS conference format and submitted as PDFs.
Late work may be submitted up to 3 days after the deadline and is automatically subject to a 20% penalty.
UBC values academic integrity. Therefore, all students must understand the meaning and consequences of cheating, plagiarism and other academic offences under the Code of Student Conduct and Discipline.
It is the responsibility of each student to understand the policy for each piece of course work, ask the instructor if anything is unclear, and carefully acknowledge all sources (papers, code, books, websites, and individual communications) using an appropriate referencing style when submitting work.
This is a tentative schedule, which will likely change as the course goes on.
# | Dates | Lecture Topic | Lecture Slides | Suggested Readings |
---|---|---|---|---|
1 | Jan. 10, Jan. 12 | Introduction to Deep Learning | slides, zoom | Chapters 13 and 14 of PML book & DL book |
2 | Jan. 17, Jan. 19 | Supervised Deep Learning with Observable Structures I: Invariance, Equivariance, and Deep Learning Models for Sets/Sequences | slides, zoom | DeepSets & Transformers & Chapter 15 of PML book |
3 | Jan. 24, Jan. 26, Jan. 31 | Supervised Deep Learning with Observable Structures II: Graph Neural Networks, Message Passing Models | slides, zoom | Part II of GRL book & Chapter 23 of PML book & Chapter 4 of GNN book & GNNs & GGNNs & GAT |
4 | Feb. 2, Feb. 7, Feb. 9 | Supervised Deep Learning with Observable Structures III: Graph Neural Networks, Graph Convolution Models | slides, zoom | Part II of GRL book & Chapter 23 of PML book & Chapter 4 of GNN book & GCNs & ChebyNet & LanczosNet |
5 | Feb. 14, Feb. 16 | Unsupervised Deep Learning with Observable Structures I: Deep Generative Models of Graphs, Auto-Regressive Models | slides, zoom | Chapter 11 of GNN book & DGMG & GraphRNN & GRAN |
6 | Feb. 21 | Unsupervised Deep Learning with Observable Structures II: Self-supervised Representation Learning (Guest Lecture by Dr. Ting Chen) | Slides and zoom recording are only available on Piazza | SimCLR & SimCLRv2 |
7 | Mar. 2 | Unsupervised Deep Learning with Observable Structures III: Deep Generative Models of Graphs, VAEs and GANs | slides, zoom | VGAE & GraphVAE & JunctionTreeVAEs & MolGANs |
8 | Mar. 7 | Unsupervised Deep Learning with Observable Structures IV: Unsupervised/Self-supervised Graph Representation Learning | slides, zoom | DeepWalk & DeepGraphInfomax |
9 | Mar. 9 | Theory of GNNs: Expressiveness & Generalization of Graph Neural Networks | slides, zoom | GIN & PAC-Bayes Bounds |
10 | Mar. 14, Mar. 16, Mar. 21 | Deep Learning with Latent Structures I: Discrete Latent Variable Models (RBMs), Contrastive Divergence, Amortized Inference, REINFORCE, Variance Reduction, Reparameterization, and the Wake-Sleep Algorithm | slides, zoom | RBMs & CD & NVIL & VAE & Wake-Sleep |
11 | Mar. 23 | Deep Learning with Latent Structures II: Stochastic Gradient Estimation | slides, zoom | Straight-through Estimator & Gumbel-Softmax & Gumbel-TopK |
12 | Mar. 28 | Deep Learning with Latent Structures II: Stochastic Gradient Estimation & Learning Discrete Probabilistic Models (Guest Lecture by Dr. Will Grathwohl) | Slides and zoom recording are only available on Piazza | RELAX & Oops I Took A Gradient |
13 | Mar. 30 | Deep Learning with Latent Structures III: Learning Latent Graph Structures | slides, zoom | NRI & Learning Discrete Structures for GNNs |
14 | Apr. 4, Apr. 6, Apr. 11 | Project Presentations | zoom | |
I am very open to auditing guests who are members of the UBC community (registered students, staff, and/or faculty); I would appreciate it if you email me first. If the in-person class runs out of space, I ask that you give priority to registered students.
While there is no required textbook, I recommend the following closely related books for further reading:
I also encourage self-motivated students to take a look at similar courses taught at other universities: