
Advanced deep learning techniques have revolutionized the field, enabling remarkable progress across various applications. This course will provide a comprehensive understanding of the latest models and methods that are shaping the future of deep learning, with a particular focus on probabilistic and geometric deep learning, and deep reinforcement learning.
We will cover the following advanced topics:
- Geometric Deep Learning: Graph Neural Networks (GNNs), Transformers, Group Equivariant Networks.
- Probabilistic Deep Learning: Large Language Models (LLMs), VAEs, Flow Models, Diffusion Models.
- Deep Reinforcement Learning: Policy gradient methods.
Students should ask all course-related questions on Piazza. We will use Canvas to handle the submission and evaluation of all reports and project-related files.
| Instructor | Renjie Liao |
|---|---|
| TAs | Yuanpei Gao |
| Section 1 | 1:30pm to 3:00pm, Monday |
| Section 2 | 1:30pm to 3:00pm, Wednesday |
| Location | Section 1: Room 116, Hebb Building (HEBB) |
| Location | Section 2: Room 103, Chemical and Biological Engineering Building (CHBE) |
| Piazza | https://piazza.com/ubc.ca/winterterm22025/eece571f |
| Office Hour | 1:00pm to 2:00pm, Tuesday, KAIS 3047 (Ohm) |
| Email | renjie.liao@ubc.ca |
Grades will be based on:
Students can work on projects individually or in groups of up to four (groups should be formed as early as possible). Students are strongly encouraged to form groups via, e.g., discussion on Piazza. However, a larger group will be expected to accomplish more than a smaller group or an individual. All students in a group will receive the same grade. Students are allowed to undertake a research project that is related to their thesis or other external projects, but the work done for this course must represent substantial additional work and cannot be submitted for grades in another course.
The grade will depend on the quality of your research ideas, how well you present them in the report, how clearly you position your work relative to prior literature, how illuminating and/or convincing your experiments are, and how well-supported your conclusions are. Full marks will require a novel contribution.
Each group of students will write a short (>=2 pages) research project proposal, which ideally will be structured similarly to a standard paper. You don't have to do exactly what your proposal claims; the point of the proposal is mainly to give you a plan and to allow me to give you feedback. Students will give a short presentation (roughly 5 minutes for an individual, 10 to 15 minutes for a larger group) on their projects towards the end of the course. At the end of the term, every group needs to submit a project report (6 to 8 pages).
All reports (i.e., paper reading report, proposal, peer-review report, and final project report) must be written in NeurIPS conference format and must be submitted as PDF files.
Late work will automatically be subject to a 20% penalty and can be submitted up to 3 days after the deadline.
UBC values academic integrity. Therefore, all students must understand the meaning and consequences of cheating, plagiarism and other academic offences under the Code of Student Conduct and Discipline.
It is the responsibility of each student to understand the policy for each piece of course work, to ask the instructor questions if anything is unclear, and to carefully acknowledge all sources (papers, code, books, websites, individual communications) using an appropriate referencing style when submitting work.
This is a tentative schedule, which will likely change as the course goes on.
| # | Dates | Lecture Topic | Lecture Slides | Suggested Readings |
|---|---|---|---|---|
| 1 | Jan. 5 | Introduction to Deep Learning | slides | Chapter 13, 14 of PML1 book & DL book |
| 2 | Jan. 7, Jan. 12, Jan. 14 | Invariance, Equivariance, and Deep Learning Models for Sets & Sequences | slides | DeepSets & Transformers & PreNorm & VisionTransformers & SwinTransformers & Chapter 15 of PML1 book |
| 3 | Jan. 19 | Paper Presentations: 1. PointNet++ 2. Point Transformer V3 | slides 1, slides 2 | |
| 4 | Jan. 21 | Paper Presentations: 3. Mamba 4. xLSTM | slides 3, slides 4 | |
| 5 | Jan. 26 | Graph Neural Networks: Message Passing & Graph Convolution Models | slides I, slides II | Part II of GRL book & Chapter 23 of PML book & Chapter 4 of GNN book & GNNs & GGNNs & GAT & Graphormer & GPS & GCNs & ChebyNet & LanczosNet & SignNet & Specformer |
| 6 | Jan. 28, Feb. 4 | Group Equivariant Deep Learning | slides I, slides II, slides III, slides IV | UvAGEDL |
| 7 | Feb. 9 | Paper Presentations: 5. ViT Registers 6. G-CNNs | slides 5, slides 6 | |
| 8 | Feb. 11 | Paper Presentations: 7. Tensor-Field Networks 8. SE(3)-Transformers | slides 7, slides 8 | |
| 9 | Feb. 23 | Paper Presentations: 9. EGNNs 10. LieTransformers | slides 9, slides 10 | |
| 10 | Feb. 25 | Autoregressive Models & LLMs | slides | BERT & GPT3 & T5 & Scaling Laws & LoRA |
| 11 | Mar. 2 | Paper Presentations: 11. TIT 12. GRPO | slides 11, slides 12 | |
| 12 | Mar. 4 | Paper Presentations: 13. DPO 14. Learning Dynamics of DPO | slides 13, slides 14 | |
| 13 | Mar. 9 | Paper Presentations: 15. KTO 16. RLVR I | slides 15, slides 16 | |
| 14 | Mar. 11 | Paper Presentations: 17. RLVR II 18. VAR | slides 17, slides 18 | |
| 15 | Mar. 16 | AEs and VAEs | slides | Chapter 13, 14, 20 of DL book & AE & DAE & VAE & GraphVAE |
| 16 | Mar. 18 | Diffusion Models | slides | Score-based Models & ScoreSDE & DDPM & DDIM & DPM++ |
| 17 | Mar. 23 | Flow Models | slides | Flow Matching & Rectified Flow & Stochastic Interpolants |
| 18 | Mar. 25 | Paper Presentations: 19. Rectified Flow 20. OT-CFM | slides 19, slides 20 | |
| 19 | Mar. 30 | Paper Presentations: 21. MDM 22. MDM Ordering | slides 21, slides 22 | |
| 20 | Apr. 1 | Paper Presentations: 23. DMD2 24. FlowEdit | slides 23, slides 24 | |
| 21 | Apr. 6, Apr. 8 | Project Presentations | | |
I am very open to auditing guests who are members of the UBC community (registered students, staff, and/or faculty). I would appreciate it if you email me first. If the in-person class is too full and running out of space, I would ask that you please allow registered students to attend.
While there is no required textbook, I recommend the following closely related ones for further reading:
In particular, for diffusion models, I recommend the following textbook covering the relevant mathematical background:
I also encourage self-motivated students to take a look at similar courses taught at other universities: