Advanced deep learning techniques have revolutionized the field, enabling remarkable progress across various applications. This course will provide a comprehensive understanding of the latest models and methods that are shaping the future of deep learning, with a particular focus on probabilistic and geometric deep learning, and deep reinforcement learning.
We will cover the following advanced topics:
Geometric Deep Learning: Graph Neural Networks (GNNs), Transformers, Group Equivariant Networks (see the minimal message-passing sketch after this list).
Probabilistic Deep Learning: Large Language Models (LLMs), VAEs, EBMs, Flow Models, Diffusion Models.
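For a flavour of the geometric deep learning material, here is a minimal sketch of a single message-passing GNN layer in plain NumPy. It is an illustrative toy only, not course code: the function name, shapes, sum aggregation, and ReLU nonlinearity are my own choices, and the course covers much richer variants (attention-based, spectral, and group-equivariant models).

```python
# A minimal sketch (not course code) of one message-passing GNN layer,
# written in plain NumPy for illustration; all names and shapes are assumptions.
import numpy as np

def message_passing_layer(H, A, W_self, W_neigh):
    """One round of message passing on a graph.

    H       : (n, d)   node features
    A       : (n, n)   adjacency matrix (0/1, no self-loops)
    W_self  : (d, d')  weights applied to each node's own features
    W_neigh : (d, d')  weights applied to the aggregated neighbour messages

    Returns (n, d') updated node features. Because neighbour aggregation is a
    sum, relabelling the nodes only permutes the output rows in the same way
    (permutation equivariance).
    """
    messages = A @ H                       # sum of neighbour features per node
    H_new = H @ W_self + messages @ W_neigh
    return np.maximum(H_new, 0.0)          # ReLU nonlinearity

# Tiny usage example: a 3-node path graph with 4-dimensional node features.
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
W_self, W_neigh = rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
print(message_passing_layer(H, A, W_self, W_neigh).shape)  # (3, 8)
```

Running the snippet prints (3, 8): three nodes, each with an 8-dimensional updated feature vector. Relabelling the nodes simply permutes the output rows, which is the equivariance property emphasized throughout the geometric deep learning lectures.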
Students should ask all course-related questions on Piazza. We will use Canvas to handle the submission and evaluation of all reports and project-related files.
Instructor | Renjie Liao |
---|---|
TAs | Yuanpei Gao |
Section 1 | 1:30pm to 3:00pm, Monday |
Section 2 | 1:30pm to 3:00pm, Wednesday |
Location | Room 4018, Orchard Commons (ORCH) |
Piazza | https://piazza.com/ubc.ca/winterterm12024/eece571f |
Office Hour | 1:30pm to 2:30pm, Tuesday, KAIS 3047 (Ohm) |
Email | renjie.liao@ubc.ca |
Grades will be based on:
Students can work on projects individually or in groups of up to four (groups should be formed as early as possible). Students are strongly encouraged to form groups, e.g., by discussing on Piazza. However, a larger group is expected to accomplish more than a smaller group or an individual. All students in a group will receive the same grade. Students may undertake a research project related to their thesis or other external projects, but the work done for this course must represent substantial additional work and cannot be submitted for grades in another course.
The grade will depend on the quality of your research ideas, how well you present them in the report, how clearly you position your work relative to prior literature, how illuminating and/or convincing your experiments are, and how well-supported your conclusions are. Full marks will require a novel contribution.
Each group of students will write a short (>=2 pages) research project proposal, which ideally will be structured similarly to a standard paper. You don’t have to do exactly what your proposal claims; the point of the proposal is mainly to give you a plan and to allow me to give you feedback. Each group will give a short presentation (roughly 5 minutes for an individual, 10 to 15 minutes for a larger group) on their project towards the end of the course. At the end of the term, every group must submit a project report (6~8 pages).
All reports (i.e., paper reading report, proposal, peer-review report, and final project report) must be written in the NeurIPS conference format and must be submitted as PDFs.
Late work is automatically subject to a 20% penalty and can be submitted up to 3 days after the deadline.
UBC values academic integrity. Therefore, all students must understand the meaning and consequences of cheating, plagiarism and other academic offences under the Code of Student Conduct and Discipline.
It is the responsibility of each student to understand the policy for each piece of course work, to ask the instructor if anything is unclear, and to carefully acknowledge all sources (papers, code, books, websites, individual communications) using an appropriate referencing style when submitting work.
This is a tentative schedule, which will likely change as the course goes on.
# | Dates | Lecture Topic | Lecture Slides | Suggested Readings |
---|---|---|---|---|
1 | Sep. 4 | Introduction to Deep Learning | slides | Chapter 13, 14 of PML1 book & DL book |
2 | Sep. 9, Sep. 11, Sep. 16, Sep. 18 | Geometric Deep Learning: Invariance, Equivariance, and Deep Learning Models for Sets & Sequences | slides | DeepSets & Transformers & PreNorm & VisionTransformers & SwinTransformers & Chapter 15 of PML1 book |
3 | Sep. 23, Sep. 25, Oct. 2 | Geometric Deep Learning: Graph Neural Networks: Message Passing Models | slides | Part II of GRL book & Chapter 23 of PML book & Chapter 4 of GNN book & GNNs & GGNNs & GAT & Graphormer & GPS |
4 | Oct. 7, Oct. 9 | Geometric Deep Learning: Graph Neural Networks: Graph Convolution Models | slides | Part II of GRL book & Chapter 23 of PML book & Chapter 4 of GNN book & GCNs & ChebyNet & LanczosNet & SignNet & Specformer |
5 | Oct. 16, Oct. 21 | Group Deep Learning: Regular Group Convolutional Neural Networks | slides I & slides II | UvAGEDL |
6 | Oct. 21, Oct. 23 | Group Deep Learning: Steerable Group Convolutional Neural Networks | slides I & slides II | UvAGEDL |
7 | Oct. 28 | Probabilistic Deep Learning: LLMs | slides | BERT & GPT3 & T5 & Scaling Laws & LoRA |
8 | Oct. 30, Nov. 4 | Probabilistic Deep Learning: AEs and VAEs | slides | Chapter 13, 14, 20 of DL book & AE & DAE & VAE & GraphVAE |
9 | Nov. 4, Nov. 6 | Probabilistic Deep Learning: Energy-based Models (EBMs) | slides | Chapter 20 of DL book & RBMs & CD & DeepEBMs & Langevin Monte Carlo |
10 | Nov. 18, Nov. 20 | Probabilistic Deep Learning: Diffusion Models | slides | Score-based Models & ScoreSDE & DDPM & DDIM & DPM++ |
11 | Nov. 25, Nov. 27 | Probabilistic Deep Learning: Flow Models | slides | Flow Matching & Rectified Flow & Stochastic Interpolants |
12 | Dec. 2, Dec. 4 | Project Presentation | | |
I am very open to auditing guests who are members of the UBC community (registered students, staff, and/or faculty); I would appreciate it if you emailed me first. If the in-person class is too full and running out of space, please allow registered students to attend.
While there is no required textbook, I recommend the following closely related books for further reading:
In particular, for diffusion models, I recommend the following textbook for the relevant mathematical background:
I also encourage self-motivated students to take a look at similar courses taught at other universities: