Deep Learning has transformed fields like computer vision, speech recognition, and natural language processing, becoming a cornerstone of modern machine learning and artificial intelligence. Its significance has made deep learning expertise one of the most sought-after skills in industry and academia.
This course provides a comprehensive exploration of foundational and advanced topics in deep learning. You will develop a solid understanding of key statistical learning concepts and dive into modern neural network architectures, including convolutional networks, recurrent networks, transformers, and graph neural networks. The course emphasizes hands-on learning, equipping you with practical experience in training deep learning models and mastering essential techniques such as backpropagation, batch normalization, and optimization strategies. A particular highlight of the course is its focus on generative models, introducing you to a range of approaches like autoregressive models (e.g., large language models), Variational Autoencoders (VAEs), and denoising diffusion models (DDPMs). These models are at the forefront of what is commonly called generative AI. Additionally, the curriculum integrates deep reinforcement learning, combining deep learning techniques with reinforcement learning methods to address complex, real-world problems.
Carefully designed to balance theoretical foundations with practical applications, this course prepares you to tackle advanced challenges in deep learning and apply your skills across various domains.
Instructor | Renjie Liao |
---|---|
TA | Qi Yan, Sadegh Mahdavi, Qihang Zhang, Felix Fu |
Time | 12:30pm to 2:00pm, Tue. and Thu. |
Location | Hugh Dempster Pavilion (DMP) - 310 |
Piazza | https://piazza.com/ubc.ca/winterterm22024/cpen455 |
Canvas | CPEN 455 201 2024W2 |
Tutorial | 1:00pm to 2:00pm Mon., Forest Sciences Centre (FSC) - 1005 |
Office Hour | 2:00pm to 3:00pm Wed., KAIS 3028 |
Email | renjie.liao@ubc.ca |
All course-related questions should be posted and handled via Piazza. Canvas is only used for submitting homework, assignments, and projects. Please avoid emailing me directly, as such emails are likely to be buried in my inbox.
All homework, assignments, and projects must be done individually. A flat 20% penalty (not prorated by the hour) is applied to any late submission, except under special circumstances such as illness. Submissions more than 3 days past the deadline will not be evaluated. For example, if your homework is late but submitted within 3 days of the deadline, you receive 80% of its grade; beyond 3 days, you receive a grade of 0.
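For clarity, here is a minimal sketch (in Python, purely illustrative and not part of any official grading script) of how the late policy above translates into a grade:

```python
def apply_late_policy(raw_grade: float, days_late: float) -> float:
    """Illustrative sketch of the late policy described above:
    a flat 20% penalty for any late submission (not prorated by the hour),
    and no credit for submissions more than 3 days past the deadline."""
    if days_late <= 0:
        return raw_grade        # on time: full grade
    if days_late > 3:
        return 0.0              # later than 3 days: not evaluated
    return 0.8 * raw_grade      # within 3 days: 80% of the grade

# Example: a homework worth 100 points submitted 2 days late scores 80.
assert apply_late_policy(100.0, 2) == 80.0
```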
You may use tools like ChatGPT for homework, assignments, and so on. If you do, we ask that you acknowledge it in your submitted materials and submit your prompts (e.g., screenshots) so that we can review how they were used. Ideally, you should restrict most uses to improving your English writing rather than obtaining answers directly.
UBC values academic integrity. Therefore, all students must understand the meaning and consequences of cheating, plagiarism and other academic offences under the Code of Student Conduct and Discipline.
This is a tentative schedule, which will likely change as the course goes on. Changes will be announced on Piazza and this website.
Lectures | Dates | Topic | Slides | Suggested Readings |
---|---|---|---|---|
Lecture 1 | Jan. 7 | Introduction | slides 1 | DL book: ch. 1 |
Lecture 2 | Jan. 9, Jan. 14 | Linear Models for Regression & Classification | slides 2 | DL book: ch. 5; PML1 book: ch. 10 & 11; PRML book: ch. 3 & 4 |
 | Jan. 10 | Release Homework 1 (due Jan. 31) | | |
Lecture 3 | Jan. 16 | Multilayered Perceptron (MLP), Batch Normalization, Dropout | slides 3 | DL book: ch. 6 & 7; PML1 book: ch. 13; PML2 book: ch. 16; PRML book: ch. 5 |
Lecture 4 | Jan. 21, Jan. 23, Jan. 28 | Back-Propagation, Optimization Methods w. Adaptive Learning Rate, Weight Initialization, Weight Decay, Early Stopping | slides 4.1, slides 4.2 | DL book: ch. 7 & 8; PML1 book: ch. 13; PML2 book: ch. 6; PRML book: ch. 5 |
 | Jan. 31 | Release Homework 2 (due Feb. 21) | | |
Lecture 5 | Jan. 30, Feb. 4, Feb. 6 | Invariance, Equivariance, Convolutions and Variants (Transposed, Dilated, Grouped, Separable), Pooling, CNNs (UNet, ResNet, MobileNet) | slides 5.1, slides 5.2 | DL book: ch. 9; PML1 book: ch. 14; PML2 book: ch. 16 |
Lecture 6 | Feb. 11 | Recurrent Neural Networks | slides 6 | |
 | Feb. 21 | Release Programming Assignment 1 (due Mar. 7) | | |
Lecture 7 | Feb. 13 | Graph Neural Networks | slides 7 | PML1 book: ch. 23; PML2 book: ch. 16 |
 | Mar. 3 | Release Course Project (due Apr. 15) | | |
Lecture 8 | Feb. 25, Feb. 27, Mar. 4 | Transformers | slides 8 | |
 | Mar. 7 | Release Programming Assignment 2 (due Mar. 21) | | |
Lecture 9 | Mar. 6, Mar. 11 | Large Language Models (LLMs) | slides 9 | PML1 book: ch. 15; PML2 book: ch. 22 |
Lecture 10 | Mar. 13, Mar. 18 | Autoencoders, Denoising Autoencoders, Variational Autoencoders (VAEs) | slides 10, slides 10 by Qihang | DL book: ch. 14; PML1 book: ch. 20; PML2 book: ch. 21 |
 | Mar. 21 | Release Programming Assignment 3 (due Apr. 4) | | |
Lecture 11 | Mar. 20, Mar. 25 | Denoising Diffusion Models: DDPMs, Score-based Models | slides 11 | PML2 book: ch. 25 |
Lecture 12 | Mar. 27, Apr. 1, Apr. 3, Apr. 8 | Deep Reinforcement Learning: MDPs, Bellman Equations, Q-Learning, Policy Gradient | slides 12.1, slides 12.2 | PML2 book: ch. 35 |
Tutorials | Dates | Topic | Slides | Suggested Readings |
---|---|---|---|---|
Tutorial 1 | Jan. 13 | Probability & Statistics | slides | |
Tutorial 2 | Jan. 20 | Linear Algebra & Matrix Calculus | slides | |
Tutorial 3 | Jan. 27 | Tensor Operations & Dataset & DataLoader | slides, Colab | PyTorch Official Tutorials |
Tutorial 4 | Feb. 3 | Autograd & Build Your Models | slides, Colab | |
Tutorial 5 | Feb. 10 | Training and Testing Your Model & Hyper-parameter Tuning | slides, Colab | Google Tuning Playbook |
Tutorial 6 | Feb. 24 | Course Project Introduction & Discussion on HW1 | slides | |
Tutorial 7 | Mar. 3 | Discussion on Quiz 1 | slides | |
Tutorial 8 | Mar. 10 | Hugging Face Transformers & PixelCNN | slides | |
Tutorial 9 | Mar. 17 | PixelCNN++ | slides | |
Tutorial 10 | Mar. 24 | Discussion on PA1 + PA2 | slides | |
Tutorial 11 | Mar. 31 | Discussion on PA3 + HW2 | N/A | |
Tutorial 12 | Apr. 7 | Discussion on Quiz 3 | N/A |
Auditing guests are very welcome if you are a member of the UBC community (registered student, staff, and/or faculty). I would appreciate it if you email me first. If the in-person class becomes too full and runs out of space, I ask that you please give priority to registered students.
While there is no required textbook, I recommend the following closely related books for further reading:
I also encourage self-motivated students to take a look at similar courses taught at other universities: