UC Berkeley CS194-080
Full Stack Deep Learning
Spring 2021
Deep learning is currently the best method for building models that learn from data. We first cover the principles and main building blocks of deep learning models. We then zoom out to cover the “full stack” of going from a promising experiment to a shipped product: project structure, useful tooling, data management, best practices for deployment, social responsibility, and finding a job or starting a venture.
Instructors: Sergey Karayev, Josh Tobin, and Pieter Abbeel
Our course will be entirely online (thanks, pandemic).
We will meet 5:00 - 8:00 pm every Tuesday according to the schedule below.
All synchronous meetings will occur on Zoom and will be recorded for those who can’t make it.
All communication will be done via Slack. You may send us a private message if you have a question that is relevant only to you, but please keep most communication in public channels so the whole course can benefit.
If you want to meet with one of us for 15 or 30 minutes of “office hours”, please send a private message on Slack.
Pieter’s office hours are best used for questions about research and career advice.
If there is a suggested reading, please read it after the lecture and before submitting the assignment.
Weekly assignments will be released at 8pm on Tuesday and due at 5pm the following Tuesday. No late submissions will be allowed. All assignments will be on Gradescope (you will be added to the course).
As part of the weekly assignments, we will work through 10 labs, in which we will build a complete deep learning project that understands the content of handwritten paragraphs.
Our computing environment for the labs will be cloud-based. You don’t need access to a GPU or to set up your own machine.
For the final project, you can pair up or work individually. First, you will submit a proposal for a project involving any part of the full stack of deep learning; it should take roughly 40 hours per person in total.
We have a list of project ideas that we will share. You’re also welcome to come up with your own. Feel free to submit your proposal as soon as you’re excited by an idea, and spend as much time as you want working on it!
Projects will be presented as five-minute videos with associated short reports (which we encourage you to post publicly online); open-sourcing the code is also highly encouraged.
For more details on the project, see this doc.
The final grade will be calculated by weighting each of the 8 weekly assignments at 7%, the final project proposal at 10%, and the final project at 34%. There will be no exam.
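For concreteness, here is that weighting written out (a small sketch; the symbols $A_i$, $P$, and $F$ are just our shorthand for your assignment, proposal, and project scores as fractions of full credit):

```latex
% Grade weighting as stated above: 8 assignments at 7% each,
% proposal at 10%, final project at 34% (weights sum to 100%).
% A_i, P, F are shorthand for scores as fractions of full credit.
\mathrm{Grade} \;=\; 0.07 \sum_{i=1}^{8} A_i \;+\; 0.10\,P \;+\; 0.34\,F,
\qquad 8 \times 7\% + 10\% + 34\% = 100\%.
```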
Berkeley students have access to recordings of our lectures in Canvas, available the morning after lecture.
We will release edited videos of our lectures every Monday starting February 1 (trailing this course by two weeks). Follow along if you’d like!
| Week | Lectures and Labs | Reading | Assignment |
| --- | --- | --- | --- |
| Week 1 - Jan 19 | Lecture 0: Introduction (Josh) [slides]<br>Lecture 1: Fundamentals of Deep Learning (Sergey)<br>Notebook: Coding a neural network from scratch (Sergey)<br>Lab 1: Setup and Intro (Sergey) | How the backpropagation algorithm works<br>Optional: | |
| Week 2 - Jan 26 | Lecture 2.A: Convolutional Neural Networks (Josh)<br>Lecture 2.B: Computer Vision Applications (Sergey)<br>Lab 2: CNNs (Sergey) | The Building Blocks of Interpretability<br>Brief Introduction to Neural Style Transfer<br>Optional: Diving deep into cross-entropy loss and softmax | |
| Week 3 - Feb 2 | Lecture 3: Recurrent Neural Networks and Natural Language Applications (Josh)<br>Lab 3: RNNs (Sergey) | The Unreasonable Effectiveness of Recurrent Neural Networks | |
| Week 4 - Feb 9 | Lecture 4: Transfer Learning and Transformers (Sergey)<br>Lab 4: Transformers (Sergey) | Notebook: Transformers [colab] | |
| Week 5 - Feb 16 | Lecture 5: Machine Learning Projects (Josh) | | |
| Week 6 - Feb 23 | Lecture 6: Infrastructure / Tooling (Sergey) | Machine Learning: The High-Interest Credit Card of Technical Debt | |
| Week 7 - Mar 2 | Lecture 7: Troubleshooting DNNs (Josh)<br>Lab 5: Experiment Management (Sergey) | | |
| Week 8 - Mar 9 | Lecture 8: Data Management (Sergey)<br>Lab 6: Data Labeling (Sergey) [video] [instructions] | | |
| Week 9 - Mar 16 | Lecture 9: AI Ethics (Sergey) | | Project Proposal due by end of March 19 |
| Week 10 - Mar 23 | 🏝 Spring Break 🏝 | | (Work on project) |
| Week 11 - Mar 30 | Lecture 10: Testing & Explainability (Josh) | | Work on project |
| Week 12 - Apr 6 | Lecture 11: Deployment & Monitoring (Josh) [video - part 1] [video - part 2] [slides]<br>Lab 8: Testing (Sergey) | | Work on project |
| Week 13 - Apr 13 | Lecture 12: Research Directions (Pieter)<br>Lab 9: Deployment (Sergey) | | Work on project |
| Week 14 - Apr 20 | Lecture 13: ML Teams and Startups (Josh)<br>Panel Discussion: Do I need a PhD to do Machine Learning? [video]<br>Lab 10: Monitoring (Sergey) [readme] | | Project due May 9 |
| Week 15 - Apr 27 | | | Work on project |
| Week 16 - May 4 | | | Work on project |
| Week 17 - May 11 | Project presentations | | |
This course was originally developed by the three of us as a weekend bootcamp. You can preview lectures by watching recordings from November 2019 on our online course website.
Fast.ai is a great free two-course sequence aimed at first getting hackers to train state-of-the-art models as quickly as possible, and only afterward delving into how things work under the hood. Highly recommended for anyone.
Dive into Deep Learning (https://d2l.ai) is a great free textbook with Jupyter notebooks for every part of deep learning.
NYU’s Deep Learning course has excellent PyTorch breakdowns of everything important going on in deep learning.
Stanford’s ML Systems Design course has lectures that parallel those in this course.
The Batch by Andrew Ng is a great weekly update on progress in the deep learning world.
/r/MachineLearning/ is the best community for staying up to date with the latest developments.