
UC Berkeley CS194-080

Full Stack Deep Learning

Spring 2021

Overview

Deep learning is currently the best method for building models that learn from data. We first cover the principles and main building blocks of deep learning models. We then zoom out to cover the “full stack” of going from a promising experiment to a shipped product: project structure, useful tooling, data management, best practices for deployment, social responsibility, and finding a job or starting a venture.

Instructors: Sergey Karayev, Josh Tobin, and Pieter Abbeel

Our course will be entirely online (thanks, pandemic).

We will meet 5:00 - 8:00 pm every Tuesday according to the schedule below.

All synchronous meetings will occur on Zoom and will be recorded for those who can’t make it.

All communication will be done via Slack. You may send us a private message if you have a question that is relevant only to you, but please keep most communication public to the course.

If you want to meet with one of us for 15 or 30 minutes of “office hours”, please send a private message on Slack.

Pieter’s OH are best used for questions about research and career advice.

Assignments and Labs

If there is a suggested reading, please read it after the lecture and before submitting the assignment.

Weekly assignments will be released at 8 pm on Tuesday and due at 5 pm the following Tuesday. No late submissions will be allowed. All assignments will be on Gradescope (you will be added to the course).

As part of the weekly assignments, we will go through 10 labs, in which we will build a complete deep learning project for understanding the content of handwritten paragraphs.

Our computing environment for the labs will be cloud-based. You don’t need access to a GPU or to set up your own machine.

Final Project

For the final project, you can pair up or work individually. First, you will submit a proposal for a project, involving any part of the full stack of deep learning, that you expect to take roughly 40 hours per person in total.

We have a list of project ideas that we will share. You’re also welcome to come up with your own. Feel free to submit your proposal as soon as you’re excited by an idea, and spend as much time as you want working on it!

Projects will be presented as five-minute videos with short accompanying reports, which we encourage you to post publicly online. Open-sourcing your code is also strongly recommended.

For more details on the project, see this doc.

Grading

The final grade will be calculated by weighting each of the 8 assignments at 7%, the final project proposal at 10%, and the final project at 34%. There will be no exam.
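As a quick sanity check on the arithmetic (8 × 7% + 10% + 34% = 100%), here is a minimal Python sketch of that weighting. The function name, variable names, and the assumption that all scores are on a 0-100 scale are ours for illustration only; this is not an official grading script.

# Minimal sketch of the grade weighting described above (illustrative, not official).
# Assumes every score is on a 0-100 scale.

ASSIGNMENT_WEIGHT = 0.07   # each of the 8 weekly assignments
PROPOSAL_WEIGHT = 0.10     # final project proposal
PROJECT_WEIGHT = 0.34      # final project

def final_grade(assignment_scores, proposal_score, project_score):
    assert len(assignment_scores) == 8, "expected 8 weekly assignment scores"
    total = (sum(ASSIGNMENT_WEIGHT * s for s in assignment_scores)
             + PROPOSAL_WEIGHT * proposal_score
             + PROJECT_WEIGHT * project_score)
    return round(total, 2)   # weights sum to 8 * 0.07 + 0.10 + 0.34 = 1.00

print(final_grade([100] * 8, 100, 100))   # full marks everywhere -> 100.0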

Private Lecture Videos

Berkeley students have access to recordings of our lectures on Canvas, available the morning after each lecture.

Public Lecture Videos

We will release edited videos of our lectures every Monday starting February 1 (trailing this course by two weeks). Follow along if you’d like!

Schedule

Each week below lists the lectures and labs, suggested reading, and the assignment due.

Week 1 - Jan 19

Lecture 0: Introduction (Josh)

[slides]

Lecture 1: Fundamentals of Deep Learning (Sergey)

[video] [slides]

Notebook: Coding a neural network from scratch (Sergey)

[video] [colab]

Lab 1: Setup and Intro (Sergey)

[video] [readme] [slides]

How the backpropagation algorithm works

Optional:

FastAI Book: Foundations Chapter

Assignment 1

Week 2 - Jan 26

Lecture 2.A: Convolutional Neural Networks (Josh)

[video] [slides]

Lecture 2.B: Computer Vision Applications (Sergey)

[video] [slides]

Lab 2: CNNs (Sergey)

[video] [readme]

The Building Blocks of Interpretability

Brief Introduction to Neural Style Transfer

Optional:

Diving deep into cross-entropy loss and softmax

Assignment 2

Week 3 - Feb 2

Lecture 3: Recurrent Neural Networks and Natural Language Applications (Josh)

[video] [slides]

Lab 3: RNNs (Sergey)

[video] [readme]

The Unreasonable Effectiveness of Recurrent Neural Networks

Attention Craving RNNS: Building Up To Transformer Networks

Assignment 3

Week 4 - Feb 9

Lecture 4: Transfer Learning and Transformers (Sergey)

[video] [slides]

Lab 4: Transformers (Sergey)

[video] [readme]

Transformers from Scratch

Notebook: Transformers

[colab]

MinGPT

Assignment 4

Week 5 - Feb 16

Lecture 5: Machine Learning Projects (Josh)

[video] [slides]

Rules of Machine Learning

Assignment 5

Week 6 - Feb 23

Lecture 6: Infrastructure / Tooling (Sergey)

[video] [slides]

Machine Learning: The High-Interest Credit Card of Technical Debt

Assignment 6

Week 7 - Mar 2

Lecture 7: Troubleshooting DNNs (Josh)

[video] [slides]

Lab 5: Experiment Management (Sergey)

[video] [readme]

Why is Machine Learning “Hard”?

Assignment 7

Week 8 - Mar 9

Lecture 8: Data Management (Sergey)

[video] [slides]

Lab 6: Data Labeling (Sergey)

[video] [instructions]

Emerging Architectures for Modern Data Infrastructure

Assignment 8

Week 9 - Mar 16

Lecture 9: AI Ethics (Sergey)

[video] [slides]

Project proposal due by the end of the day on March 19

Week 10 - Mar 23

🏝 Spring Break 🏝

(Work on project)

Week 11 - Mar 30

Lecture 10: Testing & Explainability (Josh)

[video] [slides]

Lab 7: Paragraph Recognition (Sergey)

[video] [readme]

Work on project

Week 12 - Apr 6

Lecture 11: Deployment & Monitoring (Josh)

[video - part 1] [video - part 2] [slides]

Lab 8: Testing (Sergey)

[video] [readme]

The ML Test Score: a Rubric for Production Readiness

Work on project

Week 13 - Apr 13

Lecture 12: Research Directions (Pieter)

[video] [slides]

Lab 9: Deployment (Sergey)

[video] [readme]

Work on project

Week 14 - Apr 20

Lecture 13: ML Teams and Startups (Josh)

[video] [slides]

Panel Discussion: Do I need a PhD to do Machine Learning? [video]

Lab 10: Monitoring (Sergey)

[readme]

Project due May 9

Week 15 - Apr 27

Work on project

Week 16 - May 4

Work on project

Week 17 - May 11

Project presentations

Other resources

This course was originally developed by the three of us as a weekend bootcamp. You can preview lectures by watching recordings from November 2019 on our online course website.

Fast.ai is a great free two-course sequence aimed at first getting hackers to train state-of-the-art models as quickly as possible, and only afterward delving into how things work under the hood. Highly recommended for anyone.

https://d2l.ai is a great free textbook with Jupyter notebooks for every part of deep learning.

NYU’s Deep Learning course has excellent PyTorch breakdowns of everything important going on in deep learning.

Stanford’s ML Systems Design course has lectures that parallel those in this course.

The Batch by Andrew Ng is a great weekly update on progress in the deep learning world.

/r/MachineLearning/ is the best community for staying up to date with the latest developments.