Livecoding Madness

Let's Build a Deep Learning Library

Joel Grus

@joelgrus

Research Engineer, Allen Institute for AI

about me

  • research engineer at AI2
  • wrote a book
  • good at Twitter
  • co-host the Adversarial Learning podcast

livecoding

  • Type hint all the things (see the sketch after this list)
  • Python 3.6
  • Type really fast
  • Talk really fast
  • CW: swearing-at-text-editor
  • Make a library
  • Use good abstractions
  • Use good variable names
  • Document what we're doing
  • Lean on mypy and pylint
  • (You) tell me when I screw up!
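
a quick taste of what "type hint all the things" can look like; Tensor and mse here are illustrative names for this sketch, not necessarily the ones we'll use:

import numpy as np

# in this sketch a Tensor is just a numpy array; the alias gives
# mypy (and readers) something meaningful to check against
Tensor = np.ndarray

def mse(predicted: Tensor, actual: Tensor) -> float:
    """Mean squared error: how far off are our predictions?"""
    return float(np.mean((predicted - actual) ** 2))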

deep learning in one slide

represent input data as multidimensional arrays

image = [[[255, 255, 0], [127, 127, 127], …]]

predict outputs using a (parameterized) deep neural network

loss function depends smoothly on the parameters + tells us how good our predictions are

use (calculus + greediness + cleverness) to find parameters that minimize "loss"
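
a hedged sketch of that recipe for the simplest possible model (one weight vector, mean squared error; everything here is illustrative, not the library we're about to build):

import numpy as np

# toy problem: learn w in y = x @ w from data
rng = np.random.RandomState(0)
x = rng.randn(100, 3)                  # input data as a 2-d array
true_w = np.array([1.0, -2.0, 3.0])
y = x @ true_w

w = np.zeros(3)                        # the parameters
learning_rate = 0.1

for _ in range(100):
    predicted = x @ w
    # loss = mean((predicted - y) ** 2) depends smoothly on w;
    # calculus gives us its gradient, greediness says step downhill
    grad = 2 * x.T @ (predicted - y) / len(x)
    w -= learning_rate * grad

print(w)    # should be close to true_w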

the plan

  1. Tensors
  2. Loss Functions
  3. Layers
  4. Neural Nets
  5. Optimizers
  6. Data
  7. Training
  8. End-to-end example: XOR (sketched after this list)
  9. End-to-end example: Fizz Buzz (yes, that again; its encoding is sketched below)
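
a hedged preview of the XOR example in plain numpy, before we have any of the abstractions above (the single 4-unit tanh hidden layer is an assumption that happens to suffice):

import numpy as np

# XOR: inputs are all 2-bit combinations, target is their XOR
inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
targets = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.RandomState(42)
# one hidden layer of 4 tanh units, then a linear output
w1 = rng.randn(2, 4)
b1 = np.zeros(4)
w2 = rng.randn(4, 1)
b2 = np.zeros(1)

learning_rate = 0.1
for _ in range(5000):
    # forward pass
    hidden = np.tanh(inputs @ w1 + b1)
    predicted = hidden @ w2 + b2

    # gradient of the mean-squared-error loss w.r.t. predictions
    grad_pred = 2 * (predicted - targets) / len(inputs)

    # backward pass (chain rule through each layer)
    grad_w2 = hidden.T @ grad_pred
    grad_b2 = grad_pred.sum(axis=0)
    grad_hidden = (grad_pred @ w2.T) * (1 - hidden ** 2)   # tanh'
    grad_w1 = inputs.T @ grad_hidden
    grad_b1 = grad_hidden.sum(axis=0)

    # greedy step downhill (what an Optimizer will encapsulate)
    w1 -= learning_rate * grad_w1
    b1 -= learning_rate * grad_b1
    w2 -= learning_rate * grad_w2
    b2 -= learning_rate * grad_b2

print(np.round(predicted))    # should match targets: 0, 1, 1, 0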
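
and for Fizz Buzz the interesting part is the data encoding; a sketch of one common framing (binary-encode the input number, one-hot the four output classes; num_digits=10 is an assumption):

import numpy as np

def binary_encode(x: int, num_digits: int = 10) -> np.ndarray:
    """Encode an integer as its low-order bits, one input per bit."""
    return np.array([(x >> d) & 1 for d in range(num_digits)], dtype=float)

def fizz_buzz_encode(x: int) -> np.ndarray:
    """One-hot over the four classes: number, 'fizz', 'buzz', 'fizzbuzz'."""
    if x % 15 == 0:
        return np.array([0, 0, 0, 1], dtype=float)
    if x % 5 == 0:
        return np.array([0, 0, 1, 0], dtype=float)
    if x % 3 == 0:
        return np.array([0, 1, 0, 0], dtype=float)
    return np.array([1, 0, 0, 0], dtype=float)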
