
Introduction to Deep Learning with Python

Samar Haider

University of Southern California

11/7/2019


Artificial intelligence, machine learning & deep learning


The AI universe

Goodfellow et al., “Deep Learning.” MIT Press (2016)


The AI universe

“What's the difference between AI and ML?”

“It's AI when you're raising money, it's ML when you're trying to hire people.”




Machine learning

Data → Learning Algorithm → Hypothesis

Input → Hypothesis → Prediction


Data → (engineer features) → Features → (learn mapping) → Labels


Deep learning

Data → (learn features) → Features → (learn mapping) → Labels


The rise of deep learning


Goodfellow et al., “Deep Learning.” MIT Press (2016)

Dean, “Large-Scale Deep Learning for Intelligent Computer Systems.” WSDM (2016)


  1. Bigger datasets
  2. More computational power
  3. Improvements in algorithms (due to 1 and 2)



“If big data is the new oil, deep learning is the new internal combustion engine.”

– Yann LeCun

(Director, Facebook AI Research)



“AI is the new electricity: Just as electricity transformed almost everything 100 years ago, today I actually have a hard time thinking of an industry that I don’t think AI will transform in the next several years.”

– Andrew Ng

(Founder, deeplearning.ai)



Neural networks


Biological neuron



Artificial neuron
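An artificial neuron computes a weighted sum of its inputs plus a bias, then passes the result through a nonlinear activation. A minimal sketch in plain Python, with made-up illustrative weights and inputs:

```python
import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs plus bias, squashed by a sigmoid activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid maps any real z into (0, 1)

# Example: three inputs with illustrative weights (not learned values)
output = neuron([1.0, 0.5, -1.0], [0.8, -0.2, 0.3], bias=0.1)
```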



Biological vs artificial neuron



Activation functions

Sze et al., “Efficient Processing of Deep Neural Networks: A Tutorial and Survey.” arXiv (2017)
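Three of the most common activation functions are the sigmoid, tanh, and ReLU. A quick sketch of each in plain Python:

```python
import math

def sigmoid(z):
    # Squashes any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    # Squashes any real number into (-1, 1)
    return math.tanh(z)

def relu(z):
    # Passes positive values through unchanged, zeroes out negatives
    return max(0.0, z)
```

ReLU is the default choice in most modern networks because its gradient does not vanish for positive inputs.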

LeCun et al., “Deep Learning.” Nature (2015)


A shallow neural network



A deep neural network



What we want


Good features/representations

Correct predictions

Goodfellow et al., “Deep Learning.” MIT Press (2016)


The need for depth

Goodfellow et al., “Deep Learning.” MIT Press (2016)


… and even more depth

Szegedy et al., “Going Deeper with Convolutions.” CVPR (2015)


Specialized architectures

Vision: Convolutional Neural Networks
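The core operation of a CNN is convolution: sliding a small filter over the input and taking dot products at each position. A bare-bones 2D sketch in plain Python (the image and filter values are illustrative):

```python
def conv2d(image, kernel):
    """Valid 2D convolution (strictly, cross-correlation, as in most DL libraries)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            # Dot product of the kernel with the window at position (i, j)
            sum(image[i + a][j + b] * kernel[a][b]
                for a in range(kh) for b in range(kw))
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

# A 3x3 "image" with a vertical edge, and a simple edge-detecting filter
image = [[0, 0, 1],
         [0, 0, 1],
         [0, 0, 1]]
kernel = [[-1, 1],
          [-1, 1]]
feature_map = conv2d(image, kernel)  # responds strongly where the edge is
```

In a real CNN the kernel values are learned, and many such filters are applied in parallel to produce a stack of feature maps.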



Language: Recurrent Neural Networks
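An RNN processes a sequence one step at a time, carrying a hidden state forward so that each output depends on everything seen so far. A minimal single-unit sketch (the weights here are arbitrary illustrative values, not learned ones):

```python
import math

def rnn(sequence, w_in=0.5, w_rec=0.9, b=0.0):
    """Single-unit recurrent network: h summarizes the sequence seen so far."""
    h = 0.0
    for x in sequence:
        # New state mixes the current input with the previous state
        h = math.tanh(w_in * x + w_rec * h + b)
    return h

state = rnn([1.0, 0.0, -1.0])
```

The same weights are reused at every time step, which is what lets an RNN handle sequences of any length.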



Deep learning


The learning process

  1. Pick a training example
  2. Make a prediction for it
  3. Compare your prediction with the truth (= error)
  4. Modify your weights in order to minimize this error
  5. Repeat until convergence
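The steps above can be sketched as a training loop. This toy example fits a single weight to the made-up rule y = 2x by stochastic gradient descent on squared error:

```python
# Toy training loop: learn w so that w * x approximates y = 2x
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0    # initial weight
lr = 0.05  # learning rate

for epoch in range(200):
    for x, y in data:          # 1. pick a training example
        pred = w * x           # 2. make a prediction for it
        error = pred - y       # 3. compare with the truth
        w -= lr * error * x    # 4. update the weight to reduce squared error
                               # 5. the outer loop repeats until convergence
```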



Minimizing the error



Gradient descent
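Gradient descent repeatedly steps in the direction of steepest decrease of the error. As a standalone illustration (not tied to any particular network), minimizing f(x) = (x − 3)², whose gradient is 2(x − 3):

```python
x = 0.0   # starting point
lr = 0.1  # learning rate (step size)

for step in range(100):
    grad = 2 * (x - 3)  # derivative of (x - 3)^2
    x -= lr * grad      # step downhill, against the gradient

# x converges toward the minimum at x = 3
```

Too large a learning rate overshoots the minimum; too small a rate converges slowly.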



Gradient descent in higher dimensions



Learning multiple layers



Forward pass



Backward propagation




Backpropagation algorithm in full

LeCun et al., “Deep Learning.” Nature (2015)
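As a concrete illustration, backpropagation on a tiny two-layer network (one hidden sigmoid unit feeding one linear output) is just the chain rule applied layer by layer. All values here are made up for illustration:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny network: x -> hidden sigmoid unit (weight w1) -> linear output (weight w2)
x, target = 1.0, 0.5
w1, w2 = 0.4, 0.6

for step in range(500):
    # Forward pass: compute activations layer by layer
    h = sigmoid(w1 * x)
    y = w2 * h
    loss = 0.5 * (y - target) ** 2

    # Backward pass: chain rule, from the loss back toward the input
    dy = y - target              # dL/dy
    dw2 = dy * h                 # dL/dw2
    dh = dy * w2                 # dL/dh
    dw1 = dh * h * (1 - h) * x   # dL/dw1, via the sigmoid derivative h(1-h)

    # Gradient step on both layers
    w1 -= 0.5 * dw1
    w2 -= 0.5 * dw2
```

The same pattern — forward pass, then gradients propagated backward through each layer — scales up to networks with millions of weights.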


Applications


Object Detection

Ren et al., “Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks.” NIPS (2015)


Scene segmentation

Badrinarayanan et al., “SegNet: A Deep Convolutional Encoder-Decoder Architecture…” PAMI (2016)


Super resolution

Dahl et al., “Pixel Recursive Super Resolution.” arXiv (2017)


Style transfer

Gatys et al., “A Neural Algorithm of Artistic Style.” arXiv (2015)


Image translation

Liu et al., “Unsupervised Image-to-Image Translation Networks.” NIPS (2017)


Image generation

Karras et al., “Progressive Growing of GANs for Improved Quality, Stability, and Variation.” arXiv (2017)



Learning word representations

Mikolov et al., “Efficient Estimation of Word Representations in Vector Space.” arXiv (2013)


Learning sentiment representations

Radford et al., “Learning to Generate Reviews and Discovering Sentiment.” arXiv (2017)


Writing stories

Radford et al., “Language Models are Unsupervised Multitask Learners.” OpenAI (2019)


Image captioning

Vinyals et al., “Show and Tell: A Neural Image Caption Generator.” CVPR (2015)


Visual question answering

Yang et al., “Stacked Attention Networks for Image Question Answering.” CVPR (2016)


Playing games

Silver et al., “Mastering the Game of Go with Deep Neural Networks and Tree Search.” Nature (2016)


Building better neural networks

Zoph et al., “Neural Architecture Search with Reinforcement Learning.” ICLR (2017)


Building better software

Kraska et al., “The Case for Learned Index Structures.” arXiv (2017)


What you need to get started with deep learning


These, pretty much



… plus a handful of other stuff


https://medium.com/towards-data-science/building-your-own-deep-learning-box-47b918aea1eb


Building a deep learning rig


https://pcpartpicker.com/list/FRp8XH


An alternative



Amazon Web Services




Deep learning software ecosystem


https://towardsdatascience.com/deep-learning-framework-power-scores-2018-23607ddf297a


A typical beginner stack




Another beginner stack



Deep learning in a day

  1. Create an AWS account
  2. Launch an EC2 instance
  3. SSH into your instance
  4. Launch a Jupyter Notebook
  5. ???
  6. Profit!




$ ssh -i ./aws-key.pem ubuntu@ec2-34-211-139-121.us-west-2.compute.amazonaws.com

$ jupyter notebook




A free alternative: Google Colab



Where to learn more


Courses

fast.ai

by Jeremy Howard



deeplearning.ai

by Andrew Ng


Books


Goodfellow, Bengio & Courville


Michael Nielsen


François Chollet


Papers


Yann LeCun, Yoshua Bengio, and Geoffrey Hinton, “Deep Learning.” Nature (2015)

Most Cited Deep Learning Papers

https://github.com/terryum/awesome-deep-learning-papers

Deep Learning Papers Reading Roadmap

https://github.com/songrotek/Deep-Learning-Papers-Reading-Roadmap


Demos


TensorFlow Playground

https://playground.tensorflow.org/

ConvNetJS

https://cs.stanford.edu/people/karpathy/convnetjs/

Quick, Draw!

https://quickdraw.withgoogle.com/


“Software is eating the world, but AI is going to eat software.”

– Jensen Huang

(CEO, Nvidia)


Thank you

@samarhdr