Going Deeper with GoogLeNet and CaffeJS
“Deep Learning in the Browser”
Christoph Körner
Slides available on bit.ly/2cwhdyG
About me
Agenda
Deep Learning Refresher
Ingredients for Deep Classification
Convolutions
Source: CS231n GitHub
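The sliding-window operation on this slide can be made concrete with a minimal sketch in plain JavaScript (the deck itself contains no code; single channel, no padding, stride 1, and, as in deep learning frameworks, the kernel is applied as cross-correlation without flipping):

```javascript
// Minimal single-channel "valid" convolution (no padding, stride 1).
// out[i][j] = sum over the kernel window of input * kernel.
function conv2d(input, kernel) {
  const kh = kernel.length, kw = kernel[0].length;
  const oh = input.length - kh + 1;
  const ow = input[0].length - kw + 1;
  const out = [];
  for (let i = 0; i < oh; i++) {
    const row = [];
    for (let j = 0; j < ow; j++) {
      let sum = 0;
      for (let ki = 0; ki < kh; ki++) {
        for (let kj = 0; kj < kw; kj++) {
          sum += input[i + ki][j + kj] * kernel[ki][kj];
        }
      }
      row.push(sum);
    }
    out.push(row);
  }
  return out;
}

// 3x3 input, 2x2 diagonal kernel -> 2x2 output
const y = conv2d(
  [[1, 2, 3],
   [4, 5, 6],
   [7, 8, 9]],
  [[1, 0],
   [0, 1]]
);
// y is [[6, 8], [12, 14]]
```

Note how the output shrinks to (width − kernel + 1) per dimension; real networks usually pad the input to keep the spatial size.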
Pooling
Source: CS231n GitHub
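Pooling can be sketched the same way (not from the slides; a minimal 2×2 max pooling with stride 2, the most common variant):

```javascript
// 2x2 max pooling, stride 2: keep the strongest activation in each
// window, halving width and height (no parameters to learn).
function maxPool2x2(input) {
  const out = [];
  for (let i = 0; i + 1 < input.length; i += 2) {
    const row = [];
    for (let j = 0; j + 1 < input[0].length; j += 2) {
      row.push(Math.max(input[i][j], input[i][j + 1],
                        input[i + 1][j], input[i + 1][j + 1]));
    }
    out.push(row);
  }
  return out;
}

const pooled = maxPool2x2(
  [[ 1,  3,  2,  4],
   [ 5,  6,  7,  8],
   [ 9, 10, 11, 12],
   [13, 14, 15, 16]]
);
// pooled is [[6, 8], [14, 16]]
```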
Parameters in Dense Layers
Every neuron connects to all x · y · depth input values
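This count can be checked with quick arithmetic. The concrete numbers below are an illustrative assumption, chosen to match VGG's first fully connected layer (a 7×7×512 input volume feeding 4096 neurons):

```javascript
// A dense layer connects every input value to every neuron:
// weights = x * y * depth * neurons, plus one bias per neuron.
function denseParams(x, y, depth, neurons) {
  return x * y * depth * neurons + neurons;
}

// VGG-style fc layer: 7x7x512 input volume, 4096 neurons
const fc = denseParams(7, 7, 512, 4096);
// fc is 102764544 — over 100M parameters in a single layer
```

This is exactly why dense layers dominate the model sizes shown later in the deck.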
Parameters in Convolutions
filters kernels of size sx · sy · depth, weights shared across all spatial positions
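The same check for a convolutional layer, using an illustrative 3×3 configuration (not taken from the slides):

```javascript
// A conv layer learns `filters` kernels of size sx * sy * depth;
// the weights are shared across all spatial positions.
function convParams(sx, sy, depth, filters) {
  return sx * sy * depth * filters + filters; // + one bias per filter
}

// A 3x3 conv over a 512-channel input producing 512 feature maps
const conv = convParams(3, 3, 512, 512);
// conv is 2359808 — independent of the input's width and height
```

Compare this ~2.4M to the ~100M of a single dense layer: weight sharing is what keeps convolutional layers cheap.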
Deep Learning Architectures
Top-1 accuracy
Top-1 accuracy density
AlexNet: 62,378,344 params (250MB)
VGG (~400MB) and ResNet (~230MB)
GoogLeNet: 6,998,552 params (28MB)
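The file sizes on these slides follow directly from the parameter counts, assuming 32-bit (4-byte) float weights:

```javascript
// Model file size ≈ parameter count * 4 bytes (float32 weights)
const paramsToMB = (params) => params * 4 / 1e6;

const alexNetMB   = paramsToMB(62378344); // ≈ 250 MB
const googLeNetMB = paramsToMB(6998552);  // ≈ 28 MB
// GoogLeNet needs roughly 9x fewer parameters than AlexNet,
// which matters when the weights must be downloaded by a browser.
```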
GoogLeNet
“moving from fully connected
to sparsely connected architectures”
GoogLeNet
Inception Module (Network in Network)
Source: Princeton 2015
Ingredients for Inception Module
Ingredient 1: Hebbian Principle
“Cells that fire together, wire together”
Source: Siegrid Löwel about Hebb’s Principle
Idea of Local Neighborhood
Source: Princeton 2015
A Naive Approach (does not scale)
Source: Princeton 2015
Ingredient 2: Bottleneck Convolutions
Source: Eugenio Culurciello's blog
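The saving from a 1×1 bottleneck can be worked out with illustrative numbers (assumed for this sketch, not taken from the slides): a 5×5 convolution on a 256-channel input producing 64 maps, computed directly versus after a 1×1 convolution that first shrinks the depth to 64.

```javascript
// Weight count of a conv layer (biases ignored for simplicity)
const weights = (sx, sy, depth, filters) => sx * sy * depth * filters;

// Naive: 5x5 conv straight on the 256-channel input
const naive = weights(5, 5, 256, 64);                           // 409600

// Bottleneck: 1x1 conv to 64 channels first, then the 5x5 conv
const bottleneck = weights(1, 1, 256, 64) + weights(5, 5, 64, 64);
// bottleneck is 16384 + 102400 = 118784 — about 3.4x fewer weights
// (and proportionally fewer multiply-adds) for the same output shape.
```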
Inception Module
Source: Princeton 2015
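Putting both ingredients together, the weights of one module can be tallied. The filter counts below are those of GoogLeNet's inception (3a) module as given in the paper (192-channel input; biases ignored):

```javascript
// Weights (biases ignored) of GoogLeNet's inception (3a) module.
// Four parallel branches, concatenated along the depth axis.
const w = (sx, sy, depth, filters) => sx * sy * depth * filters;

const b1 = w(1, 1, 192, 64);                     // 1x1 branch
const b2 = w(1, 1, 192, 96) + w(3, 3, 96, 128);  // 1x1 reduce -> 3x3
const b3 = w(1, 1, 192, 16) + w(5, 5, 16, 32);   // 1x1 reduce -> 5x5
const b4 = w(1, 1, 192, 32);                     // 3x3 max pool -> 1x1 proj
const total = b1 + b2 + b3 + b4;
// total is 163328 — an entire module for a tiny fraction
// of a single dense layer's parameter budget.
```

The concatenated output depth is 64 + 128 + 32 + 32 = 256 feature maps.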
Extensions to v1
Source: Eugenio Culurciello's blog
CaffeJS - DL in the Browser
Source: CaffeJS
CaffeJS
Why CaffeJS
Using pre-trained Models
CaffeJS - Caffe Models in the Browser
CaffeJS vs. ConvNetJS
Demos with CaffeJS
Problems of CaffeJS
What's next
Some more useful resources
Thank you.