Deep Learning Adventures: TensorFlow in Practice - Presentation 3
A quick overview of Coursera's TensorFlow in Practice specialization course
Robert Kraig, David Patton, George Zoto
In the beginning...
Get to know our community
Hello, my name is ___ and this is my ___ meetup.
I enjoy ___ and I am interested in learning more about ___
Training set
Hello, my name is George and this is my 4th meetup. I enjoy applying deep learning to solve interesting problems and I am interested in learning more about data augmentation, transfer learning and NLP.
Not a typical Meetup… Get ready for a fun game
Attribution to Coursera and deeplearning.ai
Chapter 1 - TensorFlow in Practice Specialization
Setup
Colaboratory is a free Jupyter notebook environment that requires no setup and runs entirely in the cloud. You can write and execute code, save and share your analyses, and access powerful computing resources, all for free from your browser.
Course 2: Convolutional Neural Networks in TensorFlow
Week 1: Exploring a Larger Dataset
Week 2: Augmentation: A technique to avoid overfitting
Week 3: Transfer Learning
Week 4: Multiclass Classifications
Exploring a Larger Dataset
Human-Interactive Proof:
Kaggle Competition: Dogs vs Cats
https://www.kaggle.com/c/dogs-vs-cats/
Metric: Binary Classification Accuracy (i.e. no partial credit)
Winner: Pierre Sermanet → OverFeat
https://cilvr.nyu.edu/doku.php?id=software:overfeat:start
Exploring a Larger Dataset
Model is badly overfitting!
Training loss keeps decreasing while validation loss increases
Bug fix: stop training at the minimum validation loss
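The fix is early stopping: keep the weights from the epoch where validation loss bottoms out. A plain-Python sketch with hypothetical loss curves:

```python
# Hypothetical loss curves: training loss keeps falling while validation
# loss bottoms out and then rises again, the signature of overfitting.
train_loss = [0.90, 0.70, 0.50, 0.40, 0.30, 0.25, 0.20]
val_loss = [1.00, 0.80, 0.60, 0.55, 0.60, 0.70, 0.85]

def best_epoch(losses):
    """Epoch (0-based) with the minimum validation loss."""
    return min(range(len(losses)), key=losses.__getitem__)

stop_at = best_epoch(val_loss)
print(stop_at)  # 3: restore the weights saved at this epoch
```

In Keras the same behavior comes from `tf.keras.callbacks.EarlyStopping(monitor='val_loss', restore_best_weights=True)`.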
Course 2: Convolutional Neural Networks in TensorFlow
Week 1: Exploring a Larger Dataset
Week 2: Augmentation: A technique to avoid overfitting
Week 3: Transfer Learning
Week 4: Multiclass Classifications
Augmentation: A technique to avoid overfitting
tf.keras.preprocessing.image.ImageDataGenerator
zoom_range: Float or [lower, upper]. Range for random zoom. If a float, [lower, upper] = [1-zoom_range, 1+zoom_range].
fill_mode: One of {"constant", "nearest", "reflect", "wrap"}. Default is "nearest". Points outside the boundaries of the input are filled according to the given mode.
cval: Float or Int. Value used for points outside the boundaries when fill_mode = "constant".
preprocessing_function: function that will be applied on each input. The function will run after the image is resized and augmented.
original | constant (cval=0) | constant (cval=100) |
nearest | reflect | wrap |
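The fill modes pictured above can be reproduced with NumPy's padding modes on a 1-D row. Note that NumPy's own "reflect" excludes the edge value, so "symmetric" is the closer match to Keras's fill_mode="reflect":

```python
import numpy as np

row = np.array([1, 2, 3])

# np.pad analogues of ImageDataGenerator's fill_mode options (1-D for clarity):
print(np.pad(row, 2, mode="constant", constant_values=0))  # [0 0 1 2 3 0 0] ~ fill_mode="constant", cval=0
print(np.pad(row, 2, mode="edge"))       # [1 1 1 2 3 3 3] ~ fill_mode="nearest"
print(np.pad(row, 2, mode="symmetric"))  # [2 1 1 2 3 3 2] ~ fill_mode="reflect" (edge value mirrored)
print(np.pad(row, 2, mode="wrap"))       # [2 3 1 2 3 1 2] ~ fill_mode="wrap"
```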
Augmentation: A technique to avoid overfitting
Content added after Week 1 and 2, based on group feedback
Course 2: Convolutional Neural Networks in TensorFlow
Week 1: Exploring a Larger Dataset
Week 2: Augmentation: A technique to avoid overfitting
Week 3: Transfer Learning
Week 4: Multiclass Classifications
Transfer Learning
Extra Content - Classic Networks
Source: https://www.coursera.org/learn/convolutional-neural-networks/lecture/MmYe2/classic-networks
http://yann.lecun.com/exdb/publis/pdf/lecun-01a.pdf
1998
Extra Content - Classic Networks
Source: https://www.coursera.org/learn/convolutional-neural-networks/lecture/MmYe2/classic-networks
https://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf
2012
Extra Content - Classic Networks
Source: https://www.coursera.org/learn/convolutional-neural-networks/lecture/MmYe2/classic-networks
https://arxiv.org/pdf/1409.1556.pdf
2014
Extra Content - Classic Networks
Source: https://www.coursera.org/learn/convolutional-neural-networks/lecture/MmYe2/classic-networks
https://arxiv.org/pdf/1512.03385.pdf
2015
Extra Content - Classic Networks
Source: https://www.coursera.org/learn/convolutional-neural-networks/lecture/5WIZm/inception-network-motivation
https://arxiv.org/pdf/1409.4842.pdf
2014
Extra Content - Classic Networks
Source: https://arxiv.org/pdf/1605.07678.pdf
Transfer Learning
Source: http://www.image-net.org
The ILSVRC 2014 classification challenge involves the task of classifying the image into one of 1000 leaf-node categories in the ImageNet hierarchy. There are about 1.2 million images for training, 50,000 for validation and 100,000 images for testing. Each image is associated with one ground truth category, and performance is measured based on the highest scoring classifier predictions.
top-1 accuracy rate compares the ground truth against the first predicted class
top-5 error rate compares the ground truth against the first 5 predicted classes. An image is deemed correctly classified if the ground truth is among the top-5, regardless of its rank in them. The challenge uses the top-5 error rate for ranking purposes.
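Both metrics can be computed directly from the sorted scores. A NumPy sketch with made-up scores for a 5-class toy problem:

```python
import numpy as np

def in_top_k(scores, label, k):
    """True if the ground-truth class is among the k highest-scoring predictions."""
    return label in np.argsort(scores)[::-1][:k]

scores = np.array([0.05, 0.10, 0.50, 0.30, 0.05])  # made-up scores for 5 classes
label = 3                                          # ground-truth class

print(in_top_k(scores, label, k=1))  # False: class 2 scores highest, so a top-1 error
print(in_top_k(scores, label, k=2))  # True: counted as correct under top-k scoring
```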
Transfer Learning
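The core recipe, freeze a pretrained feature extractor and train only a new classifier head, can be sketched without TensorFlow. Here a fixed random projection stands in for the frozen InceptionV3 base used in the course; only the logistic head is trained (toy data, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pretrained base: a fixed, never-updated projection.
# In Keras this role is played by a pretrained model with layer.trainable = False.
W_frozen = rng.normal(size=(4, 8))
X = rng.normal(size=(64, 4))              # toy inputs
y = (X[:, 0] > 0).astype(float)           # toy binary labels

features = np.maximum(X @ W_frozen, 0.0)  # frozen feature extractor (ReLU)

# Train only the new head (logistic regression) on the frozen features.
w, b = np.zeros(8), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(features @ w + b)))  # sigmoid head
    grad = p - y                                   # dBCE/dlogit per example
    w -= 0.1 * features.T @ grad / len(y)          # only the head moves;
    b -= 0.1 * grad.mean()                         # W_frozen is never touched

acc = (((features @ w + b) > 0) == (y > 0.5)).mean()
print(acc)  # head-only training fits the toy task well above chance
```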
Course 2: Convolutional Neural Networks in TensorFlow
Week 1: Exploring a Larger Dataset
Week 2: Augmentation: A technique to avoid overfitting
Week 3: Transfer Learning
Week 4: Multiclass Classifications
Multiclass Classifications
Loss Functions for Multi-Class:
One-Hot Encoding: categorical_crossentropy
Integer Representation: sparse_categorical_crossentropy
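The two losses compute the same cross-entropy; they differ only in how the label is supplied. A NumPy sketch for a single example (not the Keras internals, which add clipping and batching):

```python
import numpy as np

probs = np.array([0.1, 0.7, 0.2])    # softmax output for one 3-class example
one_hot = np.array([0.0, 1.0, 0.0])  # one-hot label   -> categorical_crossentropy
label = 1                            # integer label   -> sparse_categorical_crossentropy

cce = -np.sum(one_hot * np.log(probs))  # one-hot form
scce = -np.log(probs[label])            # integer form: same value, no one-hot needed

print(np.isclose(cce, scce))  # True
```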
Multiclass Classifications
The American Sign Language letter database of hand gestures represents a multi-class problem with 24 classes of letters (excluding J and Z, which require motion).
Source: https://www.kaggle.com/datamunge/sign-language-mnist
Course 2: Convolutional Neural Networks in TensorFlow
Week 1: Exploring a Larger Dataset
Week 2: Augmentation: A technique to avoid overfitting
Week 3: Transfer Learning
Week 4: Multiclass Classifications
Extra Content - Regularization
Extra Content - Batch Normalization
Extra Content - Overfitting in Neural Networks
Source: https://www.tensorflow.org/tutorials/keras/overfit_and_underfit
https://ml-cheatsheet.readthedocs.io/en/latest/regularization.html
Total params: Tiny 0.5K, Small 0.7K, Medium 10K, Large 800K
The Higgs dataset: 11M samples, 28 features, binary classification
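The L2 regularization covered above adds a weight penalty to the data loss, so the larger models cannot use big weights to memorize the training set for free. A tiny NumPy sketch with hypothetical numbers:

```python
import numpy as np

weights = np.array([0.5, -1.2, 3.0])  # hypothetical layer weights
base_loss = 0.40                      # hypothetical data loss (e.g. cross-entropy)
l2 = 0.01                             # regularization strength (lambda)

penalty = l2 * np.sum(weights ** 2)   # L2 penalty: lambda * sum(w^2)
total_loss = base_loss + penalty
print(round(total_loss, 4))  # 0.5069
```

In Keras the same effect comes from `kernel_regularizer=tf.keras.regularizers.l2(0.01)` on a layer.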
Content added after Week 3 and 4, based on group feedback
Course 2: Convolutional Neural Networks in TensorFlow - Questions
Check out these resources
Check out these events and Meetups
Source:
https://www.youtube.com/watch?v=zGFKSQlef_0
https://www.meetup.com/Bethesda-Artificial-Intelligence-Meetup/events/268561559/
https://www.meetup.com/DC-NLP/events/270026712/
https://www.meetup.com/Washington-Quantum-Computing-Meetup/events/269752061/
https://pennylane.ai/qml/demonstrations.html
https://www.meetup.com/QuantUniversity-Meetup-Washington-DC/events/270322193
https://www.meetup.com/ArtificialIntelligenceAndMachineLearning/events/268478285/
https://www.meetup.com/Machine-Learning-Paper-Club/events/270046053
Check out this new certification and new event
Let's keep it up and continue our TensorFlow journey
Time for a fun game
Practice here or use Flashcards: https://quizizz.com/join/quiz/5eae10a807df73001b0c59a4/start?from=soloLinkShare&referrer=5d921444d0fa99001a135336
Questions
Discussion