CS 451 Quiz 31
Hyperparameter tuning and regularization
Two of the following 4 terms are synonyms. Which two?
Development ("dev") set
An important guideline is to make sure that the dev set and the test set come from the same distribution.
The "basic recipe for machine learning" checks for
high bias (using dev set performance) and high variance (using training set performance)
high bias (using training set performance) and high variance (using dev set performance)
The "basic recipe for machine learning" says we're done if the answers to (1) high bias? and (2) high variance? are:
(1) No (2) No
(1) Yes (2) No
(1) No (2) Yes
(1) Yes (2) Yes
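The recipe above can be sketched as a toy decision rule: high bias is judged from training-set performance, high variance from the train/dev gap. The function name and thresholds below are illustrative assumptions, not part of the recipe itself.

```python
def diagnose(train_err, dev_err, bias_threshold=0.05, gap_threshold=0.02):
    """Toy 'basic recipe' check: high bias from training error,
    high variance from the train/dev error gap.
    Thresholds are illustrative assumptions."""
    high_bias = train_err > bias_threshold
    high_variance = (dev_err - train_err) > gap_threshold
    return high_bias, high_variance

# We're done only when both answers are "No":
print(diagnose(0.01, 0.015))  # → (False, False): done
print(diagnose(0.15, 0.16))   # high bias → bigger network / train longer
print(diagnose(0.01, 0.11))   # high variance → more data / regularization
```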
What is Dropout?
A technique for speeding up training
A technique for regularization
A technique for reducing bias
How is "inverted dropout" implemented? Assume we use "keep_prob" P (e.g. P = 0.8), so a fraction F = 1 - P of the nodes drops out (e.g. F = 20%).
(1) For each training example, select F of the nodes randomly and set their activation to 0
(2) Select F of the nodes randomly, then set their activation to 0 for all training examples
(3) Same as (1) but also divide all activations by P
(4) Same as (2) but also divide all activations by P
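A minimal NumPy sketch of option (3), inverted dropout for a single layer's activations; the function name and shapes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def inverted_dropout(a, keep_prob=0.8):
    # Per training example, zero a random fraction (1 - keep_prob)
    # of the nodes -- a fresh random mask on every forward pass.
    mask = rng.random(a.shape) < keep_prob
    # Divide the survivors by keep_prob so the expected activation
    # is unchanged, which is why no rescaling is needed at test time.
    return a * mask / keep_prob

a = np.ones((4, 5))                  # activations: 4 examples, 5 hidden units
out = inverted_dropout(a, keep_prob=0.8)
# each entry is either 0 (dropped) or 1 / 0.8 = 1.25 (kept, scaled up)
```

Because of the division by `keep_prob`, the scaling is already "baked in" during training, which matches the answer to the next question: dropout is switched off at test time.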
Do you use dropout at test time?
Why does dropout work?
It prevents overfitting by forcing the network to "spread out weights" and not to rely too much on any individual nodes / weights
It prevents underfitting by adding randomness
It prevents statistical correlations between training and test sets
Data augmentation: which statement is NOT true?
Data augmentation adds distorted/transformed versions of existing training examples to the training set
Data augmentation addresses overfitting
The video on data augmentation used a mirror-imaged picture of a cute puppy as an example
The video on data augmentation used rotated and distorted images of the digit "4" as examples
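A minimal sketch of the mirror-image augmentation mentioned above, using NumPy on a batch of toy "images"; the function name and batch shapes are illustrative assumptions:

```python
import numpy as np

def augment_with_mirrors(images):
    """Append a horizontally mirrored copy of each training image
    (like the mirrored-puppy example), doubling the training set
    without collecting any new data."""
    mirrored = images[:, :, ::-1]      # flip each H x W image left-right
    return np.concatenate([images, mirrored], axis=0)

batch = np.arange(2 * 3 * 3).reshape(2, 3, 3)   # 2 toy 3x3 "images"
augmented = augment_with_mirrors(batch)
print(augmented.shape)   # → (4, 3, 3): originals + mirrored copies
```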
Which topic was NOT discussed in the assigned videos?
Early stopping as regularization technique
Vanishing gradients in shallow networks
Exploding gradients in deep networks