CS 451 Quiz 30
Deep neural nets
What does "deep" mean in "deep learning"?
A large number of layers in a neural net
Any machine learning algorithm trained with a large number of training examples
Algorithmic tricks that speed up training by allowing "shortcuts" when computing gradients
A neural net employing convolution operations
Given an L-layer neural net, which of the following is NOT true?
x = a[0]
a[l] = g[l](z[l])
z[l] = a[l-1] + b[l]
y-hat = a[L]
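The layer equations above can be sketched as a minimal numpy forward pass. This is an illustrative sketch, not the course's reference code; the layer sizes, the use of sigmoid for every activation, and all variable names are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative L=2 net: n[0]=4 inputs, n[1]=3 hidden units, n[2]=1 output.
layer_sizes = [4, 3, 1]
rng = np.random.default_rng(0)
params = [(rng.standard_normal((n, n_prev)), np.zeros((n, 1)))
          for n_prev, n in zip(layer_sizes[:-1], layer_sizes[1:])]

x = rng.standard_normal((4, 1))
a = x                   # a[0] = x
for W, b in params:
    z = W @ a + b       # z[l] = W[l] a[l-1] + b[l]
    a = sigmoid(z)      # a[l] = g[l](z[l])
y_hat = a               # y-hat = a[L]
print(y_hat.shape)      # (1, 1)
```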
For a neural net with n[0] = 10 and n[1] = 5, what is the shape of the matrix W[1]?
The vectorized equation for the first layer of a neural net is Z[1] = W[1] * X + b[1], where X has m columns. How do the shapes of W[1] * X and b[1] relate?
W * X and b have the same shape
They have the same number of rows, but b has only one column (broadcasting takes place)
They have the same number of columns, but W * X has more rows (broadcasting takes place)
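The shape relationship can be checked directly in numpy, where broadcasting adds the one-column bias to every column of the product. The sizes below (10 inputs, 5 hidden units, 3 examples) are illustrative, not taken from the quiz.

```python
import numpy as np

n0, n1, m = 10, 5, 3          # illustrative layer sizes and batch size
W = np.random.randn(n1, n0)   # weight matrix, shape (5, 10)
X = np.random.randn(n0, m)    # input batch, shape (10, 3)
b = np.zeros((n1, 1))         # bias column vector, shape (5, 1)

WX = W @ X                    # shape (5, 3): same rows as b, m columns
Z = WX + b                    # b broadcasts across the 3 columns
print(WX.shape, b.shape, Z.shape)  # (5, 3) (5, 1) (5, 3)
```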
Which of the following does NOT motivate why deep networks are a good idea?
Circuit theory shows that some functions can be computed easily with deep networks, while they would require an exponential number of hidden nodes in a shallow network
The universal approximation theorem states that a neural network with one hidden layer with a finite number of neurons can approximate any continuous function
Each layer can build more complex features from the features computed by the previous layer (e.g., low-level audio waveforms -> phonemes -> words -> phrases)
Deep neural nets can be built using "building blocks". The building block for forward propagation takes a[l-1] as input and produces a[l] as output. It also produces a "cache" output, which is
z[l]
The building block for backward propagation takes da[l] and the "cache" as input and produces da[l-1] as output. It also produces additional output(s), which is/are
both dW[l] and db[l]
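The two building blocks can be sketched as a pair of functions that pass the cache between them. This is a hedged sketch, assuming a sigmoid activation and averaging the gradients over m examples; the function and variable names are illustrative, not from the course materials.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_block(a_prev, W, b):
    """Forward building block: a[l-1] -> a[l], plus a cache for backprop."""
    z = W @ a_prev + b
    a = sigmoid(z)
    cache = (a_prev, W, z)          # cache z[l] (and inputs) for the backward pass
    return a, cache

def backward_block(da, cache):
    """Backward building block: da[l] + cache -> da[l-1], dW[l], db[l]."""
    a_prev, W, z = cache
    s = sigmoid(z)
    dz = da * s * (1 - s)           # dz[l] = da[l] * g'(z[l])
    m = a_prev.shape[1]
    dW = (dz @ a_prev.T) / m        # dW[l]
    db = dz.sum(axis=1, keepdims=True) / m   # db[l]
    da_prev = W.T @ dz              # da[l-1]
    return da_prev, dW, db
```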
Which of the following are hyperparameters of a neural net? Check all that apply.
Number of layers
Number of hidden units in each layer
Choice of activation function
The weight matrices W[l]
What is Andrew Ng's answer to the question "What does this have to do with the brain"?
Human brains beat robot brains
The sigmoid function models an axon
Not a whole lot
I'm in class today and I'd enjoy a free point for that.