CS 451 Quiz 30
Deep neural nets
What does "deep" mean in "deep learning"?
A large number of layers in a neural net
Any machine learning algorithm trained with a large number of training examples
Algorithmic tricks that speed up training by allowing "shortcuts" when computing gradients
A neural net employing convolution operations
Given an L-layer neural net, which of the following is NOT true?
x = a[0]
a[l] = g[l](z[l])
z[l] = a[l-1] + b[l]
yhat = a[L]
For a neural net with n[2] = 10 and n[3] = 5, what is the shape of the matrix W[3]?
(5, 10)
(10, 5)
(5, 11)
(10, 6)
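The shape convention behind this question can be checked with a minimal NumPy sketch (the layer sizes are taken from the question; the variable names are illustrative):

```python
import numpy as np

# Layer sizes from the question: n[2] = 10 units, n[3] = 5 units.
n2, n3 = 10, 5
a2 = np.zeros((n2, 1))    # activations of layer 2, shape (10, 1)
W3 = np.zeros((n3, n2))   # convention: W[l] has shape (n[l], n[l-1])
b3 = np.zeros((n3, 1))
z3 = W3 @ a2 + b3         # the matrix product only works if W3 has n2 columns
print(W3.shape)           # (5, 10)
print(z3.shape)           # (5, 1)
```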
The vectorized equation for the first layer of a neural net is Z[1] = W[1] * X + b[1], where X has m columns. How do the shapes of W[1] * X and b[1] relate?
W[1] * X and b[1] have the same shape
They have the same number of rows, but b[1] has only one column (broadcasting takes place)
They have the same number of columns, but W[1] * X has more rows (broadcasting takes place)
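The broadcasting behavior in question can be seen in a small NumPy sketch (the layer sizes here are hypothetical, chosen only to make the shapes visible):

```python
import numpy as np

# Hypothetical sizes: n[0] = 3 features, n[1] = 4 units, m = 5 examples.
W1 = np.zeros((4, 3))
X = np.zeros((3, 5))
b1 = np.zeros((4, 1))     # one column; NumPy broadcasts it across the m columns
Z1 = W1 @ X + b1
print((W1 @ X).shape)     # (4, 5)
print(b1.shape)           # (4, 1)
print(Z1.shape)           # (4, 5)
```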
Which of the following does NOT motivate why deep networks are a good idea?
Circuit theory shows that some functions can be computed easily with deep networks, while they would require an exponential number of hidden nodes in a shallow network
The universal approximation theorem states that a neural network with one hidden layer with a finite number of neurons can approximate any continuous function
Each layer can build more complex features from the features computed by the previous layer (e.g., low-level audio waveforms → phonemes → words → phrases)
Deep neural nets can be built using "building blocks". The building block for forward propagation takes a[l-1] as input and produces a[l] as output. It also produces a "cache" output, which is
b[l]
g[l]
z[l]
The building block for backward propagation takes da[l] and the "cache" as input and produces da[l-1] as output. It also produces additional outputs, which are
dW[l]
db[l]
both dW[l] and db[l]
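The two building blocks above can be sketched as a pair of NumPy functions. This is a minimal illustration, not the course's reference implementation: ReLU stands in for g[l], and the function names are made up for this sketch.

```python
import numpy as np

def forward_block(a_prev, W, b):
    """Forward block: a[l-1] -> a[l], plus a cache (z[l] is the key cached value)."""
    z = W @ a_prev + b
    a = np.maximum(0, z)           # ReLU stands in for g[l] (an assumption)
    return a, (a_prev, W, z)

def backward_block(da, cache):
    """Backward block: da[l] and the cache -> da[l-1], dW[l], db[l]."""
    a_prev, W, z = cache
    dz = da * (z > 0)              # chain rule through the ReLU
    dW = dz @ a_prev.T
    db = dz.sum(axis=1, keepdims=True)
    da_prev = W.T @ dz
    return da_prev, dW, db
```

Note that the backward block needs z[l] (via the cache) to differentiate through the activation, which is why z[l] is the natural "cache" output of the forward block.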
Which of the following are hyperparameters of a neural net? Check all that apply.
Number of layers
Number of hidden units in each layer
Choice of activation function
The weight matrices W[l]
What is Andrew Ng's answer to the question "What does this have to do with the brain"?
Human brains beat robot brains
The sigmoid function models an axon
Not a whole lot
I'd enjoy a free point on this quiz
True
False