CS 451 Quiz 9
Neural Nets and Backpropagation
Intuitively, given a single training example, backpropagation computes each node's contribution to the error the network makes on that example.
True
False
Backpropagation is an algorithm for computing the partial derivatives of the cost function with respect to all parameters Theta.
True
False
Suppose we have a neural net with 4 layers (1 input layer, 2 hidden layers, and 1 output layer). How many matrices Theta(j) are needed to store all the weights in the network?
3
4
5
What is the following formula?
The logistic cost function
The cost function for neural nets
The gradient term for neural nets
None of the above
The Boolean NOT function can be computed by a single neuron.
True
False
To build a neural net for multiclass classification, the number of classes should match the number of nodes in the
input layer
hidden layer
output layer
Backpropagation starts by computing the errors at the input layer and then propagates them through the network.
True
False
For a large training set, computing the gradient of the cost function involves
forward propagation, then backpropagation of a single training example, repeated for all training examples
forward propagation of all training examples, then backpropagation of all training examples
LeNet was an early example of a successful application of neural nets to
Disease classification
Machine translation from French to English
Handwritten digit recognition
The Boolean XOR function can be computed by a single neuron.
True
False