CS 451 Quiz 9
Neural Nets and Backpropagation
Intuitively, given a single training example, backpropagation computes, at every node in the network, that node's contribution to the error the network makes on this example *
Backpropagation is an algorithm for computing the partial derivatives of the cost function with respect to all parameters theta *
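As a study aid for this question, here is a minimal sketch of backpropagation computing the partial derivatives of the cost with respect to every weight in a tiny 1-2-1 sigmoid network, checked against a numerical gradient. The network size, cost function, and weights are illustrative assumptions, not taken from the quiz.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(theta1, theta2, x):
    a1 = sigmoid(theta1 @ x)    # hidden-layer activations, shape (2,)
    a2 = sigmoid(theta2 @ a1)   # output activation, shape (1,)
    return a1, a2

def cost(theta1, theta2, x, y):
    # Assumed squared-error cost for a single training example
    _, a2 = forward(theta1, theta2, x)
    return 0.5 * np.sum((a2 - y) ** 2)

def backprop(theta1, theta2, x, y):
    # Returns dJ/dTheta1 and dJ/dTheta2 for one example
    a1, a2 = forward(theta1, theta2, x)
    delta2 = (a2 - y) * a2 * (1 - a2)              # output-layer error
    delta1 = (theta2.T @ delta2) * a1 * (1 - a1)   # error propagated back
    return np.outer(delta1, x), np.outer(delta2, a1)

rng = np.random.default_rng(0)
theta1 = rng.normal(size=(2, 1))
theta2 = rng.normal(size=(1, 2))
x = np.array([0.5])
y = np.array([1.0])

g1, g2 = backprop(theta1, theta2, x, y)

# Numerical check on one weight of theta1 via central differences
eps = 1e-6
t_plus, t_minus = theta1.copy(), theta1.copy()
t_plus[0, 0] += eps
t_minus[0, 0] -= eps
num = (cost(t_plus, theta2, x, y) - cost(t_minus, theta2, x, y)) / (2 * eps)
assert abs(num - g1[0, 0]) < 1e-5
```

The agreement between the backprop gradient and the numerical estimate is the standard "gradient check" used to validate a backpropagation implementation.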
Suppose we have a neural net with 4 layers (1 input layer, 2 hidden layers, and 1 output layer). How many matrices Theta(j) are needed to store all the weights in the network? *
What is the following formula? *
[formula image not shown]
The Boolean NOT function can be computed by a single neuron. *
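For intuition on this question, a single sigmoid neuron with a suitably chosen bias and input weight can implement NOT. The specific weights below (+10 bias, -20 input weight) are illustrative assumptions, not part of the quiz.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def not_neuron(x):
    # Assumed weights: bias +10, input weight -20, so the neuron
    # outputs ~1 when x = 0 and ~0 when x = 1
    return sigmoid(10 - 20 * x)
```

With these weights, `not_neuron(0)` is close to 1 and `not_neuron(1)` is close to 0.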
To build a neural net for multiclass classification, the number of classes should match the number of nodes in the *
Backpropagation starts by computing the errors at the input layer and then propagates them through the network *
For a large training set, computing the gradient of the cost function involves *
The "LeNet" was an early example of a successful application of neural nets to *
The Boolean XOR function can be computed by a single neuron. *
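For intuition on this question: XOR is not linearly separable, so no single linear-threshold neuron computes it, but a two-layer network can, using the identity XOR(a, b) = AND(OR(a, b), NAND(a, b)). The step activation and all weights below are illustrative assumptions.

```python
def step(z):
    # Assumed hard-threshold activation
    return 1 if z >= 0 else 0

def neuron(w0, w1, w2, a, b):
    return step(w0 + w1 * a + w2 * b)

def xor(a, b):
    h_or = neuron(-10, 20, 20, a, b)      # hidden neuron computing OR
    h_nand = neuron(30, -20, -20, a, b)   # hidden neuron computing NAND
    return neuron(-30, 20, 20, h_or, h_nand)  # output neuron computing AND
```

Checking all four inputs gives the XOR truth table: 0, 1, 1, 0.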