CS 451 Quiz 6
Regularization, Numpy
To address overfitting, one can (check all that apply)
reduce the number of features
increase the number of features
use regularization
The regularization term affects all parameters except theta_0
True
False
In the regularized cost function, the parameter lambda controls the weight of the regularization term. What happens if lambda is very large?
We get overfitting
We get underfitting
Gradient descent fails to converge
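The effect of a very large lambda can be seen directly. Below is a minimal sketch (not part of the quiz) using synthetic data and the closed-form ridge solution: with a huge lambda, the penalty dominates the cost, all parameters are shrunk toward zero, and the model underfits even the training data.

```python
import numpy as np

# Sketch with made-up data: fit ridge regression via the closed-form
# solution for a small and a huge lambda, then compare the weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
theta_true = np.array([3.0, -2.0, 1.5, 0.0, 0.5])
y = X @ theta_true + 0.1 * rng.normal(size=20)

def ridge_fit(X, y, lam):
    # theta = (X'X + lam*I)^{-1} X'y  (intercept omitted for simplicity)
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

theta_small = ridge_fit(X, y, lam=0.01)
theta_huge = ridge_fit(X, y, lam=1e6)

# With a huge lambda the parameters are shrunk to nearly zero, so the
# model barely fits the training data at all: underfitting.
print(np.abs(theta_small).max())  # comparable to the true weights
print(np.abs(theta_huge).max())   # near zero
```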
Regularization means
adding an extra term to the cost function that makes gradient descent converge faster
adding an extra term to the cost function that encourages small parameter values
adding polynomial features so the model becomes more expressive
limiting the number of nonzero parameters
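The "extra term that encourages small parameter values" can be made concrete. Below is a minimal sketch of a regularized linear-regression cost, assuming the usual course convention that the penalty is lambda times the sum of squared parameters, excluding theta_0; the data are made up for illustration.

```python
import numpy as np

# Sketch (assumed convention): squared-error cost plus a penalty that
# grows with the magnitude of the parameters, skipping theta_0.
def regularized_cost(theta, X, y, lam):
    m = len(y)
    residual = X @ theta - y
    penalty = lam * np.sum(theta[1:] ** 2)  # theta_0 excluded
    return (residual @ residual + penalty) / (2 * m)

theta = np.array([1.0, 2.0, -3.0])
X = np.array([[1.0, 0.5, 0.5],
              [1.0, 1.0, 2.0]])
y = np.array([0.0, 1.0])

# A larger lambda makes the same parameter vector more expensive,
# pushing the minimizer toward smaller values.
print(regularized_cost(theta, X, y, lam=10.0) >
      regularized_cost(theta, X, y, lam=0.0))  # True
```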
An algorithm that overfits has
high variance
high bias
An algorithm that overfits
predicts both training examples and unseen test examples well
predicts training examples well, but does not predict unseen test examples well
does not predict the training examples well, but does predict unseen test examples well
predicts neither the training examples nor unseen test examples well
Once we add a regularization term to linear regression, we can only use gradient descent for minimization, since there is no normal equation for an analytical solution
True
False
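In fact, regularized linear regression still has an analytical solution. A minimal sketch on made-up data: the normal equation becomes theta = (X'X + lambda*L)^(-1) X'y, where L is the identity matrix with its (0,0) entry zeroed so that theta_0 is not regularized.

```python
import numpy as np

# Sketch of the regularized normal equation on synthetic data.
rng = np.random.default_rng(1)
m = 50
X = np.column_stack([np.ones(m), rng.normal(size=(m, 3))])  # bias column
y = X @ np.array([5.0, 1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=m)

lam = 1.0
L = np.eye(X.shape[1])
L[0, 0] = 0.0  # leave the intercept theta_0 unregularized

# Analytical solution: no gradient descent needed.
theta = np.linalg.solve(X.T @ X + lam * L, X.T @ y)
print(theta)  # close to the generating weights [5, 1, -2, 0.5]
```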
We can use regularization only with linear regression, not with logistic regression
True
False
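Regularization applies to logistic regression just as well. Below is a minimal sketch (made-up data, the usual lambda/(2m) convention assumed) adding the same squared-parameter penalty to the cross-entropy cost, again excluding theta_0.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Sketch: cross-entropy cost plus the same kind of penalty used for
# regularized linear regression.
def logistic_cost(theta, X, y, lam):
    m = len(y)
    h = sigmoid(X @ theta)
    ce = -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))
    return ce + lam / (2 * m) * np.sum(theta[1:] ** 2)  # theta_0 excluded

theta = np.array([0.5, -1.0])
X = np.array([[1.0, 2.0],
              [1.0, -1.0]])
y = np.array([1.0, 0.0])
print(logistic_cost(theta, X, y, lam=1.0) >
      logistic_cost(theta, X, y, lam=0.0))  # True: the penalty adds cost
```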
What would the following numpy code be in Octave?
a = [1, 2, 3; 4, 5, 6];
a = [1, 4; 2, 5; 3, 6];
a = [[1, 2, 3], [4, 5, 6]];
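The quiz's numpy snippet is not reproduced above, but as a general illustration of the two notations: a numpy 2-D array is written row by row with nested lists, while Octave separates rows with semicolons, as in a = [1, 2, 3; 4, 5, 6];.

```python
import numpy as np

# A 2x3 array: each inner list is one row, matching Octave's
# semicolon-separated rows [1, 2, 3; 4, 5, 6].
a = np.array([[1, 2, 3],
              [4, 5, 6]])
print(a.shape)  # (2, 3)
```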
Suppose N is a 3x3 numpy array and M is a 3x3 Octave matrix. What would the expression N[1][0] be in Octave?
M(1, 0)
M[1, 0]
M(2, 1)
M[1, 2]
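The indexing difference can be checked directly. A minimal sketch with a made-up 3x3 array: numpy is 0-based with square brackets, so N[1][0] selects row index 1, column index 0, i.e. the second row, first column; Octave is 1-based with parentheses, so the same element is M(2, 1).

```python
import numpy as np

N = np.array([[10, 20, 30],
              [40, 50, 60],
              [70, 80, 90]])

# Second row, first column (0-based indices 1 and 0).
print(N[1][0])   # 40
print(N[1, 0])   # equivalent numpy syntax, also 40
```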