CS 451 Quiz 5
Logistic Regression
We need to use a different cost function for logistic regression than for linear regression in order to obtain a convex minimization problem without multiple local minima
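The cost function in question is the cross-entropy cost, which is convex for logistic regression. A minimal Python sketch (all names are illustrative, not from the quiz):

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_cost(theta, X, y):
    """Convex logistic-regression cost:
    J(theta) = -(1/m) * sum( y*log(h) + (1-y)*log(1-h) ),
    where h = g(X @ theta)."""
    m = len(y)
    h = sigmoid(X @ theta)
    return -(1.0 / m) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))
```

At theta = 0 every hypothesis value is 0.5, so the cost reduces to log(2) regardless of the labels, which is a quick sanity check.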
Logistic regression is an algorithm for ...
One reason linear regression is not good for classification is that a single training example, even if far from the decision boundary, can adversely influence the results
In logistic regression, which equation defines the decision boundary?
For which algorithm is 0 <= h_theta(x) <= 1 always true?
In the logistic regression hypothesis h_theta(x) = g(theta' * x), the function g is
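For reference, g here is the sigmoid (logistic) function, whose output always lies strictly between 0 and 1. A minimal sketch:

```python
import numpy as np

def sigmoid(z):
    """Sigmoid g(z) = 1 / (1 + e^(-z)): maps any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# g(0) = 0.5; large positive z gives values near 1, large negative z near 0
print(sigmoid(0.0))  # 0.5
```

This boundedness is also why 0 <= h_theta(x) <= 1 holds for logistic regression but not for linear regression.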
Octave / Matlab provides other optimization algorithms such as conjugate gradient, BFGS, and L-BFGS, but none of them are faster than gradient descent
The following is the gradient descent algorithm for ...
[image not shown]
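Since the referenced image is unavailable, the standard batch gradient-descent update for logistic regression, theta := theta - alpha * (1/m) * X' * (g(X*theta) - y), can be sketched as follows (names and hyperparameters are illustrative):

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, alpha=0.1, iters=1000):
    """Batch gradient descent for logistic regression.
    Each step: theta := theta - alpha * (1/m) * X' * (g(X @ theta) - y)."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        grad = X.T @ (sigmoid(X @ theta) - y) / m
        theta -= alpha * grad
    return theta
```

Note the update has the same form as in linear regression; only the hypothesis h_theta(x) = g(theta' * x) differs.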
For multi-class classification with N categories, we simply train N binary classifiers and use the one with the maximal hypothesis value
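This one-vs-all scheme can be sketched in Python as below (an illustrative sketch, not the course's code; training uses plain batch gradient descent):

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def train_one_vs_all(X, y, num_classes, alpha=0.1, iters=1000):
    """Train one binary logistic classifier per class (one-vs-all)."""
    m, n = X.shape
    all_theta = np.zeros((num_classes, n))
    for c in range(num_classes):
        y_c = (y == c).astype(float)  # 1 for class c, 0 for all other classes
        for _ in range(iters):
            grad = X.T @ (sigmoid(X @ all_theta[c]) - y_c) / m
            all_theta[c] -= alpha * grad
    return all_theta

def predict_one_vs_all(all_theta, X):
    """Pick the class whose classifier gives the maximal hypothesis value."""
    return np.argmax(sigmoid(X @ all_theta.T), axis=1)
```

Each of the N classifiers answers "is this example in class c?", and prediction takes the argmax over the N hypothesis values.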
The logistic hypothesis h_theta(x) = g(theta' * x) represents