
www.AssignmentPoint.com

Linear Discriminant Analysis


Linear Discriminant Analysis (LDA) is a generalization of Fisher's linear discriminant, a method used in statistics, pattern recognition and machine learning to find a linear combination of features that characterizes or separates two or more classes of objects or events. The resulting combination may be used as a linear classifier, or, more commonly, for dimensionality reduction before later classification.
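Both uses can be illustrated in a few lines with scikit-learn's `LinearDiscriminantAnalysis` (a sketch on made-up toy data, not part of the original text):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Toy two-class data: two Gaussian clusters in 3 dimensions (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(3, 1, (50, 3))])
y = np.array([0] * 50 + [1] * 50)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

# Used as a linear classifier:
labels = lda.predict(X)

# Used for dimensionality reduction: at most k - 1 = 1 component for 2 classes.
X_proj = lda.transform(X)
print(X_proj.shape)  # (100, 1)
```

Note that the projected data has at most k − 1 dimensions for k classes, which foreshadows the canonical discriminant analysis discussed later.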

LDA is closely related to analysis of variance (ANOVA) and regression analysis, which also attempt to express one dependent variable as a linear combination of other features or measurements. However, ANOVA uses categorical independent variables and a continuous dependent variable, whereas discriminant analysis has continuous independent variables and a categorical dependent variable (i.e. the class label). Logistic regression and probit regression are more similar to LDA than ANOVA is, as they also explain a categorical variable by the values of continuous independent variables. These other methods are preferable in applications where it is not reasonable to assume that the independent variables are normally distributed, which is a fundamental assumption of the LDA method.

LDA is also closely related to principal component analysis (PCA) and factor analysis in that they both look for linear combinations of variables which best explain the data. LDA explicitly attempts to model the difference between the classes of data. PCA, on the other hand, does not take into account any difference in class, and factor analysis builds the feature combinations based on differences rather than similarities. Discriminant analysis is also different from factor analysis in that it is not an interdependence technique: a distinction between independent variables and dependent variables (also called criterion variables) must be made.

LDA works when the measurements made on independent variables for each observation are continuous quantities. When dealing with categorical independent variables, the equivalent technique is discriminant correspondence analysis.

LDA for two classes

Consider a set of observations $\vec x$ (also called features, attributes, variables or measurements) for each sample of an object or event with known class $y$. This set of samples is called the training set. The classification problem is then to find a good predictor for the class $y$ of any sample of the same distribution (not necessarily from the training set) given only an observation $\vec x$.


LDA approaches the problem by assuming that the conditional probability density functions $p(\vec x \mid y = 0)$ and $p(\vec x \mid y = 1)$ are both normally distributed with mean and covariance parameters $(\vec\mu_0, \Sigma_0)$ and $(\vec\mu_1, \Sigma_1)$, respectively. Under this assumption, the Bayes optimal solution is to predict points as being from the second class if the log of the likelihood ratios exceeds some threshold T, so that:

$(\vec x - \vec\mu_0)^T \Sigma_0^{-1} (\vec x - \vec\mu_0) + \ln|\Sigma_0| - (\vec x - \vec\mu_1)^T \Sigma_1^{-1} (\vec x - \vec\mu_1) - \ln|\Sigma_1| > T$

Without any further assumptions, the resulting classifier is referred to as QDA (quadratic discriminant analysis).
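The quadratic rule can be sketched directly in NumPy (the parameter values below are invented for illustration):

```python
import numpy as np

def qda_score(x, mu0, S0, mu1, S1):
    """Log-likelihood-ratio statistic for the two-class Gaussian rule:
    (x-mu0)^T S0^{-1} (x-mu0) + ln|S0| - (x-mu1)^T S1^{-1} (x-mu1) - ln|S1|.
    Predict the second class when this exceeds the threshold T."""
    d0, d1 = x - mu0, x - mu1
    return (d0 @ np.linalg.solve(S0, d0) + np.log(np.linalg.det(S0))
            - d1 @ np.linalg.solve(S1, d1) - np.log(np.linalg.det(S1)))

# Invented class parameters; note the covariances differ, so the
# decision boundary is quadratic in x.
mu0, mu1 = np.array([0.0, 0.0]), np.array([2.0, 2.0])
S0 = np.eye(2)
S1 = np.array([[2.0, 0.3], [0.3, 1.0]])
T = 0.0

x = np.array([1.8, 2.1])  # a point near mu1
print("class 1" if qda_score(x, mu0, S0, mu1, S1) > T else "class 0")
```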

LDA instead makes the additional simplifying homoscedasticity assumption (i.e. that the class covariances are identical, so $\Sigma_0 = \Sigma_1 = \Sigma$) and that the covariances have full rank. In this case, several terms cancel:

$\vec x^T \Sigma_0^{-1} \vec x = \vec x^T \Sigma_1^{-1} \vec x$

$\vec x^T \Sigma_i^{-1} \vec\mu_i = \vec\mu_i^T \Sigma_i^{-1} \vec x$ because $\Sigma_i^{-1}$ is Hermitian

and the above decision criterion becomes a threshold on the dot product

$\vec w \cdot \vec x > c$

for some threshold constant c, where

$\vec w = \Sigma^{-1} (\vec\mu_1 - \vec\mu_0)$

$c = \frac{1}{2} \left( T - \vec\mu_0^T \Sigma^{-1} \vec\mu_0 + \vec\mu_1^T \Sigma^{-1} \vec\mu_1 \right)$

This means that the criterion of an input $\vec x$ being in a class $y$ is purely a function of this linear combination of the known observations.
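In code, the simplified rule reduces to computing one vector and one scalar (a NumPy sketch; the means, shared covariance, and threshold are invented for illustration):

```python
import numpy as np

# Invented class parameters; a single shared covariance, per the
# homoscedasticity assumption of LDA.
mu0, mu1 = np.array([0.0, 0.0]), np.array([2.0, 2.0])
Sigma = np.array([[1.0, 0.2], [0.2, 1.0]])
T = 0.0

Sigma_inv = np.linalg.inv(Sigma)
w = Sigma_inv @ (mu1 - mu0)
c = 0.5 * (T - mu0 @ Sigma_inv @ mu0 + mu1 @ Sigma_inv @ mu1)

# Predict the second class exactly when w . x > c:
x = np.array([1.9, 1.7])
print("class 1" if w @ x > c else "class 0")
```

With T = 0, the boundary passes through the midpoint of the two means: at $\vec x = \frac{1}{2}(\vec\mu_0 + \vec\mu_1)$ the dot product equals c exactly.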

It is often useful to see this conclusion in geometrical terms: the criterion of an input $\vec x$ being in a class $y$ is purely a function of the projection of the multidimensional-space point $\vec x$ onto the vector $\vec w$ (thus, we only consider its direction). In other words, the observation belongs to $y$ if the corresponding $\vec x$ is located on a certain side of a hyperplane perpendicular to $\vec w$. The location of the plane is defined by the threshold c.

Canonical discriminant analysis for k classes

Canonical discriminant analysis (CDA) finds axes (k − 1 canonical coordinates, k being the number of classes) that best separate the categories. These linear functions are uncorrelated