Course Name : MACHINE LEARNING
Course Code : 20AD04
Course Instructor : Dr S Naganjaneyulu
Semester : VI
Regulation : R23
Unit: 4
UNIT-4: SYLLABUS
Linear Discriminants for Machine Learning: Introduction to Linear Discriminants, Linear Discriminants for Classification, Perceptron Classifier, Perceptron Learning Algorithm, Support Vector Machines, Linearly Non-Separable Case, Non-linear SVM, Kernel Trick, Logistic Regression, Linear Regression, Multi-Layer Perceptrons (MLPs), Backpropagation for Training an MLP.
Introduction to Linear Discriminants
4.3 CLASSIFICATION LEARNING STEPS
Introduction to Perceptron
STRUCTURE OF A PERCEPTRON
Step 1: Multiply each input value by its corresponding weight and sum the results to obtain the weighted sum.
Step 2: Apply an activation function to this weighted sum, giving an output that is either binary or a continuous value.
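The two steps above can be sketched in plain Python; the AND-gate weights, bias, and step activation below are illustrative choices, not values from the slides:

```python
def perceptron_output(inputs, weights, bias):
    # Step 1: multiply inputs by weights, sum, and add the bias.
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Step 2: apply a step activation function -> binary output.
    return 1 if weighted_sum >= 0 else 0

# Example: a perceptron computing logical AND (illustrative parameters).
and_weights, and_bias = [1.0, 1.0], -1.5
print(perceptron_output([1, 1], and_weights, and_bias))  # 1
print(perceptron_output([0, 1], and_weights, and_bias))  # 0
```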
Basic Components of Perceptron:
The perceptron is a type of artificial neural network and a fundamental concept in machine learning. The basic components of a perceptron are:
Input Layer:
Hidden Layer:
Output Layer:
Overall, the perceptron is a simple yet powerful algorithm that can be used to perform binary classification tasks and has paved the way for more complex neural networks used in deep learning today.
Types of Perceptron:
Elements of Neural Networks (Cont’d)
[Figure: a single neuron, showing its inputs, weights, bias, activation function, and output.]
This is the simplest feedforward neural network and contains no hidden layer: it consists of a single layer of output nodes. It is called single-layer because the input layer is not counted when counting layers; no computation is done at the input layer, and the inputs are fed directly to the outputs via a series of weights.
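The perceptron learning algorithm for such a single-layer network can be sketched as follows; the learning rate, epoch count, and the OR-gate training data are illustrative assumptions:

```python
def train_perceptron(samples, labels, lr=0.1, epochs=20):
    """Perceptron learning rule: w <- w + lr * (y - y_hat) * x."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Predict with the current weights (step activation).
            y_hat = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0
            # Update only when the prediction is wrong (err is -1, 0, or 1).
            err = y - y_hat
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Example: learn the (linearly separable) OR function.
w, b = train_perceptron([[0, 0], [0, 1], [1, 0], [1, 1]], [0, 1, 1, 1])
```

Because OR is linearly separable, the perceptron convergence theorem guarantees that the rule finds a separating line after finitely many updates.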
MULTI-LAYER PERCEPTRONS
Inputs of a Perceptron:
Output of Perceptron
Forward propagation & Backward Propagation
Forward Propagation: Each layer computes a weighted sum of its inputs, applies an activation function, and passes the result to the next layer. The process continues until the output layer is reached.
Backward Propagation
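A minimal sketch of forward and backward propagation for a tiny 2-2-1 MLP, assuming sigmoid activations and a squared-error loss; the toy task, learning rate, and epoch count are illustrative choices:

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# A tiny MLP: 2 inputs, 2 hidden units, 1 output, all sigmoid.
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
w2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0

def forward(x):
    # Forward propagation: weighted sum + activation at each layer.
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(w1, b1)]
    o = sigmoid(sum(w * hi for w, hi in zip(w2, h)) + b2)
    return h, o

def train_step(x, y, lr=0.5):
    # Backward propagation: deltas from the output layer back to the
    # hidden layer (chain rule on squared error), then gradient descent.
    global b2
    h, o = forward(x)
    d_o = (o - y) * o * (1 - o)                       # output-layer delta
    d_h = [d_o * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
    for j in range(2):
        w2[j] -= lr * d_o * h[j]
        for i in range(2):
            w1[j][i] -= lr * d_h[j] * x[i]
        b1[j] -= lr * d_h[j]
    b2 -= lr * d_o

# Toy task: learn to map [1, 0] -> 1 and [0, 1] -> 0.
for _ in range(2000):
    for x, y in [([1, 0], 1), ([0, 1], 0)]:
        train_step(x, y)
```

After training, the network's output for [1, 0] is close to 1 and for [0, 1] close to 0, showing that repeated forward/backward passes reduce the error.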
Support Vector Machines
Key concepts of Support Vector Machines
Types of Support Vector Machines
Support Vector Machine Algorithm Steps
The basic steps of SVM are:
Support Vector Machines Kernel
Kernel trick
Linear kernel: It is in the form K(x, y) = x · y
Polynomial kernel: It is in the form K(x, y) = (x · y + c)^d
Sigmoid kernel: It is in the form K(x, y) = tanh(γ(x · y) + c)
Gaussian RBF kernel: It is in the form K(x, y) = exp(−γ‖x − y‖²)
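These kernels can be written directly; each takes two feature vectors, and the default parameter values (c, d, gamma) below are illustrative, not prescribed:

```python
import math

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

def linear_kernel(x, y):
    return dot(x, y)                        # K(x, y) = x . y

def polynomial_kernel(x, y, c=1.0, d=3):
    return (dot(x, y) + c) ** d             # K(x, y) = (x . y + c)^d

def sigmoid_kernel(x, y, gamma=0.1, c=0.0):
    return math.tanh(gamma * dot(x, y) + c)

def rbf_kernel(x, y, gamma=0.5):
    # K(x, y) = exp(-gamma * ||x - y||^2)
    sq_dist = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    return math.exp(-gamma * sq_dist)

print(linear_kernel([1, 2], [3, 4]))        # 11
print(rbf_kernel([1, 2], [1, 2]))           # 1.0 (identical points)
```

Each kernel computes the dot product of the two points in some (possibly implicit) feature space, which is what the kernel trick exploits.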
Support Vector Machines
Classification using hyperplanes
FIG: Linearly separable data instances
For example, in two dimensions the separating hyperplane is a line of the form
c0 + c1 X1 + c2 X2 = 0
or, in general, with n attributes,
c0 + c1 X1 + c2 X2 + …… + cn Xn = 0
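A point's class can be read off from the sign of this hyperplane expression; the coefficients below are illustrative:

```python
def side_of_hyperplane(coeffs, point):
    # coeffs = [c0, c1, ..., cn]; point = [X1, ..., Xn].
    # Evaluate c0 + c1*X1 + ... + cn*Xn and return its sign.
    value = coeffs[0] + sum(c * x for c, x in zip(coeffs[1:], point))
    return 1 if value >= 0 else -1

# Hyperplane (a line in 2-D): -4 + 1*X1 + 1*X2 = 0
coeffs = [-4, 1, 1]
print(side_of_hyperplane(coeffs, [3, 3]))   # 1  (one side of the line)
print(side_of_hyperplane(coeffs, [1, 1]))   # -1 (the other side)
```

SVM training amounts to choosing the coefficients so that this sign agrees with the class labels while the margin to the nearest instances is maximised.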
Support Vector Machines
Identifying the correct hyperplane in SVM
Scenario 1:
Support vector machine: Scenario 1
Scenario 2:
Support vector machine: Scenario 2
Scenario 3:
Support vector machine: Scenario 3
Support Vector Machines: Strengths and Weaknesses
Strengths of SVM:
Weaknesses of SVM:
Support Vector Machines Applications
Applications of SVM:
Logistic Regression
Types of Logistic Regression
On the basis of the categories, Logistic Regression can be classified into three types:
o Binomial: In binomial logistic regression, there can be only two possible types of the dependent variable, such as 0 or 1, or Pass or Fail.
o Multinomial: In multinomial logistic regression, there can be three or more possible unordered types of the dependent variable, such as "cat", "dog", or "sheep".
o Ordinal: In ordinal logistic regression, there can be three or more possible ordered types of the dependent variable, such as "low", "medium", or "high".
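A binomial logistic regression prediction can be sketched as follows; the model outputs P(y = 1 | x) = sigmoid(b0 + b1·x1 + … + bn·xn) and predicts class 1 when that probability exceeds 0.5. The coefficients and the pass/fail reading below are illustrative assumptions:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict_proba(coeffs, x):
    # coeffs = [b0, b1, ..., bn]; x = [x1, ..., xn].
    z = coeffs[0] + sum(b * xi for b, xi in zip(coeffs[1:], x))
    return sigmoid(z)                   # P(y = 1 | x), between 0 and 1

def predict(coeffs, x, threshold=0.5):
    # Binomial decision: class 1 (e.g. Pass) vs class 0 (e.g. Fail).
    return 1 if predict_proba(coeffs, x) >= threshold else 0

# Illustrative coefficients: one feature, e.g. hours studied.
coeffs = [-4.0, 1.5]
print(predict(coeffs, [1.0]))           # 0 (low probability of passing)
print(predict(coeffs, [4.0]))           # 1 (high probability of passing)
```

The sigmoid squashes the linear combination into (0, 1), which is why logistic regression yields a probability rather than a raw score.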
Types of Linear Regression
Linear regression can be further divided into two types of algorithm:
o Simple Linear Regression: If a single independent variable is used to predict the value of a numerical dependent variable, then such a Linear Regression algorithm is called Simple Linear Regression.
o Multiple Linear regression: If more than one independent variable is used to predict the value of a numerical dependent variable, then such a Linear Regression algorithm is called Multiple Linear Regression.
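Simple linear regression can be fitted in closed form with ordinary least squares; the sample points below are illustrative:

```python
def fit_simple_linear(xs, ys):
    """Fit y = b0 + b1*x by ordinary least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x.
    b1 = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
          / sum((x - mean_x) ** 2 for x in xs))
    # Intercept: the line passes through the point of means.
    b0 = mean_y - b1 * mean_x
    return b0, b1

# Points lying exactly on y = 2x + 1.
b0, b1 = fit_simple_linear([1, 2, 3, 4], [3, 5, 7, 9])
print(b0, b1)  # 1.0 2.0
```

Multiple linear regression generalises this to several independent variables; the closed-form solution then uses matrix algebra (the normal equations) instead of the two scalar formulas above.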
Linear Regression Line
A straight line showing the relationship between the dependent and independent variables is called a regression line.
A regression line can show two types of relationship:
o Positive Linear Relationship: If the dependent variable increases on the Y-axis as the independent variable increases on the X-axis, then the relationship is termed a positive linear relationship.