Swayam Prabha

Course Title

Multivariate Data Mining - Methods and Applications

Lecture 21

McCulloch-Pitts Neuron and Single-Layer Perceptron

By

Anoop Chaturvedi

Department of Statistics, University of Allahabad

Prayagraj (India)

Slides can be downloaded from https://sites.google.com/view/anoopchaturvedi/swayam-prabha

[Figure: McCulloch-Pitts neuron - the inputs are aggregated by a summation unit Σ and compared with a threshold θ.]

Limitations of the McCulloch-Pitts neuron

  • It cannot be used if the inputs are non-boolean (say, real-valued).
  • All inputs may not be equally important; one may want to assign more importance to some inputs.
  • It cannot be used if the function is not linearly separable (see the sketch after this list).
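A minimal sketch of the McCulloch-Pitts neuron under the assumptions above (boolean inputs, no weights, a fixed integer threshold θ); the function name and the AND/OR examples are illustrative:

```python
def mp_neuron(inputs, threshold):
    """McCulloch-Pitts neuron: fire (output 1) if the sum of the boolean
    inputs reaches the threshold, otherwise stay silent (output 0)."""
    return int(sum(inputs) >= threshold)

# AND over two inputs: fire only when both inputs are 1 (threshold = 2)
print(mp_neuron([1, 1], threshold=2))  # 1
print(mp_neuron([1, 0], threshold=2))  # 0

# OR over two inputs: fire when at least one input is 1 (threshold = 1)
print(mp_neuron([0, 1], threshold=1))  # 1
```

Because every input counts equally and the decision is a single threshold on their sum, real-valued inputs, unequal input importance, and non-linearly-separable functions such as XOR are out of reach for a single such neuron.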

Hebb Learning Rule of Neuron Excitation:

  • When an axon of cell A is near enough to excite cell B and repeatedly takes part in firing it, some growth process takes place in one or both cells so that A’s efficiency, as one of the cells firing B, is increased.
  • The strength of a synaptic connection between two neurons depends upon their associated firing history.
  • The more often two neurons fire together, the stronger the connection becomes (see the sketch after this list).
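A minimal sketch of the Hebbian idea written as a weight update; the learning rate eta and the toy firing sequence are illustrative, not from the slides:

```python
import numpy as np

def hebb_update(w, x, y, eta=0.1):
    """Hebbian learning: strengthen each weight in proportion to the
    coincidence of pre-synaptic activity x and post-synaptic activity y."""
    return w + eta * y * x

# The input that repeatedly fires together with the output grows its weight fastest.
w = np.zeros(2)
firing_history = [(np.array([1, 0]), 1), (np.array([1, 1]), 1), (np.array([1, 0]), 1)]
for x, y in firing_history:
    w = hebb_update(w, x, y)
print(w)  # [0.3 0.1]: the first connection co-fired more often, so it is stronger
```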

Neural Inhibitory Rule:

  • If A persistently sends signals to B but B does not fire, the chance that future signals from A will excite B to fire is reduced.
  • The inhibitory rule ensures that the system of synaptic connections throughout the cerebral cortex does not grow without limit as soon as one such connection is activated.

Different types of Artificial Neural Networks

Feedforward Neural Network

  • Data enters through the input layer and the result exits through the output layer.
  • It has a forward-propagated wave only; signals are not fed back from later layers to earlier ones.

Forward propagation ⇒ Process of computing the output given an input

Backpropagation ⇒ Process of updating the weights based on the error between the predicted output and the actual output.
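As a concrete illustration of these two processes, here is a minimal sketch of the single-layer perceptron from this lecture's title; the learning rate, number of epochs, and the toy OR dataset are illustrative:

```python
import numpy as np

def forward(w, b, x):
    """Forward propagation: output for a given input (thresholded weighted sum)."""
    return int(np.dot(w, x) + b >= 0)

def perceptron_train(X, y, eta=0.1, epochs=20):
    """Weight update driven by the error between predicted and actual output."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = yi - forward(w, b, xi)       # error = actual - predicted
            w, b = w + eta * err * xi, b + eta * err
    return w, b

# Toy, linearly separable problem: logical OR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])
w, b = perceptron_train(X, y)
print([forward(w, b, xi) for xi in X])         # [0, 1, 1, 1]
```

The perceptron rule used here updates weights from the output error directly; full backpropagation extends this error-driven idea to networks with hidden layers.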

Convolutional Neural Network

  • Has one or more convolutional layers. Each applies a convolution operation to its input and passes the result to the next layer (see the sketch below).
  • Applications in speech and image processing, and computer vision.
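A minimal sketch of the convolution operation itself (strictly, the cross-correlation that convolutional layers compute); the signal and the difference filter are illustrative:

```python
import numpy as np

def conv1d(signal, kernel):
    """Valid 1-D convolution: slide the kernel over the signal, taking dot products."""
    k = len(kernel)
    return np.array([np.dot(signal[i:i + k], kernel)
                     for i in range(len(signal) - k + 1)])

x = np.array([1.0, 2.0, 3.0, 5.0, 5.0])
diff_filter = np.array([1.0, -1.0])   # responds to changes between neighbouring values
print(conv1d(x, diff_filter))         # [-1. -1. -2.  0.]
```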

Recurrent Neural Network

  • Saves the output of a layer and feeds it back to the input to better predict the outcome of the layer.
  • The output of the first layer is computed as in a feed-forward neural network. After the first layer, each unit remembers information from the previous step and acts as a memory cell while performing computations (see the sketch below).
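A minimal sketch of this memory behaviour, assuming a simple recurrent unit with illustrative random weights: the previous hidden state is fed back in alongside the current input.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    """One recurrent step: combine the current input with the previous
    hidden state (the unit's memory) to produce the new hidden state."""
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

rng = np.random.default_rng(0)
W_x, W_h, b = rng.normal(size=(3, 2)), rng.normal(size=(3, 3)), np.zeros(3)

h = np.zeros(3)                                    # initial (empty) memory
for x_t in [np.array([1.0, 0.0]), np.array([0.0, 1.0])]:
    h = rnn_step(x_t, h, W_x, W_h, b)              # h carries past information forward
print(h)
```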

Radial Basis Function Neural Network

Three-layer architecture:

  1. Input layer ⇒ receives the input data.
  2. Hidden layer ⇒ the input data is passed through a nonlinear (radial basis) activation function; this is where the computation occurs.
  3. Output layer ⇒ performs prediction tasks such as classification or regression (see the sketch after this list).
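A minimal sketch of such a network with Gaussian basis functions; the centres, width, and output weights below are illustrative (in practice they are chosen or learned from data):

```python
import numpy as np

def rbf_hidden(x, centres, sigma=1.0):
    """Hidden layer: Gaussian activation based on the distance to each centre."""
    sq_dist = np.sum((centres - x) ** 2, axis=1)
    return np.exp(-sq_dist / (2 * sigma ** 2))

def rbf_predict(x, centres, out_weights, sigma=1.0):
    """Output layer: linear combination of the hidden-layer activations."""
    return rbf_hidden(x, centres, sigma) @ out_weights

centres = np.array([[0.0, 0.0], [1.0, 1.0]])   # illustrative RBF centres
out_weights = np.array([0.5, -0.5])            # illustrative output weights
print(rbf_predict(np.array([0.2, 0.1]), centres, out_weights))
```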

[Figure: RBF network - input layer → RBF hidden layer → output layer.]

  • RBFN has a simple design and provides good generalization.
  • It is faster to train.
  • The functioning of each node in the hidden layer has a straightforward interpretation.

Modular Neural Network:

  • Breaks down a large and complex computational process into smaller components, distributing the computation across multiple modules or processing units.
  • Contains a collection of different neural networks that work independently, with no interaction among them, towards obtaining the output.
  • Each of the networks performs a different sub-task and receives its own inputs, distinct from those of the other networks (see the sketch after this list).
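A minimal structural sketch (the module sizes, weights, and the simple concatenation step are illustrative; in practice each module would be trained on its own sub-task): two sub-networks process disjoint parts of the input independently, and their outputs are then combined.

```python
import numpy as np

def module(x, W, b):
    """One independent sub-network: a single dense layer with tanh activation."""
    return np.tanh(W @ x + b)

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 3)), np.zeros(2)   # module 1 sees inputs x[:3]
W2, b2 = rng.normal(size=(2, 2)), np.zeros(2)   # module 2 sees inputs x[3:]

x = rng.normal(size=5)
out1 = module(x[:3], W1, b1)                    # the modules do not interact
out2 = module(x[3:], W2, b2)
combined = np.concatenate([out1, out2])         # combine the module outputs
print(combined)
```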
