APPLICATION OF SOFT COMPUTING
Vinay Pratap Singh
ACTIVATION FUNCTION
Activation for Hidden Layers
Rectified Linear Activation (ReLU)
Continue...
Program
def ReLU(x):
    # Explicit conditional form: pass positive values through, clip negatives to 0
    if x > 0:
        return x
    else:
        return 0

def relu(x):
    # Equivalent one-line form using max()
    return max(0.0, x)
Logistic (Sigmoid)
Continue...
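For comparison with the ReLU program above, a minimal sketch of the logistic (sigmoid) activation, assuming the standard definition sigmoid(x) = 1 / (1 + e^(-x)):

import math

def sigmoid(x):
    # Logistic (sigmoid) activation: maps any real x into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))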
Activation for Output Layers
Continue...
Activation for Output Layers
Continue...
How to choose an output layer activation function
Binary, Multiclass, Multilabel Classification
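As a rough sketch of the usual pairing (assuming the common convention, not a rule from this deck): sigmoid is used for binary and multilabel outputs, where each unit gives an independent probability, while softmax is used for multiclass outputs, where the class probabilities must sum to 1.

import math

def sigmoid(x):
    # Binary / multilabel output: one independent probability per output unit
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    # Multiclass output: probabilities over all classes sum to 1
    shift = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - shift) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]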
Single layer feed forward neural network
Continue...
https://www.renom.jp/notebooks/tutorial/beginners_guide/feedforward_example_2/notebook.html
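A minimal sketch of a single layer feed forward pass, assuming one weight row and one bias per output neuron and a sigmoid activation (the names and numbers here are illustrative, not taken from the linked example):

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def single_layer_forward(inputs, weights, biases):
    # Each output neuron: weighted sum of inputs plus bias, passed through sigmoid
    outputs = []
    for w_row, b in zip(weights, biases):
        z = sum(w * x for w, x in zip(w_row, inputs)) + b
        outputs.append(sigmoid(z))
    return outputs

# Example: 2 inputs feeding 1 output neuron
print(single_layer_forward([0.5, -1.0], [[0.8, 0.2]], [0.1]))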
Multilayer feed forward neural network
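A minimal sketch of a multilayer feed forward pass, assuming the single layer routine above is simply chained so that the output of one layer becomes the input of the next (layer sizes and weights are illustrative):

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer_forward(inputs, weights, biases):
    # Forward pass through one fully connected layer
    return [sigmoid(sum(w * x for w, x in zip(w_row, inputs)) + b)
            for w_row, b in zip(weights, biases)]

def mlp_forward(inputs, layers):
    # layers is a list of (weights, biases) pairs; each layer feeds the next
    for weights, biases in layers:
        inputs = layer_forward(inputs, weights, biases)
    return inputs

# Example: 2 inputs -> hidden layer of 2 neurons -> 1 output neuron
layers = [([[0.5, -0.3], [0.8, 0.2]], [0.1, -0.1]),   # hidden layer
          ([[1.0, -1.0]], [0.0])]                     # output layer
print(mlp_forward([0.6, 0.9], layers))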
Disadvantage of FFNN
Recurrent neural network
Continue...
H0 = initial hidden state value.
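As a sketch, assuming the common simple (Elman-style) recurrence h_t = tanh(W_x * x_t + W_h * h_(t-1) + b), with H0 supplying the hidden value at the first step (a single-unit toy example, not the deck's exact notation):

import math

def rnn_step(x_t, h_prev, w_x, w_h, b):
    # One recurrent step: the new hidden value depends on the current input
    # and on the previous hidden value
    return math.tanh(w_x * x_t + w_h * h_prev + b)

def rnn_forward(xs, h0, w_x, w_h, b):
    # h0 = initial hidden value; each step feeds its output into the next step
    h = h0
    history = []
    for x_t in xs:
        h = rnn_step(x_t, h, w_x, w_h, b)
        history.append(h)
    return history

print(rnn_forward([0.5, -0.2, 0.9], h0=0.0, w_x=0.7, w_h=0.4, b=0.1))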
Continue...