Partha Pratim Deb
M.Tech (CSE), 1st year
Netaji Subhash Engineering College
• Biological inspiration vs. artificial neural network
• Why Use Neural Networks?
• Neural network applications
• Learning strategy & learning techniques
• Generalization types
• Artificial neurons
• MLP neural networks and tasks
• Learning mechanism used by the multilayer perceptron
• Activation functions
• Multi-Layer Perceptron example for approximation
The McCulloch-Pitts model

[Figure: neurotransmission in a biological neuron]
Learning strategy:
1. Supervised learning
2. Unsupervised learning
[Figure: data points from two classes, A and B, scattered in the input space]
 It is based on a labeled training set.
 The class of each piece of data in the training set is known.
 Class labels are pre-determined and provided in the training phase.

[Figure: training points plotted with their class labels (ε class, λ class)]
Supervised learning
 Task performed: Classification, Pattern Recognition
 NN model: Perceptron, Feed-forward NN
"class of data is defined here"

Unsupervised learning
 Task performed: Clustering
 NN model: Self-Organizing Maps
"class of data is not defined here"
1. Linear
2. Nonlinear
Nonlinear generalization of the McCulloch-Pitts neuron: $y = f(x, w)$

Sigmoidal neuron: $y = \dfrac{1}{1 + e^{-w^T x - a}}$

Gaussian neuron: $y = e^{-\frac{\| x - w \|^2}{2a^2}}$
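As a quick numeric sketch of these two neuron models (assuming NumPy; the particular x, w, and a values are illustrative, not from the slides):

```python
import numpy as np

def sigmoidal_neuron(x, w, a):
    # y = 1 / (1 + exp(-(w^T x) - a))
    return 1.0 / (1.0 + np.exp(-(w @ x) - a))

def gaussian_neuron(x, w, a):
    # y = exp(-||x - w||^2 / (2 a^2))
    return np.exp(-np.linalg.norm(x - w) ** 2 / (2.0 * a ** 2))

x = np.array([0.5, -0.2])
w = np.array([1.0, 2.0])
a = 0.1

print(sigmoidal_neuron(x, w, a))  # value in (0, 1)
print(gaussian_neuron(x, w, a))   # peaks at 1 when x == w
```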
MLP = multi-layer perceptron

Perceptron: $y_{out} = w^T x$

MLP neural network (two sigmoidal layers followed by a linear output):

First layer: $y_k^1 = \dfrac{1}{1 + e^{-w_k^{1T} x - a_k^1}}, \; k = 1, 2, 3$, collected as $y^1 = (y_1^1, y_2^1, y_3^1)^T$

Second layer: $y_k^2 = \dfrac{1}{1 + e^{-w_k^{2T} y^1 - a_k^2}}, \; k = 1, 2$, collected as $y^2 = (y_1^2, y_2^2)^T$

Output: $y_{out} = \sum_{k=1}^{2} w_k^3 y_k^2 = w^{3T} y^2$
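A minimal forward-pass sketch of this 3-then-2-unit MLP in Python/NumPy; the input dimension and the randomly drawn weights are placeholders standing in for trained values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1, a1 = rng.normal(size=(3, 2)), rng.normal(size=3)  # 2 inputs -> 3 hidden
W2, a2 = rng.normal(size=(2, 3)), rng.normal(size=2)  # 3 hidden -> 2 hidden
w3 = rng.normal(size=2)                               # 2 hidden -> linear output

def mlp_forward(x):
    y1 = sigmoid(W1 @ x + a1)   # first sigmoidal layer
    y2 = sigmoid(W2 @ y1 + a2)  # second sigmoidal layer
    return w3 @ y2              # linear output: y_out = w3^T y2

print(mlp_forward(np.array([0.5, -1.0])))
```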
• control
• classification
• prediction
• approximation

These can all be reformulated in general as FUNCTION APPROXIMATION tasks.

Approximation: given a set of values of a function g(x), build a neural network that approximates the g(x) values for any input x.
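As an illustration of such an approximation task, here is a short sketch assuming scikit-learn is available; the target g(x) = sin(x) and the network size are arbitrary choices, not from the slides:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Sampled values of a target function g(x) = sin(x) on [0, 2*pi]
X = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()

# One hidden layer of sigmoidal units, trained to approximate g
net = MLPRegressor(hidden_layer_sizes=(20,), activation='logistic',
                   solver='lbfgs', max_iter=5000, random_state=0)
net.fit(X, y)
print(net.predict(np.array([[np.pi / 2]])))  # approx. sin(pi/2) = 1
```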
The activation function is applied to "curve" the input data, introducing the nonlinearity through which the network captures variation in the input.
Sigmoidal (logistic) function, common in MLPs:

$g(a_i(t)) = \dfrac{1}{1 + \exp(-k\, a_i(t))} = \dfrac{1}{1 + e^{-k a_i(t)}}$

where k is a positive constant. The sigmoidal function gives a value in the range 0 to 1. Alternatively, tanh(ka) can be used, which has the same shape but a range of -1 to 1. This is the input-output function of a neuron (rate-coding assumption).

Note: when net = 0, f = 0.5
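A small check of these properties (a sketch, assuming NumPy): g(0) = 0.5, and tanh(ka) coincides with the logistic at twice the slope constant, rescaled from (0, 1) to (-1, 1):

```python
import numpy as np

def g(a, k=1.0):
    # logistic activation: 1 / (1 + exp(-k a)), output in (0, 1)
    return 1.0 / (1.0 + np.exp(-k * a))

print(g(0.0))  # 0.5: when the net input is 0, the output is 0.5

# tanh(k*a) has the same shape but range (-1, 1); in fact
# tanh(k*a) == 2*g(a, 2*k) - 1 for all a:
a, k = np.linspace(-3, 3, 7), 0.8
print(np.allclose(np.tanh(k * a), 2 * g(a, 2 * k) - 1))  # True
```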
Multi-Layer Perceptron example for approximation
Algorithm (sequential):

1. Apply an input vector and calculate all activations, a and u.

2. Evaluate $\Delta_k$ for all output units via:

$\Delta_i(t) = (d_i(t) - y_i(t))\, g'(a_i(t))$

(Note the similarity to the perceptron learning algorithm.)

3. Backpropagate the $\Delta_k$'s to get the error terms $\delta$ for the hidden layers using:

$\delta_i(t) = g'(u_i(t)) \sum_k \Delta_k(t)\, w_{ki}$

4. Evaluate the weight changes using:

$v_{ij}(t+1) = v_{ij}(t) + \eta\, \delta_i(t)\, x_j(t)$

$w_{ij}(t+1) = w_{ij}(t) + \eta\, \Delta_i(t)\, z_j(t)$
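A minimal sketch of one sequential update implementing steps 1-4 for a single-hidden-layer network (logistic activation assumed; all names and the zero-initialized weights are illustrative):

```python
import numpy as np

def g(a):        # logistic activation
    return 1.0 / (1.0 + np.exp(-a))

def g_prime(a):  # derivative of the logistic: g(a) * (1 - g(a))
    s = g(a)
    return s * (1.0 - s)

def sequential_step(x, d, V, b1, W, b2, eta=0.1):
    # 1. Apply an input vector and calculate all activations u and a.
    u = V @ x + b1
    z = g(u)
    a = W @ z + b2
    y = g(a)
    # 2. Evaluate Delta for all output units.
    Delta = (d - y) * g_prime(a)
    # 3. Backpropagate to get the hidden error terms delta.
    delta = g_prime(u) * (W.T @ Delta)
    # 4. Evaluate the weight (and bias) changes.
    W += eta * np.outer(Delta, z); b2 += eta * Delta
    V += eta * np.outer(delta, x); b1 += eta * delta
    return y

x = np.array([0.0, 1.0]); d = np.array([1.0, 0.0])
V = np.zeros((2, 2)); b1 = np.ones(2)
W = np.zeros((2, 2)); b2 = np.ones(2)
print(sequential_step(x, d, V, b1, W, b2))
```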
Here I have used a simple identity activation function, with an example, to show how a neural network works.
Once weight changes are computed for all units, the weights are updated at the same time (biases are included as weights here). An example:

Network: inputs x1, x2; two hidden units; outputs y1, y2.
First-layer weights: v11 = -1, v21 = 0, v12 = 0, v22 = 1; biases v10 = 1, v20 = 1.
Second-layer weights: w11 = 1, w21 = -1, w12 = 0, w22 = 1.

Have input [0 1] with target [1 0].
Use the identity activation function (i.e., g(a) = a).
All biases set to 1; they will not be drawn, for clarity. Learning rate η = 0.1.

Have input [0 1] (x1 = 0, x2 = 1) with target [1 0].
Forward pass. Calculate 1st-layer activations:

u1 = -1x0 + 0x1 + 1 = 1
u2 = 0x0 + 1x1 + 1 = 2
Calculate first-layer outputs by passing the activations through the activation functions:

z1 = g(u1) = 1
z2 = g(u2) = 2
Calculate 2nd-layer outputs (weighted sums through the activation functions):

y1 = a1 = 1x1 + 0x2 + 1 = 2
y2 = a2 = -1x1 + 1x2 + 1 = 2
Backward pass:

Target = [1, 0], so d1 = 1 and d2 = 0. So:

∆1 = (d1 - y1) = 1 - 2 = -1
∆2 = (d2 - y2) = 0 - 2 = -2
Calculate weight changes for the 2nd (output) layer, using the ∆'s and the hidden outputs z1 = 1, z2 = 2 (cf. perceptron learning):

∆1 z1 = -1
∆1 z2 = -2
∆2 z1 = -2
∆2 z2 = -4
The weight changes, scaled by η = 0.1, give:

w11 = 1 + 0.1 x (-1) = 0.9
w21 = -1 + 0.1 x (-2) = -1.2
w12 = 0 + 0.1 x (-2) = -0.2
w22 = 1 + 0.1 x (-4) = 0.6
But before the first-layer weights can be changed, we must calculate the δ's. The ∆'s are multiplied by the old output-layer weights:

∆1 w11 = -1,  ∆2 w21 = 2
∆1 w12 = 0,   ∆2 w22 = -2
∆'s propagate back:

δ1 = ∆1 w11 + ∆2 w21 = -1 + 2 = 1
δ2 = ∆1 w12 + ∆2 w22 = 0 - 2 = -2
And are multiplied by the inputs (x1 = 0, x2 = 1):

δ1 x1 = 0
δ1 x2 = 1
δ2 x1 = 0
δ2 x2 = -2
Finally, change the weights (η = 0.1):

v11 = -1 (unchanged), v21 = 0 (unchanged)
v12 = 0 + 0.1 x 1 = 0.1
v22 = 1 + 0.1 x (-2) = 0.8
w11 = 0.9, w21 = -1.2, w12 = -0.2, w22 = 0.6 (as computed above)

Note that the weights multiplied by the zero input are unchanged, as they do not contribute to the error. We have also changed the biases (not shown).
Now go forward again (normally a new input vector would be used). With the updated weights and biases, the 1st-layer outputs are:

z1 = 1.2
z2 = 1.6
Passing these through the updated 2nd layer:

y1 = 1.66
y2 = 0.32

The outputs are now closer to the target value [1, 0].
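As a sanity check, this short script (assuming NumPy) reproduces every number in the worked example: identity activation, all biases initialized to 1, η = 0.1:

```python
import numpy as np

x = np.array([0.0, 1.0]); d = np.array([1.0, 0.0])        # input and target
V = np.array([[-1.0, 0.0], [0.0, 1.0]]); b1 = np.ones(2)  # v11,v12 / v21,v22
W = np.array([[1.0, 0.0], [-1.0, 1.0]]); b2 = np.ones(2)  # w11,w12 / w21,w22
eta = 0.1

# Forward pass (identity activation: z = u, y = a)
u = V @ x + b1            # [1, 2]
z = u
y = W @ z + b2            # [2, 2]

# Backward pass (g'(a) = 1 for the identity)
Delta = d - y             # [-1, -2]
delta = W.T @ Delta       # [1, -2]

# Weight and bias updates
W += eta * np.outer(Delta, z); b2 += eta * Delta
V += eta * np.outer(delta, x); b1 += eta * delta
print(W)                  # [[0.9, -0.2], [-1.2, 0.6]]
print(V)                  # [[-1.0, 0.1], [0.0, 0.8]]

# Second forward pass with the updated weights and biases
z = V @ x + b1            # [1.2, 1.6]
y = W @ z + b2            # [1.66, 0.32] -- closer to the target [1, 0]
print(z, y)
```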
Neural network applications

Pattern Classification application examples:
• Remote sensing and image classification
• Handwritten character/digit recognition

Control, Time Series, Estimation:
• Machine control / robot manipulation
• Financial/scientific/engineering time-series forecasting

Optimization:
• Traveling salesperson
• Multiprocessor scheduling and task assignment

Real-world application examples:
• Hospital patient stay-length prediction
• Natural gas price prediction
• Artificial neural networks are inspired by the learning
processes that take place in biological systems.
• Learning can be perceived as an optimisation process.
• Biological neural learning happens by the modification
of the synaptic strength. Artificial neural networks learn
in the same way.
• The synapse strength modification rules for artificial
neural networks can be derived by applying
mathematical optimisation methods.
• Learning tasks of artificial neural networks = function
approximation tasks.
• The optimisation is done with respect to the approximation
error measure.
• In general it is enough to have a single hidden layer neural
network (MLP, RBF or other) to learn the approximation of
a nonlinear function. In such cases general optimisation can
be applied to find the change rules for the synaptic weights.
1. Simon Haykin, Artificial Neural Network.
2. Yegnanarayana, Artificial Neural Network.
3. Zurada, Artificial Neural Network.
4. Hornik, K., Stinchcombe, M., and White, H., "Multilayer feedforward networks are universal approximators", Neural Networks, vol. 2, no. 5, pp. 359-366, 1989.
5. Kumar, P. and Walia, E. (2006), "Cash Forecasting: An Application of Artificial Neural Networks in Finance", International Journal of Computer Science and Applications, 3(1): 61-77.