-Neural networks were inspired by the design and functioning of the human brain and its components.
-Definition:
-An information processing model inspired by the way the biological nervous system (i.e., the brain) processes information.
-An ANN is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve problems.
-It is configured for specific applications, such as pattern recognition and data classification, through a learning process.
-Typically about 85-90% accurate on such tasks.
Advantages of Neural Networks
• A neural network can be an "expert" in analyzing the category of information given to it.
• Answers "what-if" questions.
• Adaptive learning
  – Ability to learn how to do tasks based on the data given for training or initial experience.
• Self-organization
  – Creates its own organization or representation of the information it receives during learning.
• Real-time operation
  – Computations can be carried out in parallel.
• Fault tolerance via redundant information coding
  – Partial destruction of a neural network causes only a degradation of performance.
  – In some cases, performance can be retained even after major network damage.
• In the future, neural networks may also be used to give spoken words as instructions to machines.
[Figure: the multidisciplinary point of view of neural networks.]
Application Scope of Neural Networks
• Air traffic control
• Animal behavior
• Appraisal and valuation of property, etc.
• Betting on horse races, stock markets
• Criminal sentencing
• Complex physical and chemical processes
• Data mining, cleaning and validation
• Direct mail advertisers
• Echo patterns
• Economic modeling
• Employee hiring
• Expert consultants
• Fraud detection
• Handwriting and typewriting
• Lake water levels
• Machinery controls
• Medical diagnosis
• Music composition
• Photos and fingerprints
• Recipes and chemical formulations
• Traffic flows
• Weather prediction
Fuzzy Logic
• Introduced by Lotfi Zadeh, professor at the University of California.
• An organized method for dealing with imprecise data.
• Fuzzy logic includes 0 and 1 as extreme cases of truth (or "the state of matters" or "fact") but also includes the various states of truth in between, so that, for example, the result of a comparison between two things could be not "tall" or "short" but "0.38 of tallness."
• Allows partial membership.
• Implemented in everything from small, embedded microcontrollers to large, networked, multichannel PCs or workstations.
• Can be implemented in hardware, software, or both.
• It mimics how a person would make decisions.
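To make the "0.38 of tallness" example concrete, here is a minimal sketch of a fuzzy membership function in Python. The piecewise-linear shape and the 150-190 cm anchor points are illustrative assumptions, not values from the slides.

```python
def tallness(height_cm: float) -> float:
    """Degree of membership in the fuzzy set 'tall', between 0.0 and 1.0."""
    if height_cm <= 150:
        return 0.0                         # fully outside the set
    if height_cm >= 190:
        return 1.0                         # fully inside the set
    return (height_cm - 150) / 40.0        # partial membership in between

print(tallness(165.2))                     # -> 0.38 "of tallness"
```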
Genetic Algorithm
• Modeled on how the genes of parents combine to form those of their children.
• Create an initial population of individuals representing possible solutions to a problem.
• Individual characteristics determine whether members are less or more fit within the population.
• The more fit members are selected with higher probability.
• Very effective at finding optimal or near-optimal solutions.
• A generate-and-test strategy (see the sketch after this list).
• Differs from normal optimization and search procedures in that it:
  • Works with a coding of the parameter set
  • Works with multiple points
  • Searches via sampling (a blind search)
  • Searches using stochastic operators
• Used in business, scientific and engineering circles, etc.
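The following is a minimal sketch of the generate-and-test strategy above. The bitstring encoding, toy target, population size and mutation rate are illustrative assumptions, not details from the slides.

```python
import random

TARGET = 0b101101                       # toy goal: evolve a 6-bit pattern

def fitness(ind: int) -> int:
    return 6 - bin(ind ^ TARGET).count("1")   # matching bits -> fitter

def select(pop):
    # more fit members are picked with higher probability (roulette-style)
    return random.choices(pop, weights=[fitness(i) + 1 for i in pop], k=2)

def crossover(a: int, b: int) -> int:
    point = random.randint(1, 5)        # combine parent "genes" at a cut point
    mask = (1 << point) - 1
    return (a & mask) | (b & ~mask)

def mutate(ind: int) -> int:
    # occasional random bit flip: a stochastic search operator
    return ind ^ (1 << random.randrange(6)) if random.random() < 0.1 else ind

pop = [random.randrange(64) for _ in range(20)]    # initial population
for _ in range(100):                               # generate and test
    pop = [mutate(crossover(*select(pop))) for _ in range(len(pop))]

best = max(pop, key=fitness)
print(bin(best), fitness(best))         # optimal or near-optimal after 100 gens
```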
Hybrid System
• Three types.
• Neuro-fuzzy hybrid system
  • A combination of fuzzy set theory and neural networks.
  • The fuzzy system deals with explicit knowledge that can be explained and understood.
  • The neural network deals with implicit knowledge acquired by learning.
  • Advantages are:
    • Handles any kind of information
    • Manages imprecise, partial, vague or imperfect information
    • Resolves conflicts by collaboration and aggregation
    • Self-learning, self-organizing and self-tuning capability
    • No need for prior knowledge of relationships in the data
    • Mimics the human decision-making system
Contd..
• Neuro-genetic hybrid system
  • Topology optimization
    • Select a topology for the ANN; a common one is back-propagation.
  • Genetic-algorithm training
    • Learning of the ANN is formulated as a weight-optimization problem, usually with mean squared error as the fitness measure.
  • Control-parameter optimization
    • Learning rate, momentum rate, tolerance level, etc., are optimized using the GA.
• Fuzzy-genetic hybrid system
  • Creates the classification rules for a fuzzy system where objects are classified by linguistic terms.
Soft Computing
• The two major problem-solving techniques are:
  • Hard computing
    • Deals with precise models where accurate solutions are achieved.
  • Soft computing
    • Deals with approximate models to give solutions for complex problems.
• Introduced by Prof. Lotfi Zadeh.
• Ultimate goal: emulate the human mind.
• It is a combination of GA, neural networks and FL.
Artificial Neural Network: An Introduction
Resembles the characteristics of a biological neural network.
Nodes – interconnected processing elements (units or neurons).
Each neuron is connected to others by connection links.
Each connection link is associated with a weight that carries information about the input signal.
ANN processing elements are called neurons or artificial neurons, since they have the capability to model networks of original neurons as found in the brain.
The internal state of a neuron is called its activation or activity level, which is a function of the inputs the neuron receives.
A neuron can send only one signal at a time.
Basic Operation of a Neural Net
X1 and X2 – input neurons.
Y – output neuron.
W1 and W2 – weighted interconnection links.
The net input calculation is:
    yin = x1w1 + x2w2
The output is:
    y = f(yin)
where f is the function applied over the net input.
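A minimal sketch of this two-input net in Python; the binary step activation with threshold 0 and the weight values are assumed purely for illustration.

```python
def net_input(x1, x2, w1, w2):
    return x1 * w1 + x2 * w2               # yin = x1*w1 + x2*w2

def activation(yin, theta=0.0):
    return 1 if yin >= theta else 0        # y = f(yin), a binary step here

yin = net_input(x1=1, x2=0, w1=0.6, w2=-0.4)
print(yin, activation(yin))                # -> 0.6 1 (the neuron fires)
```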
Contd…
The function applied over the net input is called the activation function.
The weight in an ANN corresponds to the slope m of a straight line through the origin (y = mx).
Biological Neural Network
• Has three main parts:
  • Soma or cell body – where the cell nucleus is located.
  • Dendrites – where the nerve is connected to the cell body.
  • Axon – which carries the impulses of the neuron.
• An electric impulse is passed between synapses and dendrites.
• Synapse – the axon splits into strands, and the strands terminate in small bulb-like organs called synapses.
• Transmission at the synapse is a chemical process which results in an increase/decrease of the electric potential inside the body of the receiving cell.
• If the electric potential reaches a threshold value, the receiving cell fires, and a pulse/action potential of fixed strength and duration is sent through the axon to the synaptic junctions of other cells.
• After firing, the cell has to wait for a period called the refractory period.
Contd..
In this model the net input is calculated by
    yin = x1w1 + x2w2 + ... + xnwn = Σ xiwi   (i = 1 to n)
Terminology Relation Between Biological and Artificial Neurons

Biological Neuron     Artificial Neuron
Cell                  Neuron
Dendrites             Weights or interconnections
Soma                  Net input
Axon                  Output
Brain vs. Computer

Speed:
  Brain – execution time is a few milliseconds.
  Computer – execution time is a few nanoseconds.
Processing:
  Brain – performs massive parallel operations simultaneously.
  Computer – performs several parallel operations simultaneously; it is faster than the biological neuron.
Size and complexity:
  Brain – the number of neurons is about 10^11 and the number of interconnections is about 10^15, so the complexity of the brain is higher than that of the computer.
  Computer – depends on the chosen application and the network designer.
Storage capacity:
  Brain – (i) information is stored in interconnections or in synapse strength; (ii) new information is stored without destroying old information; (iii) sometimes fails to recollect information.
  Computer – (i) information is stored in contiguous memory locations; (ii) overloading may destroy older locations; (iii) information can be easily retrieved.
Contd…

Tolerance:
  Brain – (i) fault tolerant; (ii) can store and retrieve information even if interconnections fail; (iii) accepts redundancies.
  Computer – (i) no fault tolerance; (ii) information is corrupted if the network connections are disconnected; (iii) no redundancies.
Control mechanism:
  Brain – depends on active chemicals; neuron connections may be strong or weak.
  Computer – the CPU; the control mechanism is very simple.
Characteristics of ANN
• A neurally implemented mathematical model.
• A large number of processing elements called neurons exist here.
• Interconnections with weighted links hold informative knowledge.
• Input signals arrive at the processing elements through connections and connection weights.
• Processing elements can learn, recall and generalize from the given data.
• Computational power is determined by the collective behavior of the neurons.
• ANNs are also known as connectionist models, parallel distributed processing models, self-organizing systems, neuro-computing systems and neuromorphic systems.
Evolution of Neural Networks

1943 – McCulloch-Pitts neuron (McCulloch and Pitts): an arrangement of neurons that is a combination of logic gates. Its unique feature is the threshold.
1949 – Hebb network (Hebb): if two neurons are active, then their connection strength should be increased.
1958, 1959, 1960, 1962, 1988 – Perceptron and Adaline (Frank Rosenblatt, Block, Minsky and Papert; Widrow and Hoff): weights are adjusted to reduce the difference between the net input to the output unit and the desired output.
Contd…

1972 – Kohonen self-organizing feature map (Kohonen): inputs are clustered to obtain a fired output neuron.
1982, 1984, 1985, 1986, 1987 – Hopfield network (John Hopfield and Tank): based on fixed weights; can act as associative memory nets.
1986 – Back-propagation network (Rumelhart, Hinton and Williams): (i) multilayered; (ii) error is propagated backward from the output to the hidden units.
Contd..

1988 – Counter-propagation network (Grossberg): similar to the Kohonen network.
1987-1990 – Adaptive Resonance Theory (ART) (Carpenter and Grossberg): designed for binary and analog inputs.
1988 – Radial basis function network (Broomhead and Lowe): resembles the back-propagation network, but the activation function used is a Gaussian function.
1988 – Neocognitron (Fukushima): for character recognition.
Basic Models of ANN
• Models are based on three entities:
  • The model's synaptic interconnections.
  • The training or learning rules adopted for updating and adjusting the connection weights.
  • Their activation functions.
• The arrangement of neurons into layers and the connection pattern formed within and between layers is called the network architecture.
• Five types:
  • Single-layer feed-forward network
  • Multilayer feed-forward network
  • Single node with its own feedback
  • Single-layer recurrent network
  • Multilayer recurrent network
Single-Layer Feed-Forward Network
A layer is formed by taking processing elements and combining them with other processing elements.
The input and output layers are linked with each other.
The inputs are connected to the processing nodes with various weights, resulting in a series of outputs, one per node.
Multilayer Feed-Forward Network
• Formed by the interconnection of several layers.
• The input layer receives input and buffers the input signal.
• The output layer generates the output.
• A layer between the input and output is called a hidden layer.
• The hidden layer is internal to the network.
• There are zero to several hidden layers in a network.
• The more hidden layers, the higher the complexity of the network, but the more efficient the output produced. A minimal forward pass through such a network is sketched below.
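The following is a sketch of one forward pass through a network with a single hidden layer; the layer sizes, sigmoid activation and random weights are illustrative assumptions, not values from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(3, 4))   # input layer (3 units) -> hidden layer (4)
W_output = rng.normal(size=(4, 2))   # hidden layer (4 units) -> output layer (2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.0, 0.25])      # the input layer buffers the input signal
h = sigmoid(x @ W_hidden)            # the hidden layer is internal to the network
y = sigmoid(h @ W_output)            # the output layer generates the output
print(y)                             # two output activations in (0, 1)
```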
Feedback Network
• If no neuron in the output layer is an input to a node in the same layer or a preceding layer, it is a feed-forward network.
• If outputs are directed back as inputs to processing elements in the same layer or a preceding layer, it is a feedback network.
• If the outputs are directed back to the inputs of the same layer, it is lateral feedback.
• Recurrent networks are feedback networks with a closed loop.
• Fig 2.8 (A) – a simple recurrent neural network having a single neuron with feedback to itself.
• Fig 2.9 – a single-layer network with feedback, where the output can be directed back to the processing element itself, to other processing elements, or both.
• Maxnet – competitive interconnections having fixed weights.
• On-center-off-surround (lateral inhibition) structure – each processing neuron receives two different classes of inputs: "excitatory" input from nearby processing elements and "inhibitory" input from more distantly located processing elements. This type of interconnection is shown in the figure below.
Processing element outputs can be directed back to nodes in a preceding layer, forming a multilayer recurrent network.
Processing element outputs can also be directed to the processing element itself or to other processing elements in the same layer.
Learning
The two broad kinds of learning in ANNs are:
i) Parameter learning – updates the connecting weights in a neural net.
ii) Structure learning – focuses on changes in the network structure.
Apart from these, learning in ANNs is classified into three categories:
 i) supervised learning
 ii) unsupervised learning
 iii) reinforcement learning
Supervised Learning
• Learning with the help of a teacher.
• Example: the learning process of a small child.
  • The child doesn't know how to read/write.
  • The child's each and every action is supervised by a teacher.
• In an ANN, each input vector requires a corresponding target vector, which represents the desired output.
• The input vector along with the target vector is called a training pair.
• The input vector results in an output vector.
• The actual output vector is compared with the desired output vector.
• If there is a difference, an error signal is generated by the network.
• The error signal is used for adjustment of weights until the actual output matches the desired output, as in the sketch below.
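A minimal sketch of this compare-and-adjust loop, assuming a perceptron-style update rule purely for illustration (the slides do not fix a particular rule); the learning rate and training pair are also assumed values.

```python
def train_step(weights, bias, x, target, lr=0.1):
    # actual output from a step activation on the net input
    actual = 1 if sum(w * xi for w, xi in zip(weights, x)) + bias >= 0 else -1
    error = target - actual                  # error signal: desired vs. actual
    weights = [w + lr * error * xi for w, xi in zip(weights, x)]
    bias = bias + lr * error                 # adjust weights toward the target
    return weights, bias

# one training pair whose actual output (+1) differs from its target (-1)
w, b = train_step([0.0, 0.0], 0.0, x=[1, -1], target=-1)
print(w, b)                                  # -> [-0.2, 0.2] -0.2
```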
Unsupervised Learning
• Learning is performed without the help of a teacher.
• Example: a tadpole learns to swim by itself.
• In an ANN, during the training process, the network receives input patterns and organizes them to form clusters.
• From the figure it can be observed that no feedback is applied from the environment to inform the network what the outputs should be or whether they are correct.
• The network itself discovers patterns, regularities, features/categories from the input data and relations of the input data to the output.
• Exact clusters are formed by discovering similarities and dissimilarities, hence the name self-organizing.
Reinforcement Learning
• Similar to supervised learning.
• Learning based on critic information is called reinforcement learning, and the feedback sent is called the reinforcement signal.
• The network receives some feedback from the environment.
• The feedback is only evaluative.
• The external reinforcement signals are processed in a critic signal generator, and the obtained critic signals are sent to the ANN for proper adjustment of weights, so as to improve the critic feedback in the future.
Activation Functions
• To make work more efficient and to obtain an exact output, some force or activation is applied.
• Likewise, an activation function is applied over the net input to calculate the output of an ANN.
• Information processing in a processing element has two major parts: input and output.
• An integration function (f) is associated with the input of a processing element.
• There are several activation functions.
  1. Identity function:
     A linear function defined as
        f(x) = x for all x
     The output is the same as the input.
  2. Binary step function:
     Defined as
        f(x) = 1 if x >= θ
             = 0 if x < θ
     where θ represents the threshold value.
     It is used in single-layer nets to convert the net input to an output that is binary (0 or 1).
Contd..
3. Bipolar step function:
   Defined as
      f(x) = 1 if x >= θ
           = -1 if x < θ
   where θ represents the threshold value.
   Used in single-layer nets to convert the net input to an output that is bipolar (+1 or -1).
4. Sigmoid function
   Used in back-propagation nets.
   Two types:
   a) Binary sigmoid function
      - Also called the logistic sigmoid function or unipolar sigmoid function.
      - Defined as
           f(x) = 1 / (1 + e^(-λx))
        where λ is the steepness parameter.
      - The derivative of this function is
           f'(x) = λ f(x)[1 - f(x)]
        The range of the sigmoid function is 0 to 1.
Contd..
b) Bipolar sigmoid function
   - Defined as
        f(x) = (1 - e^(-λx)) / (1 + e^(-λx)) = 2 / (1 + e^(-λx)) - 1
     where λ is the steepness parameter; the sigmoid range is between -1 and +1.
   - The derivative of this function is
        f'(x) = (λ/2)[1 + f(x)][1 - f(x)]
   - It is closely related to the hyperbolic tangent function, which is written as
        h(x) = (e^x - e^(-x)) / (e^x + e^(-x))
Contd..
The derivative of the hyperbolic tangent function is
   h'(x) = [1 + h(x)][1 - h(x)]
5. Ramp function
   Defined as
      f(x) = 1 if x > 1
           = x if 0 <= x <= 1
           = 0 if x < 0
Graphical representations of all these functions are given in the figure that follows; a code sketch of them is given below.
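The five activation functions above can be sketched in Python as follows; the defaults θ = 0 and λ = 1 are assumed for illustration.

```python
import math

def identity(x):
    return x                                        # f(x) = x

def binary_step(x, theta=0.0):
    return 1 if x >= theta else 0                   # output 0 or 1

def bipolar_step(x, theta=0.0):
    return 1 if x >= theta else -1                  # output +1 or -1

def binary_sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + math.exp(-lam * x))         # range 0 to 1

def bipolar_sigmoid(x, lam=1.0):
    return 2.0 / (1.0 + math.exp(-lam * x)) - 1.0   # range -1 to +1

def ramp(x):
    return 1.0 if x > 1 else (x if x >= 0 else 0.0)

for f in (identity, binary_step, bipolar_step,
          binary_sigmoid, bipolar_sigmoid, ramp):
    print(f.__name__, round(f(0.5), 4))
```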
Important Terminologies
Weights
• The weights contain information about the input signal.
• This information is used by the net to solve a problem.
• Weights are represented in terms of a matrix, called the connection matrix.
• If the weight matrix W contains all the elements of an ANN, then the set of all W matrices determines the set of all possible information-processing configurations.
• The ANN can be realized by finding an appropriate matrix W.
• Weights encode long-term memory (LTM), and the activation states of the network encode short-term memory (STM) in a neural network.
Contd..
Bias
• Bias has an impact on calculating the net input.
• The bias is included by adding a component x0 = 1 to the input vector x.
• The net input is then calculated by
     yin = b + Σ xiwi   (i = 1 to n)
• The bias is of two types:
  • Positive bias – increases the net input.
  • Negative bias – decreases the net input.
Contd..
Threshold
• A set value based upon which the final output is calculated.
• The calculated net input and the threshold are compared to get the network output.
• The activation function using the threshold is defined as
     f(net) = 1 if net >= θ
            = -1 if net < θ
  where θ is the fixed threshold value.
Contd..
• Learning rate
  • Denoted by α.
  • Controls the amount of weight adjustment at each step of training.
  • The learning rate ranges from 0 to 1.
  • Determines the rate of learning at each step.
• Momentum factor
  • Convergence is made faster if a momentum factor is added to the weight updating process.
  • This is done in the back-propagation network.
• Vigilance parameter
  • Denoted by ρ.
  • Used in the Adaptive Resonance Theory (ART) network.
  • Used to control the degree of similarity.
  • Ranges from 0.7 to 1 to perform useful work in controlling the number of clusters.
mCCulloCh-pItts neuron
Discovered in 1943.
Usually called as M-P neuron.
M-P neurons are connected by directed weighted paths.
Activation of M-P neurons is binary (i.e) at any time step the
 neuron may fire or may not fire.
Weights associated with communication links may be
 excitatory(wgts are positive)/inhibitory(wgts are negative).
Threshold plays major role here. There is a fixed threshold for
 each neuron and if the net input to the neuron is greater than
 the threshold then the neuron fires.
They are widely used in logic functions.

                                                               41
Contd…
• A simple M-P neuron is shown in the figure.
• Connections are excitatory with weight w (w > 0) or inhibitory with weight -p (p > 0).
• In the figure, inputs x1 to xn possess excitatory weighted connections, and inputs xn+1 to xn+m have inhibitory weighted interconnections.
• Since the firing of the neuron is based on a threshold, the activation function is defined as
     f(yin) = 1 if yin >= θ
            = 0 if yin < θ
Contd…
For inhibition to be absolute, the threshold with the activation function should satisfy the following condition:
     θ > nw - p
The neuron will fire if it receives k or more excitatory inputs but no inhibitory inputs, where
     kw >= θ > (k-1)w
- The M-P neuron has no particular training algorithm.
- An analysis is performed to determine the values of the weights and the threshold.
- It is used as a building block where any function or phenomenon is modeled based on a logic function, as sketched below.
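As an example of such an analysis, the weights w = 1 and threshold θ = 2 (standard choices assumed here, not stated on this slide) realize the two-input AND function with an M-P neuron:

```python
def mp_neuron(inputs, weights, theta):
    yin = sum(x * w for x, w in zip(inputs, weights))
    return 1 if yin >= theta else 0        # fires only when yin >= theta

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", mp_neuron([x1, x2], weights=[1, 1], theta=2))
# only the input (1, 1) fires, realizing the AND function
```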
Linear Separability
• A concept wherein the separation of the input space into regions is based on whether the network response is positive or negative.
• A decision line is drawn to separate the positive and negative responses.
• The decision line is also called the decision-making line, decision-support line or linearly separable line.
• The net input calculation for the output unit is given as
     yin = b + Σ xiwi   (i = 1 to n)
• The region that separates the responses, called the decision boundary, is determined by the relation
     b + Σ xiwi = 0
Contd..
• Consider a network having a positive response in the first quadrant and a negative response in all other quadrants, with either binary or bipolar data.
• A decision line is drawn separating the two regions, as shown in the figure and checked in the sketch below.
• Using bipolar data representation, missing data can be distinguished from mistaken data; hence bipolar data is better than binary data.
• Missing values are represented by 0 and mistakes by reversing the input values from +1 to -1 or vice versa.
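A minimal check of the decision boundary b + Σ xiwi = 0 on bipolar inputs; the bias b = -1 and weights w1 = w2 = 1 are assumed values chosen so that the positive response falls in the first quadrant:

```python
def response(x1, x2, b=-1.0, w1=1.0, w2=1.0):
    yin = b + x1 * w1 + x2 * w2            # yin = b + sum(xi * wi)
    return "+" if yin > 0 else "-"         # which side of the decision line

for point in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:   # bipolar inputs
    print(point, response(*point))         # only (1, 1) gives a "+" response
```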
Hebb Network
• Donald Hebb stated in 1949 that "in the brain, learning is performed by the change in the synaptic gap."
• "When an axon of cell A is near enough to excite cell B, and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
• According to the Hebb rule, the weight vector is found to increase proportionately to the product of the input and the learning signal.
• In Hebb learning, two interconnected neurons are 'on' simultaneously.
• The weight update in the Hebb rule is given by
     wi(new) = wi(old) + xiy
• It is better suited to bipolar data.
• If binary data is used, the weight-update formula cannot distinguish two conditions, namely:
  • A training pair in which an input unit is "on" and the target value is "off".
  • A training pair in which both the input unit and the target value are "off".
Flowchart of Training Algorithm
Steps:
• Step 0: First initialize the weights.
• Step 1: Steps 2-4 have to be performed for each input training vector and target output pair, s:t.
• Step 2: Input activations are set. The activation function for the input layer is the identity function:
     xi = si for i = 1 to n
• Step 3: Output activations are set.
• Step 4: Weight adjustments and bias adjustments are performed:
     wi(new) = wi(old) + xiy
     b(new) = b(old) + y
• In step 4, the weight-update formula can also be written in vector form as
     w(new) = w(old) + xy
• The change in weight is expressed as
     Δw = xy
  Hence,
     w(new) = w(old) + Δw
• The Hebb rule is used for pattern association, pattern categorization, pattern classification and over a range of other areas.
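The training steps above can be sketched directly in Python. The bipolar AND training set used to exercise it is an assumed illustration, not data from this slide.

```python
def hebb_train(samples):
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0                 # step 0: initialize weights and bias
    for s, t in samples:                  # step 1: for each training pair s:t
        x = s                             # step 2: set input activations, xi = si
        y = t                             # step 3: set the output activation
        w = [wi + xi * y for wi, xi in zip(w, x)]   # step 4: wi(new) = wi(old) + xi*y
        b = b + y                         #         b(new) = b(old) + y
    return w, b

AND = [([1, 1], 1), ([1, -1], -1), ([-1, 1], -1), ([-1, -1], -1)]
print(hebb_train(AND))                    # -> ([2.0, 2.0], -2.0)
```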

Mais conteúdo relacionado

Mais procurados

Neural Networks: Radial Bases Functions (RBF)
Neural Networks: Radial Bases Functions (RBF)Neural Networks: Radial Bases Functions (RBF)
Neural Networks: Radial Bases Functions (RBF)Mostafa G. M. Mostafa
 
Artificial Neural Networks - ANN
Artificial Neural Networks - ANNArtificial Neural Networks - ANN
Artificial Neural Networks - ANNMohamed Talaat
 
Feedforward neural network
Feedforward neural networkFeedforward neural network
Feedforward neural networkSopheaktra YONG
 
Introduction to Recurrent Neural Network
Introduction to Recurrent Neural NetworkIntroduction to Recurrent Neural Network
Introduction to Recurrent Neural NetworkKnoldus Inc.
 
Logics for non monotonic reasoning-ai
Logics for non monotonic reasoning-aiLogics for non monotonic reasoning-ai
Logics for non monotonic reasoning-aiShaishavShah8
 
Deep Learning With Neural Networks
Deep Learning With Neural NetworksDeep Learning With Neural Networks
Deep Learning With Neural NetworksAniket Maurya
 
Introduction to Deep Learning
Introduction to Deep LearningIntroduction to Deep Learning
Introduction to Deep LearningOswald Campesato
 
Machine Learning with Decision trees
Machine Learning with Decision treesMachine Learning with Decision trees
Machine Learning with Decision treesKnoldus Inc.
 
Artificial Neural Networks Lect3: Neural Network Learning rules
Artificial Neural Networks Lect3: Neural Network Learning rulesArtificial Neural Networks Lect3: Neural Network Learning rules
Artificial Neural Networks Lect3: Neural Network Learning rulesMohammed Bennamoun
 
Artificial neural network
Artificial neural networkArtificial neural network
Artificial neural networkDEEPASHRI HK
 
Genetic algorithm ppt
Genetic algorithm pptGenetic algorithm ppt
Genetic algorithm pptMayank Jain
 
Linear regression
Linear regressionLinear regression
Linear regressionMartinHogg9
 
Deep neural networks
Deep neural networksDeep neural networks
Deep neural networksSi Haem
 
Neural network & its applications
Neural network & its applications Neural network & its applications
Neural network & its applications Ahmed_hashmi
 
Artificial Neural Network
Artificial Neural NetworkArtificial Neural Network
Artificial Neural NetworkPrakash K
 
backpropagation in neural networks
backpropagation in neural networksbackpropagation in neural networks
backpropagation in neural networksAkash Goel
 

Mais procurados (20)

Neural Networks: Radial Bases Functions (RBF)
Neural Networks: Radial Bases Functions (RBF)Neural Networks: Radial Bases Functions (RBF)
Neural Networks: Radial Bases Functions (RBF)
 
Artificial Neural Networks - ANN
Artificial Neural Networks - ANNArtificial Neural Networks - ANN
Artificial Neural Networks - ANN
 
Feedforward neural network
Feedforward neural networkFeedforward neural network
Feedforward neural network
 
Introduction to Recurrent Neural Network
Introduction to Recurrent Neural NetworkIntroduction to Recurrent Neural Network
Introduction to Recurrent Neural Network
 
Soft computing
Soft computingSoft computing
Soft computing
 
Logics for non monotonic reasoning-ai
Logics for non monotonic reasoning-aiLogics for non monotonic reasoning-ai
Logics for non monotonic reasoning-ai
 
Deep Learning With Neural Networks
Deep Learning With Neural NetworksDeep Learning With Neural Networks
Deep Learning With Neural Networks
 
Introduction to Deep Learning
Introduction to Deep LearningIntroduction to Deep Learning
Introduction to Deep Learning
 
Machine Learning with Decision trees
Machine Learning with Decision treesMachine Learning with Decision trees
Machine Learning with Decision trees
 
Deep learning presentation
Deep learning presentationDeep learning presentation
Deep learning presentation
 
Artificial Neural Networks Lect3: Neural Network Learning rules
Artificial Neural Networks Lect3: Neural Network Learning rulesArtificial Neural Networks Lect3: Neural Network Learning rules
Artificial Neural Networks Lect3: Neural Network Learning rules
 
Artificial neural network
Artificial neural networkArtificial neural network
Artificial neural network
 
Mc culloch pitts neuron
Mc culloch pitts neuronMc culloch pitts neuron
Mc culloch pitts neuron
 
Genetic algorithm ppt
Genetic algorithm pptGenetic algorithm ppt
Genetic algorithm ppt
 
Linear regression
Linear regressionLinear regression
Linear regression
 
Deep neural networks
Deep neural networksDeep neural networks
Deep neural networks
 
Neural network & its applications
Neural network & its applications Neural network & its applications
Neural network & its applications
 
Perceptron & Neural Networks
Perceptron & Neural NetworksPerceptron & Neural Networks
Perceptron & Neural Networks
 
Artificial Neural Network
Artificial Neural NetworkArtificial Neural Network
Artificial Neural Network
 
backpropagation in neural networks
backpropagation in neural networksbackpropagation in neural networks
backpropagation in neural networks
 

Destaque

Soft computing (ANN and Fuzzy Logic) : Dr. Purnima Pandit
Soft computing (ANN and Fuzzy Logic)  : Dr. Purnima PanditSoft computing (ANN and Fuzzy Logic)  : Dr. Purnima Pandit
Soft computing (ANN and Fuzzy Logic) : Dr. Purnima PanditPurnima Pandit
 
Hebbian Learning
Hebbian LearningHebbian Learning
Hebbian LearningESCOM
 
Genetic Algorithms Made Easy
Genetic Algorithms Made EasyGenetic Algorithms Made Easy
Genetic Algorithms Made EasyPrakash Pimpale
 
An Introduction to Soft Computing
An Introduction to Soft ComputingAn Introduction to Soft Computing
An Introduction to Soft ComputingTameem Ahmad
 
Fuzzy logic application (aircraft landing)
Fuzzy logic application (aircraft landing)Fuzzy logic application (aircraft landing)
Fuzzy logic application (aircraft landing)Piyumal Samarathunga
 
Genetic Algorithm by Example
Genetic Algorithm by ExampleGenetic Algorithm by Example
Genetic Algorithm by ExampleNobal Niraula
 
Genetic algorithm
Genetic algorithmGenetic algorithm
Genetic algorithmgarima931
 
Chapter 5 - Fuzzy Logic
Chapter 5 - Fuzzy LogicChapter 5 - Fuzzy Logic
Chapter 5 - Fuzzy LogicAshique Rasool
 

Destaque (10)

Soft computing (ANN and Fuzzy Logic) : Dr. Purnima Pandit
Soft computing (ANN and Fuzzy Logic)  : Dr. Purnima PanditSoft computing (ANN and Fuzzy Logic)  : Dr. Purnima Pandit
Soft computing (ANN and Fuzzy Logic) : Dr. Purnima Pandit
 
Hebbian Learning
Hebbian LearningHebbian Learning
Hebbian Learning
 
Genetic Algorithms Made Easy
Genetic Algorithms Made EasyGenetic Algorithms Made Easy
Genetic Algorithms Made Easy
 
An Introduction to Soft Computing
An Introduction to Soft ComputingAn Introduction to Soft Computing
An Introduction to Soft Computing
 
Fuzzy logic application (aircraft landing)
Fuzzy logic application (aircraft landing)Fuzzy logic application (aircraft landing)
Fuzzy logic application (aircraft landing)
 
Genetic Algorithm by Example
Genetic Algorithm by ExampleGenetic Algorithm by Example
Genetic Algorithm by Example
 
Fuzzy logic ppt
Fuzzy logic pptFuzzy logic ppt
Fuzzy logic ppt
 
Genetic Algorithms
Genetic AlgorithmsGenetic Algorithms
Genetic Algorithms
 
Genetic algorithm
Genetic algorithmGenetic algorithm
Genetic algorithm
 
Chapter 5 - Fuzzy Logic
Chapter 5 - Fuzzy LogicChapter 5 - Fuzzy Logic
Chapter 5 - Fuzzy Logic
 

Semelhante a Unit I & II in Principles of Soft computing

Neural network
Neural network Neural network
Neural network Faireen
 
Seminar Neuro-computing
Seminar Neuro-computingSeminar Neuro-computing
Seminar Neuro-computingAniket Jadhao
 
Artificial Neural Network Paper Presentation
Artificial Neural Network Paper PresentationArtificial Neural Network Paper Presentation
Artificial Neural Network Paper Presentationguestac67362
 
Artificial neural-network-paper-presentation-100115092527-phpapp02
Artificial neural-network-paper-presentation-100115092527-phpapp02Artificial neural-network-paper-presentation-100115092527-phpapp02
Artificial neural-network-paper-presentation-100115092527-phpapp02anandECE2010
 
Quantum neural network
Quantum neural networkQuantum neural network
Quantum neural networksurat murthy
 
Neural Network
Neural NetworkNeural Network
Neural NetworkSayyed Z
 
Neural Networks-introduction_with_prodecure.pptx
Neural Networks-introduction_with_prodecure.pptxNeural Networks-introduction_with_prodecure.pptx
Neural Networks-introduction_with_prodecure.pptxRatuRumana3
 
Nature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic WebNature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic Webguestecf0af
 
Neural Networks and Elixir
Neural Networks and ElixirNeural Networks and Elixir
Neural Networks and Elixirbgmarx
 

Semelhante a Unit I & II in Principles of Soft computing (20)

neural networks
neural networksneural networks
neural networks
 
Neural network
Neural network Neural network
Neural network
 
Seminar Neuro-computing
Seminar Neuro-computingSeminar Neuro-computing
Seminar Neuro-computing
 
Artificial Neural Network Paper Presentation
Artificial Neural Network Paper PresentationArtificial Neural Network Paper Presentation
Artificial Neural Network Paper Presentation
 
Artificial neural-network-paper-presentation-100115092527-phpapp02
Artificial neural-network-paper-presentation-100115092527-phpapp02Artificial neural-network-paper-presentation-100115092527-phpapp02
Artificial neural-network-paper-presentation-100115092527-phpapp02
 
Quantum neural network
Quantum neural networkQuantum neural network
Quantum neural network
 
Neural network
Neural networkNeural network
Neural network
 
Lesson 37
Lesson 37Lesson 37
Lesson 37
 
AI Lesson 37
AI Lesson 37AI Lesson 37
AI Lesson 37
 
Project Report -Vaibhav
Project Report -VaibhavProject Report -Vaibhav
Project Report -Vaibhav
 
Artificial Neural networks
Artificial Neural networksArtificial Neural networks
Artificial Neural networks
 
Neural Network
Neural NetworkNeural Network
Neural Network
 
Neural network
Neural networkNeural network
Neural network
 
Neural Networks-introduction_with_prodecure.pptx
Neural Networks-introduction_with_prodecure.pptxNeural Networks-introduction_with_prodecure.pptx
Neural Networks-introduction_with_prodecure.pptx
 
Nature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic WebNature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic Web
 
ANN - UNIT 1.pptx
ANN - UNIT 1.pptxANN - UNIT 1.pptx
ANN - UNIT 1.pptx
 
Neural Networks and Elixir
Neural Networks and ElixirNeural Networks and Elixir
Neural Networks and Elixir
 
Neural networks
Neural networksNeural networks
Neural networks
 
Neural networks introduction
Neural networks introductionNeural networks introduction
Neural networks introduction
 
Unit+i
Unit+iUnit+i
Unit+i
 

Último

HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxiammrhaywood
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for BeginnersSabitha Banu
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfTechSoup
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Mark Reed
 
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfphamnguyenenglishnb
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Jisc
 
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYKayeClaireEstoconing
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfMr Bounab Samir
 
How to Add Barcode on PDF Report in Odoo 17
How to Add Barcode on PDF Report in Odoo 17How to Add Barcode on PDF Report in Odoo 17
How to Add Barcode on PDF Report in Odoo 17Celine George
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxAnupkumar Sharma
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxCarlos105
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
ENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomnelietumpap1
 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Celine George
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)lakshayb543
 

Último (20)

HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptxYOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for Beginners
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)
 
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...
 
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
 
How to Add Barcode on PDF Report in Odoo 17
How to Add Barcode on PDF Report in Odoo 17How to Add Barcode on PDF Report in Odoo 17
How to Add Barcode on PDF Report in Odoo 17
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
ENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choom
 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
 
OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 

Unit I & II in Principles of Soft computing

  • 1. -Neural network was inspired by the design and functioning of human brain and components. -Definition: -Information processing model that is inspired by the way biological nervous system (i.e) the brain, process information. -ANN is composed of large number of highly interconnected processing elements(neurons) working in unison to solve problems. -It is configured for special application such as pattern recognition and data classification through a learning process. -85-90% accurate. 1
  • 2. AdvAntAges of neurAl networks • A Neural Network can be an “expert” in analyzing the category of information given to it. • Answers “ what-if” questions • Adaptive learning – Ability to learn how to do tasks based on the data given for training or initial experience. • Self organization – Creates its own organization or representation of information it receives during learning time. • Real time operation – Computations can be carried out in parallel. • Fault tolerance via redundant information coding – Partial destruction of neural network cause degradation of performance. – In some cases, it can be retained even after major network damage. • In future, it can also used to give spoken words as instructions for machine. 2
  • 3. This figure shows the multi disciplinary point of view of Neural Networks 3
  • 4. ApplicAtion scope of neurAl networks  Air traffic control  Animal behavior  Appraisal and valuation of property, etc.,  Betting on horse races, stock markets  Criminal sentencing  Complex physical and chemical process  Data mining, cleaning and validation  Direct mail advertisers  Echo patterns  Economic modeling  Employee hiring  Expert consulatants  Fraud detection  Hand writing and typewriting  Lake water levels  Machinery controls  Medical diagnosis  Music composition  Photos and finger prints  Recipes and chemical formulation  Traffic flows  Weather prediction 4
  • 5. fuzzy logic  Lofti Zadeh, Professor at University of California.  An organized method for dealing with imprecise data  Fuzzy logic includes 0 and 1 as extreme cases of truth (or "the state of matters" or "fact") but also includes the various states of truth in between so that, for example, the result of a comparison between two things could be not "tall" or "short" but ".38 of tallness.“  Allows partial membership  Implemented in small, embedded micro controllers to large , networked, multichannel PC or work station.  Can be implemented in hardware, software or in both.  It mimics how a person would make decisions. 5
  • 6. genetic Algorithm  How genes of parents combine to form those of their children.  Create an initial population of individuals representing possible solutions to solve a problem  Individual characters determine whether they are less or more fit to the population  The more fit members will take high probability.  It is very effective in finding optimal or near optimal solutions.  Generate and test strategy.  Differ from normal optimization and search procedures in:  Work with coding of the parameter set  Work with multiple points  Search via sampling( a blind search)  Search using stochastic opeartors  In business, scientific and engineering circles, etc., 6
  • 7. hybrid system  Three types  Neuro Fuzzy hybrid system  Combination of fuzzy set theory and neural networks  Fuzzy system deal with explicit knowledge that can be explained and understood  Neural network deal with implicit knowledge acquired by learning  Advantages are:  Handle any kind of information  Manage imprecise, partial, vague or imperfect information  Resolve conflicts by collaboration and aggregation.  Self learning, self organizing and self tuning capability  No need for prior knowledge of relationship of data  Mimic human decision making system 7
  • 8. contd..  Neuron genetic hybrid system  Topology optimization  Select a topology for ANN , common one is back propagation  Genetic algorithm training  Learning of ANN is formulated ad weight optimization problem, usually mean squared error as a fitness measure  Control parameter optimization  Learning rate, momentum rate, tolerance level. Etc., are optimized using GA.  Fuzzy genetic hybrid system  Creating the classification rules for a fuzzy system where objects are classified by linguistic terms. 8
  • 9. soft computing  Two major problem solving techniques are:  Hard computing  Deals with precise model where accurate solutions are achieved.  Soft computing  deals with approximate model to give solution for complex problems  Prof. Lotfi Zadeh introduced it.  Ultimate goal-emulate the human mind  It is a combination of GA, Neural Network and FL. 9
  • 10. ArtificiAl neurAl network : An introduction Resembles the characteristic of biological neural network. Nodes – interconnected processing elements (units or neurons) Neuron is connected to other by a connection link. Each connection link is associated with weight which has information about the input signal. ANN processing elements are called as neurons or artificial neurons , since they have the capability to model networks of original neurons as found in brain. Internal state of neuron is called activation or activity level of neuron, which is the function of the inputs the neurons receives. Neuron can send only one signal at a time. 10
  • 11. bAsic operAtion of A neurAl net X1 and X2 – input neurons. Y- output neuron Weighted interconnection links- W1 and W2. Net input calculation is : Output is : Output= function 11
  • 12. contd… The function to be applied over the net input is called activation function. Weight involved in ANN is equal to the slope of linear straight line (y=mx). 12
  • 13. Biological Neural Network  Has three main parts  Soma or cell body-where cell nucleus is located  Dendrites-where the nerve is connected to the cell body  Axon-which carries the impulses of the neuron  Electric impulse is passed between synapse and dendrites.  Synapse- Axon split into strands and strands terminates into small bulb like organs called as synapse.  It is a chemical process which results in increase /decrease in the electric potential inside the body of the receiving cell.  If the electric potential reaches a thresh hold value, receiving cell fires & pulse / action potential of fixed strength and duration is send through the axon to synaptic junction of the cell.  After that, cell has to wait for a period called refractory period. 13
  • 14. coNtd.. In this model net input is calculated by 14
  • 15. Terminology Relation Between Biological And Artificial Neuron Biological Neuron Artificial Neuron Cell Neuron Dendrites Weights or interconnections Soma Net input Axon Output 15
  • 16. BraiN Vs computer Term Brain Computer Speed Execution time is few Execution time is few nano milliseconds seconds Processing Perform massive parallel Perform several parallel operations simultaneously operations simultaneously. It is faster the biological neuron Size and complexity Number of Neuron is 1011 and It depends on the chosen number of interconnections is application and network 1015. designer. So complexity of brain is higher than computer Storage capacity i) Information is stored in i) Stored in continuous interconnections or in synapse memory location. strength. ii) Overloading may destroy ii) New information is stored older locations. without destroying old one. iii) Can be easily retrieved iii) Sometimes fails to recollect information 16
  • 17. coNtd… Tolerance i) Fault tolerant i) No fault tolerance ii) Store and retrieve ii) Information information even corrupted if the interconnections network fails connections iii) Accept disconnected. redundancies iii) No redundancies Control mechanism Depends on active CPU chemicals and neuron Control mechanism is connections are strong or very simple weak 17
• 18. characteristics of aNN:
 A neurally implemented mathematical model.
 A large number of processing elements called neurons exist in it.
 Interconnections with weighted linkage hold informative knowledge.
 Input signals arrive at the processing elements through connections and connecting weights.
 Processing elements can learn, recall and generalize from the given data.
 Computational power is determined by the collective behavior of the neurons.
 ANNs are connectionist models, parallel distributed processing models, self-organizing systems, neuro-computing systems and neuromorphic systems.
• 19. eVolutioN of Neural Networks

 Year | Neural network | Designer | Description
 1943 | McCulloch-Pitts neuron | McCulloch and Pitts | Arrangement of neurons is a combination of logic gates; the unique feature is the threshold
 1949 | Hebb network | Hebb | If two neurons are active, then their connection strength should be increased
 1958, 1959, 1962, 1988 | Perceptron | Frank Rosenblatt, Block, Minsky and Papert | Weights are adjusted to reduce the difference between the net input to the output unit and the desired output
 1960 | Adaline | Widrow and Hoff | Weights are adjusted to reduce the difference between the net input to the output unit and the desired output
• 20. coNtd…

 Year | Neural network | Designer | Description
 1972 | Kohonen self-organizing feature map | Kohonen | Inputs are clustered to obtain a fired output neuron
 1982, 1984, 1985, 1986, 1987 | Hopfield network | John Hopfield and Tank | Based on fixed weights; can act as associative memory nets
 1986 | Back-propagation network | Rumelhart, Hinton and Williams | i) Multilayered. ii) Error propagated backward from the output to the hidden units
• 21. coNtd..

 Year | Neural network | Designer | Description
 1988 | Counter-propagation network | Grossberg | Similar to the Kohonen network
 1987-1990 | Adaptive Resonance Theory (ART) | Carpenter and Grossberg | Designed for binary and analog inputs
 1988 | Radial basis function network | Broomhead and Lowe | Resembles the back-propagation network, but the activation function used is a Gaussian function
 1988 | Neocognitron | Fukushima | For character recognition
• 22. Basic models of aNN
 The models are based on three entities:
   The model’s synaptic interconnections.
   The training or learning rules adopted for updating and adjusting the connection weights.
   Their activation functions.
 The arrangement of neurons to form layers and the connection pattern formed within and between layers is called the network architecture.
 Five types:
   Single-layer feed-forward network
   Multilayer feed-forward network
   Single node with its own feedback
   Single-layer recurrent network
   Multilayer recurrent network
• 23. Single layer Feed-Forward Network
 A layer is formed by taking processing elements and combining them with other processing elements.
 The input and output layers are linked with each other.
 The inputs are connected to the processing nodes with various weights, resulting in a series of outputs, one per node.
• 24. Multilayer feed-forward network
 Formed by the interconnection of several layers.
 The input layer receives input and buffers the input signal.
 The output layer generates the output.
 A layer between the input and output layers is called a hidden layer; hidden layers are internal to the network.
 There may be zero to several hidden layers in a network.
 The more hidden layers, the greater the complexity of the network, but the more accurate the output produced (a forward-pass sketch follows this slide).
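A minimal sketch of one forward pass through a multilayer feed-forward network with a single hidden layer; the layer sizes, example weight values and the binary sigmoid activation are illustrative assumptions.

    import math

    def sigmoid(x):
        # Binary sigmoid activation (range 0 to 1).
        return 1.0 / (1.0 + math.exp(-x))

    def layer_forward(inputs, weights, biases):
        # Each node takes the weighted sum of all inputs plus its bias.
        return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
                for row, b in zip(weights, biases)]

    # 2 inputs -> 3 hidden units -> 1 output unit (weights chosen arbitrarily).
    W_hidden = [[0.5, -0.4], [0.3, 0.8], [-0.6, 0.1]]
    b_hidden = [0.1, -0.2, 0.05]
    W_output = [[1.2, -0.7, 0.4]]
    b_output = [0.0]

    x = [1.0, 0.0]                               # input layer buffers the signal
    h = layer_forward(x, W_hidden, b_hidden)     # hidden layer
    y = layer_forward(h, W_output, b_output)     # output layer
    print(y)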
• 25. Feed back network
 If no neuron in the output layer is an input to a node in the same layer or a preceding layer – feed-forward network.
 If outputs are directed back as inputs to processing elements in the same layer or a preceding layer – feedback network.
 If the outputs are directed back to the inputs of the same layer, it is lateral feedback.
 Recurrent networks are feedback networks with a closed loop.
 Fig 2.8 (A) – a simple recurrent neural network having a single neuron with feedback to itself.
 Fig 2.9 – a single-layer network with feedback, where the output can be directed back to the processing element itself, to other processing elements, or to both.
• 26.
 Maxnet – competitive interconnections having fixed weights.
 On-center-off-surround / lateral inhibition structure – each processing neuron receives two different classes of inputs: “excitatory” input from nearby processing elements and “inhibitory” input from more distantly located processing elements. This type of interconnection is shown below.
• 27.
 A processing element's output can be directed back to the nodes in a preceding layer, forming a multilayer recurrent network.
 A processing element's output can also be directed to the processing element itself or to other processing elements in the same layer.
• 28. learning
 The two broad kinds of learning in ANNs are:
   i) Parameter learning – updates the connecting weights in a neural net.
   ii) Structure learning – focuses on changes in the network structure.
 Apart from these, learning in an ANN is classified into three categories:
   i) Supervised learning
   ii) Unsupervised learning
   iii) Reinforcement learning
• 29. SuperviSed learning
 Learning with the help of a teacher.
 Example: the learning process of a small child.
   The child doesn’t know how to read or write.
   Each and every action of the child is supervised by a teacher.
 In an ANN, each input vector requires a corresponding target vector, which represents the desired output.
 The input vector along with the target vector is called a training pair.
 The input vector results in an actual output vector, which is compared with the desired output vector.
 If there is a difference, an error signal is generated by the network.
 The error signal is used for adjustment of the weights until the actual output matches the desired output (see the sketch after this slide).
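A minimal sketch of this supervised loop; the perceptron-style error-driven update rule and the AND training pairs are illustrative assumptions, not the deck's prescribed algorithm.

    # Training pairs: each input vector with its corresponding target (logical AND).
    pairs = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

    w = [0.0, 0.0]   # weights
    b = 0.0          # bias
    alpha = 0.1      # learning rate

    for epoch in range(20):
        for x, target in pairs:
            net = w[0] * x[0] + w[1] * x[1] + b
            actual = 1 if net >= 0 else 0      # actual output
            error = target - actual            # error signal from the comparison
            if error != 0:                     # adjust weights until outputs match
                w = [wi + alpha * error * xi for wi, xi in zip(w, x)]
                b += alpha * error

    print(w, b)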
• 30. unSuperviSed learning
 Learning is performed without the help of a teacher.
 Example: a tadpole – it learns to swim by itself.
 In an ANN, during the training process the network receives input patterns and organizes them to form clusters.
 From the Fig. it is observed that no feedback is applied from the environment to inform the network what the outputs should be or whether they are correct.
 The network itself discovers patterns, regularities, features or categories from the input data, and relations for the input data over the output.
 Exact clusters are formed by discovering similarities and dissimilarities, so this is called self-organizing (a winner-take-all sketch follows this slide).
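A minimal winner-take-all sketch of unsupervised clustering, in the spirit of the competitive (Kohonen-style) nets listed earlier; the 1-D data, the two cluster units and the update rule are all illustrative assumptions.

    # Input patterns with no targets attached.
    patterns = [0.1, 0.15, 0.2, 0.8, 0.85, 0.9]

    # Two cluster units, each holding one weight (its cluster centre).
    weights = [0.3, 0.6]
    alpha = 0.5

    for _ in range(10):
        for x in patterns:
            # Competition: the unit whose weight is closest to the input wins.
            winner = min(range(len(weights)), key=lambda j: abs(x - weights[j]))
            # Only the winner's weight moves toward the input (self-organization).
            weights[winner] += alpha * (x - weights[winner])

    print(weights)  # centres settle near the two discovered clusters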
• 31. reinForcement learning
 Similar to supervised learning.
 Learning based on critic information is called reinforcement learning, and the feedback sent is called the reinforcement signal.
 The network receives some feedback from the environment, but the feedback is only evaluative.
 The external reinforcement signals are processed in the critic signal generator, and the obtained critic signals are sent to the ANN for proper adjustment of the weights, so as to get better critic feedback in future.
• 32. activation FunctionS
 To make the work more efficient and to obtain an exact output, some force or activation is given; in the same way, an activation function is applied over the net input to calculate the output of an ANN.
 Information processing of a processing element has two major parts: input and output.
 An integration function (f) is associated with the input of the processing element.
 There are several activation functions:
 1. Identity function
   A linear function defined as f(x) = x for all x.
   The output is the same as the input.
 2. Binary step function
   Defined as f(x) = 1 if x ≥ θ, 0 if x < θ, where θ represents the threshold value.
   Used in single-layer nets to convert the net input to an output that is binary (0 or 1).
• 33. contd..
 3. Bipolar step function
   Defined as f(x) = 1 if x ≥ θ, −1 if x < θ, where θ represents the threshold value.
   Used in single-layer nets to convert the net input to an output that is bipolar (+1 or −1).
 4. Sigmoid function
   Used in back-propagation nets. Two types:
   a) Binary sigmoid function
    – Also called the logistic sigmoid function or unipolar sigmoid function.
    – Defined as f(x) = 1 / (1 + e^(−λx)), where λ is the steepness parameter.
    – The derivative of this function is f′(x) = λ f(x)[1 − f(x)].
    – The range of the sigmoid function is 0 to 1.
• 34. contd..
 b) Bipolar sigmoid function
   Defined as f(x) = (1 − e^(−λx)) / (1 + e^(−λx)), where λ is the steepness parameter; the sigmoid range is between −1 and +1.
   The derivative of this function is f′(x) = (λ/2)[1 + f(x)][1 − f(x)].
   It is closely related to the hyperbolic tangent function, which is written as h(x) = (e^x − e^(−x)) / (e^x + e^(−x)).
• 35. contd..
 The derivative of the hyperbolic tangent function is h′(x) = [1 + h(x)][1 − h(x)].
 5. Ramp function
   Defined as f(x) = 1 if x > 1, x if 0 ≤ x ≤ 1, and 0 if x < 0.
 The graphical representation of all these functions is given in the upcoming Figure, and a code sketch of them follows it.
• 36. [Figure: graphical representation of the activation functions listed above]
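The functions on the previous slides translate directly into Python; this sketch is illustrative, with the steepness λ and threshold θ exposed as parameters.

    import math

    def identity(x):
        return x                                   # output equals input

    def binary_step(x, theta=0.0):
        return 1 if x >= theta else 0              # output is 0 or 1

    def bipolar_step(x, theta=0.0):
        return 1 if x >= theta else -1             # output is +1 or -1

    def binary_sigmoid(x, lam=1.0):
        return 1.0 / (1.0 + math.exp(-lam * x))    # range 0 to 1

    def bipolar_sigmoid(x, lam=1.0):
        e = math.exp(-lam * x)
        return (1.0 - e) / (1.0 + e)               # range -1 to +1

    def ramp(x):
        return max(0.0, min(1.0, x))               # 0 below 0, x in [0,1], 1 above 1

    for f in (identity, binary_step, bipolar_step,
              binary_sigmoid, bipolar_sigmoid, ramp):
        print(f.__name__, f(0.5))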
• 37. Important termInologIes
 Weight
   A weight contains information about the input signal; it is used by the net to solve a problem.
   Weights are represented in terms of a matrix, called the connection matrix.
   If a weight matrix W contains all the elements of an ANN, then the set of all W matrices determines the set of all possible information-processing configurations, and the ANN can be realized by finding an appropriate matrix W.
   Weights encode long-term memory (LTM), and the activation states of the network encode short-term memory (STM) in a neural network.
• 38. Contd..
 Bias
   Bias has an impact on calculating the net input.
   The bias is included by adding a component x0 = 1 to the input vector X.
   The net input is then calculated by yin = b + Σi xiwi
   The bias is of two types:
    – Positive bias: increases the net input.
    – Negative bias: decreases the net input.
• 39. Contd..
 Threshold
   A set value based upon which the final output of the network is calculated.
   The calculated net input is compared with the threshold value to get the network output.
   The activation function based on the threshold is defined as f(net) = 1 if net ≥ θ, −1 if net < θ, where θ is the fixed threshold value.
• 40. Contd..
 Learning rate
   Denoted by α.
   Controls the amount of weight adjustment at each step of training, i.e. determines the rate of learning at each step.
   The learning rate ranges from 0 to 1.
 Momentum factor
   Convergence is made faster if a momentum factor is added to the weight update process; this is done in the back-propagation network (a sketch follows this slide).
 Vigilance parameter
   Denoted by ρ.
   Used in the Adaptive Resonance Theory (ART) network to control the degree of similarity.
   Ranges from 0.7 to 1 to perform useful work in controlling the number of clusters.
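A minimal sketch of a weight update with a momentum factor, as used in back-propagation; the single weight, the constants and the per-step gradient values are illustrative assumptions.

    alpha = 0.2   # learning rate: controls the size of each weight adjustment
    mu = 0.9      # momentum factor: speeds up convergence

    w = 0.5           # a single weight, for illustration
    prev_delta = 0.0  # weight change from the previous step

    # Illustrative per-step gradients of the error with respect to w.
    gradients = [0.4, 0.35, 0.3, 0.2, 0.1]

    for g in gradients:
        # Momentum adds a fraction of the previous change to the current one,
        # so successive steps in the same direction accelerate.
        delta = -alpha * g + mu * prev_delta
        w += delta
        prev_delta = delta
        print(round(w, 4))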
• 41. mCCulloCh-pItts neuron
 Introduced in 1943 and usually called the M-P neuron.
 M-P neurons are connected by directed weighted paths.
 The activation of an M-P neuron is binary, i.e. at any time step the neuron may fire or may not fire.
 Weights associated with the communication links may be excitatory (weights are positive) or inhibitory (weights are negative).
 The threshold plays a major role here: there is a fixed threshold for each neuron, and if the net input to the neuron is greater than the threshold, then the neuron fires.
 M-P neurons are widely used in logic functions.
• 42. Contd…
 A simple M-P neuron is shown in the figure.
 Connections are excitatory with weight w (w > 0) or inhibitory with weight −p (p > 0).
 In the Fig., inputs x1 to xn possess excitatory weighted connections, and inputs xn+1 to xn+m possess inhibitory weighted interconnections.
 Since the firing of the neuron is based on the threshold, the activation function is defined as f(yin) = 1 if yin ≥ θ, 0 if yin < θ.
• 43. Contd…
 For inhibition to be absolute, the threshold with the activation function should satisfy the condition: θ > nw − p.
 The output will fire if it receives “k” or more excitatory inputs but no inhibitory inputs, where kw ≥ θ > (k − 1)w.
 The M-P neuron has no particular training algorithm; an analysis is performed to determine the values of the weights and the threshold.
 It is used as a building block where any function or phenomenon is modeled based on a logic function (see the AND sketch after this slide).
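As an illustration of modelling a logic function, a minimal M-P neuron for AND: with both excitatory weights w = 1 and n = k = 2, a threshold of θ = 2 satisfies kw ≥ θ > (k − 1)w, so the values come from analysis rather than training, as the slide notes.

    def mp_neuron(inputs, weights, theta):
        # Net input is the weighted sum; the neuron fires when it reaches theta.
        y_in = sum(w * x for w, x in zip(weights, inputs))
        return 1 if y_in >= theta else 0

    # AND function: both excitatory weights w = 1, threshold theta = 2,
    # chosen by analysis so the neuron fires only when both inputs are on.
    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, "->", mp_neuron((x1, x2), (1, 1), theta=2))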
• 44. lInear separabIlIty
 A concept wherein the separation of the input space into regions is based on whether the network response is positive or negative.
 A decision line is drawn to separate the positive and negative responses.
 The decision line is also called the decision-making line, decision-support line or linear-separability line.
 The net input to the output unit is given as yin = b + Σi xiwi
 The region which is called the decision boundary is determined by the relation b + Σi xiwi = 0
• 45. Contd..
 Consider a network having a positive response in the first quadrant and a negative response in all other quadrants, with either binary or bipolar data.
 A decision line is drawn separating the two regions, as shown in the Fig.
 Using bipolar data representation, missing data can be distinguished from mistaken data; hence bipolar data is better than binary data.
 Missing values are represented by 0, and mistakes by reversing the input values from +1 to −1 or vice versa.
• 46. hebb network
 Donald Hebb stated in 1949 that, in the brain, learning is performed by the change in the synaptic gap:
 “When an axon of cell A is near enough to excite cell B, and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”
 According to the Hebb rule, the weight vector is found to increase proportionately to the product of the input and the learning signal.
 In Hebb learning, two interconnected neurons are ‘on’ simultaneously.
 The weight update in the Hebb rule is given by wi(new) = wi(old) + xi·y.
 The rule is suited more for bipolar data. If binary data is used, the weight update formula cannot distinguish between two conditions:
   A training pair in which an input unit is “on” and the target value is “off”.
   A training pair in which both the input unit and the target value are “off”.
• 47. FlowChart oF traInIng algorIthm
 Steps:
   Step 0: Initialize the weights.
   Step 1: Perform Steps 2–4 for each input training vector and target output pair, s : t.
   Step 2: Set the input activations; the activation function for the input layer is the identity function: xi = si for i = 1 to n.
   Step 3: Set the output activation: y = t.
   Step 4: Perform the weight and bias adjustments: wi(new) = wi(old) + xi·y and b(new) = b(old) + y.
 In Step 4, the weight update formula can be written in vector form as w(new) = w(old) + x·y.
 The change in weight is expressed as Δw = x·y; hence, w(new) = w(old) + Δw.
 The Hebb rule is used for pattern association, pattern categorization, pattern classification and over a range of other areas (a worked sketch follows).
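A minimal sketch of the training algorithm above, applied to the AND function; the choice of bipolar inputs and targets is an illustrative assumption, in line with the previous slide's note that the rule suits bipolar data.

    # AND function with bipolar inputs and targets (s : t pairs).
    pairs = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

    # Step 0: initialize the weights and bias.
    w = [0.0, 0.0]
    b = 0.0

    # Step 1: perform Steps 2-4 for each training pair.
    for s, t in pairs:
        x = list(s)       # Step 2: input activations (identity function)
        y = t             # Step 3: output activation set to the target
        # Step 4: Hebb rule - wi(new) = wi(old) + xi*y, b(new) = b(old) + y.
        w = [wi + xi * y for wi, xi in zip(w, x)]
        b += y

    print(w, b)  # -> [2.0, 2.0] and -2.0, which realizes the AND function

A single pass over the four training pairs yields the weights (2, 2) with bias −2, so the net input 2·x1 + 2·x2 − 2 is positive only when both inputs are +1, exactly the AND response.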