CHAPTER 00
INTRODUCTION
CSC445: Neural Networks
Prof. Dr. Mostafa Gadal-Haqq M. Mostafa
Computer Science Department
Faculty of Computer & Information Sciences
AIN SHAMS UNIVERSITY
(All figures in this presentation are copyrighted to Pearson Education, Inc.)
ASU-CSC445: Neural Networks Prof. Dr. Mostafa Gadal-Haqq 2
Course Objectives
- Introduce the main concepts and techniques of neural network systems.
- Investigate the principal neural network models and applications.
Textbooks
- Recommended:
  - S. Haykin, Neural Networks and Learning Machines, 3rd ed., Prentice Hall (Pearson), 2009. (Comprehensive and up-to-date)
  - L. Fausett, Fundamentals of Neural Networks: Architectures, Algorithms, and Applications, Prentice Hall, 1995. (Intermediate)
Course Assessment
- Assessment:
  - Homework, Quizzes, Computer Assignments (10 points)
  - Midterm Exam (15 points)
  - Project + Lab Exam (10 points)
  - Final Exam (65 points)
- Programming and homework assignments:
  - Late answers are NOT accepted!
Good Practice
- How to pass this course?
  - Don't accumulate ...
  - Do it yourself ...
  - Ask for help ...
Resources
- Webpages:
  - The Neural Networks FAQs: ftp://ftp.sas.com/pub/neural/FAQ.html
  - The Neural Networks Resources: http://www.neoxi.com/NNR/index.html
Introduction
- What is a Neural Network?
- Benefits of Neural Networks
- The Human Brain
- Models of a Neuron
- Neural Networks and Graphs
- Feedback
- Network Architectures
- Knowledge Representation
- Learning Processes
- Learning Tasks
What is a Neural Network?
- A neural network is a massively parallel distributed processor made up of simple processing units (neurons) that has a natural tendency for storing experiential knowledge and making it available for use.
- It resembles the brain in two respects:
  - Knowledge is acquired by the network from its environment through a learning process.
  - Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.
Benefits of Neural Networks
- Nonlinearity (an NN can be linear or nonlinear)
  - A highly important property, particularly if the underlying physical mechanism responsible for generating the input signal (e.g., a speech signal) is inherently nonlinear.
- Input-Output Mapping
  - The network finds a mapping between an input signal and the output through a learning mechanism that adjusts the weights to minimize the difference between the actual response and the desired one (nonparametric statistical inference).
- Adaptivity
  - A neural network can be designed to change its weights in real time, which enables the system to operate in a nonstationary environment.
Benefits of Neural Networks
- Evidential Response
  - In the context of pattern classification, a network can be designed to provide information not only about which particular pattern to select, but also about the confidence in the decision made.
- Contextual Information
  - Every neuron in the network is potentially affected by the global activity of all other neurons. Consequently, contextual information is dealt with naturally by a neural network.
- Fault Tolerance
  - A neural network is inherently fault tolerant, or capable of robust computation, in the sense that its performance degrades gracefully under adverse operating conditions.
Benefits of Neural Networks
- VLSI Implementation
  - The massively parallel nature of a neural network makes it potentially fast for certain computational tasks, which also makes it well suited for implementation using VLSI technology.
- Uniformity of Analysis and Design
  - Neural networks enjoy universality as information processors:
    - Neurons, in one form or another, represent an ingredient common to all neural networks, which makes it possible to share theories and learning algorithms across different applications.
    - Modular networks can be built through a seamless integration of modules.
- Neurobiological Analogy
  - The design of a neural network is motivated by analogy with the brain (the fastest and most powerful fault-tolerant parallel processor).
Applications of Neural Networks
- Neural networks are tractable and easy to implement, especially in hardware. This makes them attractive for a wide range of applications:
  - Pattern Classification
  - Medical Applications
  - Forecasting
  - Adaptive Filtering
  - Adaptive Control
The Human Nervous System
- The human nervous system may be viewed as a three-stage system:
  - The brain, represented by the neural net, is central to the system. It continually receives information, perceives it, and makes appropriate decisions.
  - The receptors convert stimuli from the human body or the external environment into electrical impulses that convey information to the brain.
  - The effectors convert electrical impulses generated by the brain into distinct responses as system output.
Figure 1: Block diagram representation of the nervous system.
The Human Brain
- There are approximately 10 billion (10^10) neurons in the human cortex, compared with tens of thousands (10^4) of processors in the most powerful parallel computers.
- Each biological neuron is connected to several thousand other neurons.
- The typical operating speed of biological neurons is measured in milliseconds (10^-3 s), while a silicon chip can operate in nanoseconds (10^-9 s). (The smaller number of processing units in computers can be compensated for by speed.)
- The human brain is extremely energy efficient, using approximately 10^-16 joules per operation per second, whereas the best computers today use around 10^-6 joules per operation per second.
The Natural Neural Cell
- The neuron's cell body (soma) processes the incoming activations and converts them into output activations.
- Dendrites are fibers which emanate from the cell body and provide the receptive zones that receive activation from other neurons.
- Axons are fibers acting as transmission lines that send activation to other neurons.
- The junctions that allow signal transmission between axons and dendrites are called synapses. Transmission occurs by diffusion of chemicals called neurotransmitters across the synaptic cleft.
Figure 2: The pyramidal cell.
Organization of the Human Brain
- Neural microcircuits
  - An assembly of synapses organized into patterns of connectivity to produce a functional operation of interest.
- Local circuits
  - Groups of neurons with similar or different properties that perform operations characteristic of a localized region in the brain.
- Interregional circuits
  - Pathways, columns, and topographic maps, which involve multiple regions located in different parts of the brain.
  - Topographic maps are organized to respond to incoming sensory information.
Figure 3: Structural organization of levels in the brain.
Modularity of the Human Brain
Figure 4: Cytoarchitectural map of the cerebral cortex. The different areas are identified by the thickness of their layers and the types of cells within them. Some of the key sensory areas are as follows: Motor cortex: motor strip, area 4; premotor area, area 6; frontal eye fields, area 8. Somatosensory cortex: areas 3, 1, and 2. Visual cortex: areas 17, 18, and 19. Auditory cortex: areas 41 and 42. (From A. Brodal, 1981; with permission of Oxford University Press.)
Brief History of ANN
- 1943: McCulloch and Pitts proposed the McCulloch-Pitts neuron model.
- 1949: Hebb published his book The Organization of Behavior, in which the Hebbian learning rule was proposed.
- 1958: Rosenblatt introduced the simple single-layer networks now called Perceptrons.
- 1969: Minsky and Papert's book Perceptrons demonstrated the limitations of single-layer perceptrons, and almost the whole field went into hibernation.
- 1982: Hopfield published a series of papers on Hopfield networks.
- 1982: Kohonen developed the Self-Organising Maps that now bear his name.
- 1986: The Back-Propagation learning algorithm for Multi-Layer Perceptrons was rediscovered, and the whole field took off again.
- 1990s: The sub-field of Radial Basis Function Networks was developed.
- 2000s: The power of Ensembles of Neural Networks and Support Vector Machines became apparent.
Models of a Neuron
- The artificial neuron is made up of three basic elements:
  - A set of synapses, or connecting links, each of which is characterized by a weight or strength of its own.
  - An adder for summing the input signals, weighted by the respective synaptic weights.
  - An activation function for limiting the amplitude of the output of a neuron (also called a squashing function).
Figure 5: Nonlinear model of a neuron, labeled k.
Models of a Neuron
- In mathematical terms, the output of the summing function is the linear combiner output:
      u_k = Σ_{j=1}^{m} w_kj x_j
- and the final output signal of the neuron is:
      y_k = φ(u_k + b_k)
- where φ(·) is the activation function.
Figure 5: Nonlinear model of a neuron, labeled k.
Effect of a Bias
- The use of a bias b_k has the effect of applying an affine transformation to the linear combiner output u_k:
      v_k = u_k + b_k
- That is, the bias value changes the relation between the induced local field, or activation potential, v_k and the linear combiner output u_k, as shown in Fig. 6.
Figure 6: Affine transformation produced by the presence of a bias; note that v_k = b_k at u_k = 0.
Models of a Neuron
- Treating the bias as the weight of an extra input fixed at +1, we may write:
      v_k = Σ_{j=0}^{m} w_kj x_j
- and
      y_k = φ(v_k)
- where x_0 = +1 and w_k0 = b_k.
Figure 7: Another nonlinear model of a neuron; w_k0 accounts for the bias b_k.
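As a minimal sketch (the function names and example numbers are illustrative, not from the lecture), the two equivalent formulations above can be written directly in Python:

```python
import numpy as np

def neuron_output(x, w, b, phi):
    """y_k = phi(u_k + b_k), where u_k = sum_{j=1..m} w_kj * x_j."""
    u = np.dot(w, x)          # linear combiner output u_k
    return phi(u + b)         # activation applied to the induced local field v_k

def neuron_output_bias_absorbed(x, w, b, phi):
    """The same neuron with the bias folded in: x_0 = +1 and w_k0 = b_k."""
    x_ext = np.concatenate(([1.0], x))   # prepend x_0 = +1
    w_ext = np.concatenate(([b], w))     # prepend w_k0 = b_k
    return phi(np.dot(w_ext, x_ext))

step = lambda v: 1.0 if v >= 0 else 0.0   # threshold activation, as an example

x = np.array([0.5, -1.0, 2.0])
w = np.array([0.4, 0.3, 0.1])
y1 = neuron_output(x, w, b=-0.2, phi=step)
y2 = neuron_output_bias_absorbed(x, w, b=-0.2, phi=step)
# Both formulations produce the same output signal y_k
```

The bias-absorbed form is convenient in practice because the bias is then learned like any other synaptic weight.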
Types of Activation Function
- Threshold function, also called the Heaviside function:
      φ(v) = 1 if v ≥ 0
             0 if v < 0
- A neuron with this activation is the McCulloch-Pitts neuron model.
- That is, the neuron has an output signal only if its activation potential is non-negative, a property known as all-or-none.
Figure 8(a): Threshold function.
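As a quick illustration (the logic-gate example and its weights are mine, not the lecture's), a McCulloch-Pitts neuron with the threshold activation can realize simple Boolean functions:

```python
def heaviside(v):
    """Threshold (Heaviside) activation: 1 if v >= 0, else 0."""
    return 1 if v >= 0 else 0

def mcculloch_pitts(x, w, b):
    """All-or-none neuron: fires iff its activation potential is non-negative."""
    v = sum(wj * xj for wj, xj in zip(w, x)) + b
    return heaviside(v)

# With weights (1, 1) and bias -2 the neuron realizes a logical AND:
# it fires only when both inputs are 1.
outputs = {x: mcculloch_pitts(x, w=(1, 1), b=-2)
           for x in [(0, 0), (0, 1), (1, 0), (1, 1)]}
```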
Types of Activation Function
- Sigmoid function
  - The most commonly used activation function. It is a strictly increasing function that exhibits a graceful balance between linear and nonlinear behavior:
        φ(v) = 1 / (1 + e^(-av))
  - a is the slope parameter.
  - This function is differentiable, which is an important feature in neural network theory.
Figure 8(b): Sigmoid function for varying slope parameter a.
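A small sketch of the sigmoid and its slope parameter (the sample points are illustrative):

```python
import math

def sigmoid(v, a=1.0):
    """phi(v) = 1 / (1 + exp(-a*v)); a is the slope parameter."""
    return 1.0 / (1.0 + math.exp(-a * v))

mid = sigmoid(0.0, a=3.0)      # phi(0) = 0.5 regardless of the slope
steep = sigmoid(1.0, a=5.0)    # larger a -> closer to the threshold function
shallow = sigmoid(1.0, a=0.5)  # smaller a -> more nearly linear around v = 0
```

In the limit a → ∞ the sigmoid approaches the threshold function of Fig. 8(a).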
Types of Activation Function
- Other activation functions:
  - The signum function, which is an odd function of its activation potential v:
        φ(v) = +1 if v > 0
                0 if v = 0
               -1 if v < 0
  - The hyperbolic tangent function, which provides an odd, sigmoid-type function:
        φ(v) = tanh(v)
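The oddness property φ(-v) = -φ(v) shared by both functions can be checked directly (the sample points are illustrative):

```python
import math

def signum(v):
    """Signum activation: +1 for v > 0, 0 for v = 0, -1 for v < 0."""
    return (v > 0) - (v < 0)

# Both activations are odd functions of the activation potential v
samples = (-2.0, -0.5, 0.0, 0.5, 2.0)
odd_signum = all(signum(-v) == -signum(v) for v in samples)
odd_tanh = all(abs(math.tanh(-v) + math.tanh(v)) < 1e-15 for v in samples)
```

tanh can be viewed as a smooth, differentiable counterpart of signum, just as the logistic sigmoid is a smooth counterpart of the threshold function.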
Stochastic Model of a Neuron
- The previous neuron models are deterministic; that is, their output is precisely known for any input signal.
- In some applications it is desirable to have a stochastic neuron model.
- In such a model, the neuron is permitted to reside in only one of two states, say +1 and -1. The decision for the neuron to fire (i.e., switch its state from "off" to "on") is probabilistic.
Stochastic Model of a Neuron
- Let x denote the state of the neuron, and let P(v) denote the probability of firing, where v is the activation potential. Then we may write:
      x = +1 with probability P(v)
          -1 with probability 1 - P(v)
- A standard choice for P(v) is the sigmoid-shaped function:
      P(v) = 1 / (1 + e^(-v/T))
- where T is a parameter used to control the noise level and therefore the uncertainty in firing. As T → 0, the model reduces to the deterministic model.
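A minimal simulation of this model (the parameter values are illustrative):

```python
import math
import random

def firing_probability(v, T):
    """P(v) = 1 / (1 + exp(-v / T)); T controls the noise level."""
    return 1.0 / (1.0 + math.exp(-v / T))

def stochastic_neuron(v, T, rng=random.Random(0)):
    """Return state +1 with probability P(v), otherwise -1."""
    return 1 if rng.random() < firing_probability(v, T) else -1

p_noisy = firing_probability(1.0, T=5.0)    # high T: close to a coin flip
p_cold = firing_probability(1.0, T=0.01)    # T -> 0: effectively deterministic
state = stochastic_neuron(1.0, T=0.5)
```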
Neural Networks Viewed as Directed Graphs
- Figures 5 and 7 give a functional description of a neuron. We can also use a signal-flow graph (a network of directed links interconnected at points called nodes) to describe the network, using these rules:
  - A signal flows along a link only in the direction defined by the arrow on the link.
  - A node signal equals the algebraic sum of all signals entering the node.
  - The signal at a node is transmitted to each outgoing link originating from that node.
Figure 9: Illustrating basic rules for the construction of signal-flow graphs.
Neural Networks Viewed as Directed Graphs
- A signal-flow graph of a neuron:
Figure 10: Signal-flow graph of a neuron.
Neural Networks Viewed as Directed Graphs
- An architectural graph of a neuron:
Figure 11: Architectural graph of a neuron.
Neural Networks Viewed as Directed Graphs
- We thus have three graphical representations of a neural network:
  - Block diagram (functional)
  - Complete directed graph (signal-flow)
  - Partially complete directed graph (architectural)
Feedback
- Feedback is said to exist in a dynamic system whenever the output of a node influences in part the input applied to that particular node. (Feedback is very important in the study of recurrent networks.)
- For the single-loop system of Fig. 12, with operators A and B:
      y_k(n) = A[x_j'(n)]
  and
      x_j'(n) = x_j(n) + B[y_k(n)]
- Eliminating the internal signal x_j'(n) gives the closed-loop operator:
      y_k(n) = (A / (1 - AB)) [x_j(n)]
Figure 12: Signal-flow graph of a single-loop feedback system.
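For scalar gains (a special case of the operators A and B, assumed here purely for illustration), the closed-loop relation can be verified by simply iterating the loop until it settles:

```python
def closed_loop_output(x, A, B, steps=200):
    """Iterate y = A * (x + B*y); converges to (A / (1 - A*B)) * x when |A*B| < 1."""
    y = 0.0
    for _ in range(steps):
        y = A * (x + B * y)   # forward gain A, feedback gain B
    return y

A, B, x = 0.5, 0.8, 1.0
y_iterated = closed_loop_output(x, A, B)
y_closed_form = A / (1 - A * B) * x   # the closed-loop operator applied to x
```

When |AB| ≥ 1 the iteration diverges, which is why stability matters in recurrent networks.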
Network Architectures
- Single-Layer Feedforward Networks
  - Note that we do not count the input layer of source nodes, because no computation is performed there.
Figure 15: Feedforward network with a single layer of neurons.
Network Architectures
- Multilayer Feedforward Networks
  - Input layer (source nodes), one or more hidden layers, and one output layer.
  - A network with 10 source nodes, 4 hidden neurons, and 2 output neurons is referred to as a 10-4-2 network.
  - It can be fully or partially connected.
  - By adding one or more hidden layers, the network is enabled to extract higher-order statistics from its input.
Figure 16: Fully connected feedforward network with one hidden layer and one output layer.
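A forward pass through a fully connected 10-4-2 network can be sketched as follows (the random weights are illustrative stand-ins for trained ones):

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))

def layer(x, W, b):
    """One layer of neurons: apply the activation to W @ x + b."""
    return sigmoid(W @ x + b)

# 10-4-2 network: 10 source nodes, 4 hidden neurons, 2 output neurons
W1, b1 = rng.normal(size=(4, 10)), rng.normal(size=4)   # input -> hidden
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)    # hidden -> output

x = rng.normal(size=10)     # signal at the 10 source nodes (no computation there)
h = layer(x, W1, b1)        # 4 hidden-neuron activations
y = layer(h, W2, b2)        # 2 output-neuron activations
```

Note that, as in the slide, the input layer contributes no weights of its own; only the two computing layers do.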
Network Architectures
- Recurrent Networks
  - A recurrent neural network has at least one feedback loop.
  - Self-feedback refers to a situation where the output of a node is fed back into its own input.
Figure 17: Recurrent network with no self-feedback loops and no hidden neurons.
Network Architectures
- Recurrent Networks
  - A recurrent neural network with self-feedback loops and with hidden neurons.
Figure 18: Recurrent network with hidden neurons.
Knowledge Representation
- Knowledge refers to stored information or models used by a person or machine to interpret, predict, and appropriately respond to the outside world.
- Two characteristics of knowledge representation:
  - What information is actually made explicit?
  - How is the information physically encoded for subsequent use?
Knowledge Representation
- The major task for a neural network is to learn a model of the world (environment) in which it is embedded, and to maintain that model sufficiently consistent with the real world so as to achieve the goals of the application of interest.
- Knowledge of the real world consists of:
  - The known world state, represented by facts (prior information).
  - Observations of the world obtained by sensors (measurements). These observations are used to train the neural network.
Knowledge Representation
- (Commonsense) Rules of Knowledge Representation:
  - Rule 1: Similar inputs from similar classes should usually produce similar representations.
  - Rule 2: Items to be categorized as separate classes should be given widely different representations.
  - Rule 3: If a particular feature is important, then a large number of neurons should be involved in its representation.
  - Rule 4: Prior information and invariances should be built into the design of a NN whenever they are available.
Figure 19: Illustrating the relationship between inner product and Euclidean distance as measures of similarity between patterns.
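The relationship in Figure 19 can be checked numerically: for patterns normalized to unit Euclidean length, the two similarity measures are tied together by the identity ||x - y||^2 = 2(1 - x·y), so a larger inner product (more similar) always means a smaller Euclidean distance. A small sketch, with illustrative vectors:

```python
import numpy as np

def unit(v):
    """Scale a pattern vector to unit Euclidean length."""
    return v / np.linalg.norm(v)

x = unit(np.array([1.0, 2.0, 2.0]))
y = unit(np.array([2.0, 1.0, 2.0]))

squared_distance = float(np.sum((x - y) ** 2))   # Euclidean similarity measure
inner_product = float(np.dot(x, y))              # inner-product similarity measure
# For unit vectors: ||x - y||^2 == 2 * (1 - x.y)
```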
Knowledge Representation
- How to build prior information into a neural network design?
  - Currently, there are no well-defined rules, but rather some ad hoc procedures:
    - Restricting the network architecture, which is achieved through the use of local connections (receptive fields).
    - Constraining the choice of synaptic weights, which is implemented through the use of weight sharing.
  - NNs that make use of these two techniques are called Convolutional Networks.
Figure 20: Illustrating the combined use of a receptive field and weight sharing. All four hidden neurons share exactly the same set of weights for their six synaptic connections.
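The two techniques can be sketched in a few lines (the layer sizes are illustrative, not the ones in Figure 20): each hidden neuron sees only a local receptive field of the input, and all hidden neurons share one weight vector, which is exactly a 1-D convolution.

```python
import numpy as np

def shared_weight_layer(x, w):
    """Local receptive fields + weight sharing: hidden neuron i sees only
    x[i : i+len(w)], and every neuron uses the same weight vector w."""
    k = len(w)
    return np.array([np.dot(w, x[i:i + k]) for i in range(len(x) - k + 1)])

x = np.arange(9, dtype=float)       # 9 source nodes
w = np.array([0.25, 0.5, 0.25])     # one weight vector shared by all hidden neurons
h = shared_weight_layer(x, w)       # 7 hidden activations

# Weight sharing over local fields is a 1-D convolution ("valid" mode);
# np.convolve flips its kernel, so we pre-flip to get the same correlation.
reference = np.convolve(x, w[::-1], mode="valid")
```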
Knowledge Representation
- How to build invariance into a neural network design?
  - Invariance by structure
    - For example, enforcing in-plane rotation invariance by making w_ij = w_ji.
  - Invariance by training
    - Including different examples of each class in the training set.
  - Invariant feature space
    - Transforming the data into an invariant feature space (Fig. 21).
Figure 21: Block diagram of an invariant-feature-space type of system.
Neural Networks vs. Pattern Classifiers
- Pattern classifiers: we must have prior knowledge about the data to be able to model it explicitly:
      Data → (Explicit) Model Building
- Neural networks: the data set speaks for itself; the NN learns the implicit model in the data:
      Training Data → Training
Learning Processes
- Supervised Learning: Learning with a Teacher
Figure 24: Block diagram of learning with a teacher; the part of the figure printed in red constitutes a feedback loop.
Learning Processes
- Unsupervised Learning: Learning without a Teacher
Figure 26: Block diagram of unsupervised learning.
Learning Processes
- Reinforcement Learning: Learning without a Teacher
Figure 25: Block diagram of reinforcement learning; the learning system and the environment are both inside the feedback loop.
Learning Tasks
- Pattern Association
Figure 27: Input-output relation of a pattern associator.
Learning Tasks
- Pattern Recognition
Figure 28: Illustration of the classical approach to pattern classification.
Learning Tasks
- Function Approximation: System Identification
Figure 29: Block diagram of system identification: the neural network, doing the identification, is part of the feedback loop.
Learning Tasks
- Function Approximation: Inverse Modeling
Figure 30: Block diagram of inverse system modeling. The neural network, acting as the inverse model, is part of the feedback loop.
Learning Tasks
- Control
Figure 31: Block diagram of a feedback control system.

Next Time: Rosenblatt's Perceptron
 
Lec 1-2-3-intr.
Lec 1-2-3-intr.Lec 1-2-3-intr.
Lec 1-2-3-intr.
 
Introduction_NNFL_Aug2022.pdf
Introduction_NNFL_Aug2022.pdfIntroduction_NNFL_Aug2022.pdf
Introduction_NNFL_Aug2022.pdf
 
[IJET V2I2P20] Authors: Dr. Sanjeev S Sannakki, Ms.Anjanabhargavi A Kulkarni
[IJET V2I2P20] Authors: Dr. Sanjeev S Sannakki, Ms.Anjanabhargavi A Kulkarni[IJET V2I2P20] Authors: Dr. Sanjeev S Sannakki, Ms.Anjanabhargavi A Kulkarni
[IJET V2I2P20] Authors: Dr. Sanjeev S Sannakki, Ms.Anjanabhargavi A Kulkarni
 
08 neural networks(1).unlocked
08 neural networks(1).unlocked08 neural networks(1).unlocked
08 neural networks(1).unlocked
 
Artificial Neural Network report
Artificial Neural Network reportArtificial Neural Network report
Artificial Neural Network report
 
Neural network
Neural networkNeural network
Neural network
 
Neural networks
Neural networksNeural networks
Neural networks
 
AI Lesson 37
AI Lesson 37AI Lesson 37
AI Lesson 37
 
Lesson 37
Lesson 37Lesson 37
Lesson 37
 
fundamentals-of-neural-networks-laurene-fausett
fundamentals-of-neural-networks-laurene-fausettfundamentals-of-neural-networks-laurene-fausett
fundamentals-of-neural-networks-laurene-fausett
 
ML UNIT2.pptx uyftdhfjkghnkgutdmsedjytkf
ML UNIT2.pptx uyftdhfjkghnkgutdmsedjytkfML UNIT2.pptx uyftdhfjkghnkgutdmsedjytkf
ML UNIT2.pptx uyftdhfjkghnkgutdmsedjytkf
 

Recently uploaded

Spellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseSpellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseAnaAcapella
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfagholdier
 
ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxAreebaZafar22
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxDenish Jangid
 
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...Nguyen Thanh Tu Collection
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxVishalSingh1417
 
Kodo Millet PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...
Kodo Millet  PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...Kodo Millet  PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...
Kodo Millet PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...pradhanghanshyam7136
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxnegromaestrong
 
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...ZurliaSoop
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptxMaritesTamaniVerdade
 
Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsTechSoup
 
Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxVishalSingh1417
 
psychiatric nursing HISTORY COLLECTION .docx
psychiatric  nursing HISTORY  COLLECTION  .docxpsychiatric  nursing HISTORY  COLLECTION  .docx
psychiatric nursing HISTORY COLLECTION .docxPoojaSen20
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfAdmir Softic
 
Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...Association for Project Management
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.pptRamjanShidvankar
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701bronxfugly43
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibitjbellavia9
 
Third Battle of Panipat detailed notes.pptx
Third Battle of Panipat detailed notes.pptxThird Battle of Panipat detailed notes.pptx
Third Battle of Panipat detailed notes.pptxAmita Gupta
 

Recently uploaded (20)

Spellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseSpellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please Practise
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdf
 
Spatium Project Simulation student brief
Spatium Project Simulation student briefSpatium Project Simulation student brief
Spatium Project Simulation student brief
 
ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptx
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptx
 
Kodo Millet PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...
Kodo Millet  PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...Kodo Millet  PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...
Kodo Millet PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptx
 
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
 
Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The Basics
 
Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptx
 
psychiatric nursing HISTORY COLLECTION .docx
psychiatric  nursing HISTORY  COLLECTION  .docxpsychiatric  nursing HISTORY  COLLECTION  .docx
psychiatric nursing HISTORY COLLECTION .docx
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdf
 
Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.ppt
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibit
 
Third Battle of Panipat detailed notes.pptx
Third Battle of Panipat detailed notes.pptxThird Battle of Panipat detailed notes.pptx
Third Battle of Panipat detailed notes.pptx
 

Neural Networks: Introduction

  • 1. CHAPTER 00 INTRODUCTION CSC445: Neural Networks Prof. Dr. Mostafa Gadal-Haqq M. Mostafa Computer Science Department Faculty of Computer & Information Sciences AIN SHAMS UNIVERSITY (All figures in this presentation are copyrighted to Pearson Education, Inc.)
  • 2. ASU-CSC445: Neural Networks Prof. Dr. Mostafa Gadal-Haqq 2 Course Objectives - Introduce the main concepts and techniques of neural network systems. - Investigate the principal neural network models and applications.
  • 3. Textbooks - Recommended - S. Haykin, Neural Networks and Learning Machines, 3rd ed., Prentice Hall (Pearson), 2009. - Comprehensive and up to date. - L. Fausett, Fundamentals of Neural Networks: Architectures, Algorithms, and Applications, Prentice Hall, 1995. - Intermediate.
  • 4. Course Assessment - Assessment - Homework, Quizzes, Computer Assignments (10 points) - Midterm Exam (15 points) - Project + Lab Exam (10 points) - Final Exam (65 points) - Programming and homework assignments - Late answers are NOT accepted!
  • 5. Good Practice - How to pass this course? Don't accumulate … Do it yourself … Ask for help …
  • 6. Resources - Webpages: - The Neural Networks FAQs - ftp://ftp.sas.com/pub/neural/FAQ.html - The Neural Networks Resources - http://www.neoxi.com/NNR/index.html
  • 7. Introduction - What is a Neural Network? - Benefits of Neural Networks - The Human Brain - Models of a Neuron - Neural Networks and Graphs - Feedback - Network Architectures - Knowledge Representation - Learning Processes - Learning Tasks
  • 8. What is a Neural Network? - A neural network is a massively parallel distributed processor made up of simple processing units (neurons) that has a natural tendency for storing experiential knowledge and making it available for use. - It resembles the brain in two respects: - Knowledge is acquired by the network from its environment through the learning process. - Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.
  • 9. Benefits of Neural Networks - Nonlinearity (a NN can be linear or nonlinear) - A highly important property, particularly if the underlying physical mechanism responsible for generating the input signal (e.g., a speech signal) is inherently nonlinear. - Input-Output Mapping - The network finds a mapping between an input signal and the output through a learning mechanism that adjusts the weights to minimize the difference between the actual response and the desired one (nonparametric statistical inference). - Adaptivity - A neural network can be designed to change its weights in real time, which enables the system to operate in a nonstationary environment.
  • 10. Benefits of Neural Networks - Evidential Response - In the context of pattern classification, a network can be designed to provide information not only about which particular pattern to select, but also about the confidence in the decision made. - Contextual Information - Every neuron in the network is potentially affected by the global activity of all other neurons. Consequently, contextual information is dealt with naturally by a neural network. - Fault Tolerance - A neural network is inherently fault tolerant, or capable of robust computation, in the sense that its performance degrades gracefully under adverse operating conditions.
  • 11. Benefits of Neural Networks - VLSI Implementation - The massively parallel nature of a neural network makes it potentially fast for certain computational tasks, and also well suited for implementation in VLSI technology. - Uniformity of Analysis and Design - Neural networks enjoy universality as information processors: - Neurons, in one form or another, represent an ingredient common to all neural networks, which makes it possible to share theories and learning algorithms across different applications. - Modular networks can be built through a seamless integration of modules. - Neurobiological Analogy - The design of a neural network is motivated by analogy with the brain, the fastest and most powerful fault-tolerant parallel processor we know of.
  • 12. Applications of Neural Networks - Neural networks are tractable and easy to implement, especially in hardware, which makes them attractive for a wide range of applications: - Pattern Classification - Medical Applications - Forecasting - Adaptive Filtering - Adaptive Control
  • 13. The Human Nervous System - The human nervous system may be viewed as a three-stage system: - The brain, represented by the neural net, is central to the system. It continually receives information, perceives it, and makes appropriate decisions. - The receptors convert stimuli from the human body or the external environment into electrical impulses that convey information to the brain. - The effectors convert electrical impulses generated by the brain into distinct responses as system outputs. Figure 1: Block diagram representation of the nervous system.
  • 14. The Human Brain - There are approximately 10 billion (10^10) neurons in the human cortex, compared with tens of thousands (10^4) of processors in the most powerful parallel computers. - Each biological neuron is connected to several thousand other neurons. - The typical operating speed of biological neurons is measured in milliseconds (10^-3 s), while a silicon chip can operate in nanoseconds (10^-9 s); the smaller number of processing units in computers is compensated for by speed. - The human brain is extremely energy efficient, using approximately 10^-16 joules per operation per second, whereas the best computers today use around 10^-6 joules per operation per second.
  • 15. The Natural Neural Cell - The neuron's cell body (soma) processes the incoming activations and converts them into output activations. - Dendrites are fibers which emanate from the cell body and provide the receptive zones that receive activation from other neurons. - Axons are fibers acting as transmission lines that send activation to other neurons. - The junctions that allow signal transmission between the axons and dendrites are called synapses. Transmission takes place by diffusion of chemicals called neurotransmitters across the synaptic cleft. Figure 2: The pyramidal cell.
  • 16. Organization of the Human Brain - Neural microcircuits - An assembly of synapses organized into patterns of connectivity to produce a functional operation of interest. - Local circuits - Groups of neurons with similar or different properties that perform operations characteristic of a localized region in the brain. - Interregional circuits - Pathways, columns, and topographic maps, which involve multiple regions located in different parts of the brain. - Topographic maps are organized to respond to incoming sensory information. Figure 3: Structural organization of levels in the brain.
  • 17. Modularity of the Human Brain - Figure 4: Cytoarchitectural map of the cerebral cortex. The different areas are identified by the thickness of their layers and the types of cells within them. Some of the key sensory areas are as follows: motor cortex: motor strip, area 4; premotor area, area 6; frontal eye fields, area 8. Somatosensory cortex: areas 3, 1, and 2. Visual cortex: areas 17, 18, and 19. Auditory cortex: areas 41 and 42. (From A. Brodal, 1981; with permission of Oxford University Press.)
  • 18. Brief History of ANN - 1943: McCulloch and Pitts proposed the McCulloch-Pitts neuron model. - 1949: Hebb published his book The Organization of Behavior, in which the Hebbian learning rule was proposed. - 1958: Rosenblatt introduced the simple single-layer networks now called perceptrons. - 1969: Minsky and Papert's book Perceptrons demonstrated the limitations of single-layer perceptrons, and almost the whole field went into hibernation. - 1982: Hopfield published a series of papers on Hopfield networks. - 1982: Kohonen developed the Self-Organising Maps that now bear his name. - 1986: The back-propagation learning algorithm for multi-layer perceptrons was rediscovered and the whole field took off again. - 1990s: The sub-field of radial basis function networks was developed. - 2000s: The power of ensembles of neural networks and support vector machines became apparent.
  • 19. Models of a Neuron - The artificial neuron is made up of three basic elements: - A set of synapses, or connecting links, each of which is characterized by a weight or strength of its own. - An adder for summing the input signals, weighted by the respective synaptic weights. - An activation function for limiting the amplitude of the output of the neuron (also called a squashing function). Figure 5: Nonlinear model of a neuron, labeled k.
  • 20. Models of a Neuron - In mathematical terms: - The output of the summing function is the linear combiner output: u_k = Σ_{j=1..m} w_kj x_j - and the final output signal of the neuron is: y_k = φ(u_k + b_k) - where φ(·) is the activation function. Figure 5: Nonlinear model of a neuron, labeled k.
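The neuron model of Fig. 5 (linear combiner followed by an activation function) can be sketched directly in code. This is an illustrative toy, not the textbook's own code: the inputs, weights, and bias below are made-up numbers, and φ is taken to be the logistic sigmoid introduced on a later slide.

```python
import math

def neuron_output(x, w, b, phi):
    """Compute y_k = phi(u_k + b_k), where u_k = sum_j w_kj * x_j."""
    u = sum(w_j * x_j for w_j, x_j in zip(w, x))  # linear combiner output u_k
    return phi(u + b)                             # activation applied to u_k + b_k

# Logistic sigmoid as the activation function phi (slope parameter a = 1).
sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))

x = [0.5, -1.0, 2.0]   # input signals x_1..x_m (arbitrary example values)
w = [0.4, 0.3, 0.1]    # synaptic weights w_k1..w_km (arbitrary example values)
b = -0.05              # bias b_k

y = neuron_output(x, w, b, sigmoid)  # a single output signal in (0, 1)
```

Any of the activation functions from the later slides can be passed in place of `sigmoid`, which is the point of keeping φ a parameter.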
  • 21. Effect of a Bias - The use of a bias b_k has the effect of applying an affine transformation to the linear combiner output u_k: v_k = u_k + b_k - That is, the bias value changes the relation between the induced local field (activation potential) v_k and the linear combiner output u_k, as shown in Fig. 6. Figure 6: Affine transformation produced by the presence of a bias; note that v_k = b_k at u_k = 0.
  • 22. Models of a Neuron - Treating the bias as the weight of an extra fixed input, we may write: v_k = Σ_{j=0..m} w_kj x_j and y_k = φ(v_k), where x_0 = +1 and w_k0 = b_k. Figure 7: Another nonlinear model of a neuron; w_k0 accounts for the bias b_k.
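The equivalence of the two formulations (Fig. 5, bias as a separate term, vs. Fig. 7, bias absorbed as weight w_k0 on a fixed input x_0 = +1) can be checked numerically; the weights and inputs below are arbitrary example values.

```python
w = [0.4, 0.3, 0.1]    # w_k1..w_k3 (arbitrary)
x = [0.5, -1.0, 2.0]   # x_1..x_3 (arbitrary)
b = -0.05              # bias b_k

# Formulation of Fig. 5: v_k = sum_{j=1..m} w_kj x_j + b_k
v_separate = sum(wj * xj for wj, xj in zip(w, x)) + b

# Formulation of Fig. 7: prepend x_0 = +1 with weight w_k0 = b_k,
# then v_k = sum_{j=0..m} w_kj x_j with no separate bias term.
w0 = [b] + w
x0 = [1.0] + x
v_absorbed = sum(wj * xj for wj, xj in zip(w0, x0))

assert abs(v_separate - v_absorbed) < 1e-12  # same induced local field v_k
```

This trick is what lets learning algorithms treat the bias as just another weight.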
  • 23. Types of Activation Function - Threshold function, also called the Heaviside function: φ(v) = 1 if v ≥ 0, and 0 if v < 0. - This is the McCulloch and Pitts neuron model: the neuron emits an output signal only if its activation potential is non-negative, a property known as all-or-none. Figure 8(a): Threshold function.
  • 24. Types of Activation Function - Sigmoid function - the most commonly used activation function. It is a strictly increasing function that exhibits a graceful balance between linear and nonlinear behavior: φ(v) = 1 / (1 + e^(-av)) - a is the slope parameter. - This function is differentiable, which is an important feature for neural network theory. Figure 8(b): Sigmoid function for varying slope parameter a.
  • 25. Types of Activation Function - Other activation functions - The signum function, which is an odd function of the activation potential v: φ(v) = 1 if v > 0, 0 if v = 0, and -1 if v < 0. - The hyperbolic tangent function, which allows for an odd sigmoid-type function: φ(v) = tanh(v).
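The four activation functions described above are each one-liners; a minimal sketch (the probe values in the assertions are arbitrary):

```python
import math

def threshold(v):            # Heaviside: the McCulloch-Pitts "all-or-none" rule
    return 1.0 if v >= 0 else 0.0

def sigmoid(v, a=1.0):       # logistic function with slope parameter a
    return 1.0 / (1.0 + math.exp(-a * v))

def signum(v):               # odd function of v: returns -1, 0, or +1
    return (v > 0) - (v < 0)

def tanh_act(v):             # odd sigmoid-type function in (-1, 1)
    return math.tanh(v)

assert threshold(-0.1) == 0.0 and threshold(0.0) == 1.0
assert abs(sigmoid(0.0) - 0.5) < 1e-12     # sigmoid passes through 1/2 at v = 0
assert signum(-3.0) == -1 and signum(0.0) == 0
assert -1.0 < tanh_act(-5.0) < tanh_act(5.0) < 1.0
```

Note how the slope parameter works: as a grows, `sigmoid(v, a)` approaches `threshold(v)`, which is why the threshold model can be seen as a limiting case of the sigmoid model.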
  • 26. Stochastic Model of a Neuron - The previous neuron models are deterministic; that is, the output is precisely known for any input signal. - In some applications it is desirable to have a stochastic neuron model. - Here the neuron is permitted to reside in only one of two states, say +1 and -1, and the decision for the neuron to fire (i.e., switch its state from "off" to "on") is probabilistic.
  • 27. Stochastic Model of a Neuron - Let x denote the state of the neuron and P(v) the probability of firing, where v is the activation potential. Then we may write: x = +1 with probability P(v), and x = -1 with probability 1 - P(v). - A standard choice for P(v) is the sigmoid function: P(v) = 1 / (1 + e^(-v/T)) - where T is a parameter used to control the noise level and therefore the uncertainty in firing. As T → 0, the model reduces to the deterministic model.
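The stochastic firing rule above can be sketched in a few lines; this is an illustrative toy using Python's standard pseudo-random generator (the seed and probe values are arbitrary), and it demonstrates the T → 0 limit: with small T and v > 0 the neuron fires essentially every time.

```python
import math
import random

def stochastic_state(v, T, rng=random):
    """Return +1 with probability P(v) = 1/(1 + exp(-v/T)), else -1."""
    p_fire = 1.0 / (1.0 + math.exp(-v / T))
    return +1 if rng.random() < p_fire else -1

# As T -> 0 the model becomes deterministic: for v > 0 the firing
# probability P(v) approaches 1, so the state is (almost) always +1.
rng = random.Random(0)                     # fixed seed for reproducibility
states = [stochastic_state(0.5, T=0.01, rng=rng) for _ in range(100)]
assert all(s == +1 for s in states)        # P(0.5) at T = 0.01 is ~1
```

With a large T, by contrast, P(v) stays near 1/2 and the state flips back and forth, which is the "noisy" regime the slide describes.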
  • 28. Neural Networks Viewed as Directed Graphs - Figures 5 and 7 are functional descriptions of a neuron. We can also describe the network with a signal-flow graph (a network of directed links interconnected at points called nodes) using these rules: - A signal flows along a link only in the direction defined by the arrow on the link. - A node signal equals the algebraic sum of all signals entering the node. - The signal at a node is transmitted to each outgoing link originating from that node. Figure 9: Illustrating basic rules for the construction of signal-flow graphs.
  • 29. Neural Networks Viewed as Directed Graphs - A signal-flow graph of a neuron: Figure 10: Signal-flow graph of a neuron.
  • 30. Neural Networks Viewed as Directed Graphs - An architectural graph of a neuron: Figure 11: Architectural graph of a neuron.
  • 31. Neural Networks Viewed as Directed Graphs - We thus have three graphical representations of a neural network: - Block diagram (functional) - Complete directed graph (signal-flow) - Partially complete directed graph (architectural)
  • 32. Feedback - Feedback is said to exist in a dynamic system whenever the output of a node influences in part the input to that same node. (Feedback is very important in the study of recurrent networks.) - For the single-loop system of Fig. 12: y_k(n) = A[x_j'(n)] and x_j'(n) = x_j(n) + B[y_k(n)], which together give the closed-loop relation y_k(n) = A/(1 - AB) [x_j(n)]. Figure 12: Signal-flow graph of a single-loop feedback system.
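For scalar gains A and B with |AB| < 1, unrolling the single-loop system of Fig. 12 gives the geometric series A(1 + AB + (AB)² + …) x_j, which sums to A/(1 - AB) x_j. A quick numerical check with made-up gains and input:

```python
A, B = 0.8, 0.5      # scalar gains for the forward and feedback paths (|AB| < 1)
x = 2.0              # external input x_j(n), held constant

# Iterate x_j'(n) = x_j(n) + B*y_k(n) followed by y_k(n) = A*x_j'(n);
# with |AB| < 1 this converges geometrically to a fixed point.
y = 0.0
for _ in range(200):
    y = A * (x + B * y)

closed_form = A / (1.0 - A * B) * x      # y_k = A/(1 - AB) x_j
assert abs(y - closed_form) < 1e-9
```

The |AB| < 1 condition is what makes the loop stable; with |AB| ≥ 1 the iteration diverges, which is the scalar analogue of an unstable recurrent network.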
  • 33. Network Architectures - Single-Layer Feedforward Networks - Note that we do not count the input layer of source nodes, because no computation is performed there. Figure 15: Feedforward network with a single layer of neurons.
  • 34. Network Architectures - Multilayer Feedforward Networks - An input layer (source nodes), one or more hidden layers, and one output layer. - A network with 10 source nodes, 4 hidden neurons, and 2 output neurons is referred to as a 10-4-2 network. - The network may be fully or partially connected. - By adding one or more hidden layers, the network is enabled to extract higher-order statistics from its input. Figure 16: Fully connected feedforward network with one hidden layer and one output layer.
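A forward pass through a fully connected 10-4-2 network, as in Fig. 16, is just the single-neuron computation repeated per layer. This sketch uses untrained random weights (the seed, ranges, and sigmoid activation are illustrative choices, not anything specified by the slides):

```python
import math
import random

def layer_forward(x, weights, biases):
    """One fully connected layer: y_i = phi(sum_j w_ij * x_j + b_i)."""
    phi = lambda v: 1.0 / (1.0 + math.exp(-v))          # sigmoid activation
    return [phi(sum(w * xj for w, xj in zip(row, x)) + b)
            for row, b in zip(weights, biases)]

rng = random.Random(42)
def rand_layer(n_in, n_out):                            # untrained random weights
    return ([[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)],
            [rng.uniform(-1, 1) for _ in range(n_out)])

# A 10-4-2 network: 10 source nodes -> 4 hidden neurons -> 2 output neurons.
W1, b1 = rand_layer(10, 4)
W2, b2 = rand_layer(4, 2)

x = [rng.uniform(0, 1) for _ in range(10)]  # one input pattern (source nodes)
hidden = layer_forward(x, W1, b1)           # 4 hidden-layer activations
output = layer_forward(hidden, W2, b2)      # 2 network outputs
```

The input layer appears only as the list `x`, matching the slide's note that no computation is performed at the source nodes.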
  • 35. Network Architectures - Recurrent Networks - A recurrent neural network has at least one feedback loop. - Self-feedback refers to a situation where the output of a node is fed back into its own input. Figure 17: Recurrent network with no self-feedback loops and no hidden neurons.
  • 36. Network Architectures - Recurrent Networks - A recurrent neural network with self-feedback loops and with hidden neurons. Figure 18: Recurrent network with hidden neurons.
  • 37. Knowledge Representation - Knowledge refers to stored information or models used by a person or machine to interpret, predict, and appropriately respond to the outside world. - Characteristics of knowledge representation: - What information is actually made explicit? - How is the information physically encoded for subsequent use?
  • 38. Knowledge Representation - The major task for a neural network is to learn a model of the world (environment) in which it is embedded, and to maintain that model sufficiently consistent with the real world so as to achieve the specific goals of the application of interest. - Knowledge of the real world consists of: - The known world state, represented by facts (prior information). - Observations of the world obtained by sensors (measurements); these observations are used to train the neural network.
  • 39. Knowledge Representation - (Commonsense) Rules of Knowledge Representation - Rule 1: Similar inputs from similar classes should usually produce similar representations. - Rule 2: Items to be categorized as separate classes should be given widely different representations. - Rule 3: If a particular feature is important, then a large number of neurons should be involved in its representation. - Rule 4: Prior information and invariances should be built into the design of a NN when they are available. Figure 19: Illustrating the relationship between inner product and Euclidean distance as measures of similarity between patterns.
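The relationship illustrated in Figure 19 is easy to verify numerically: for unit-length patterns, ||x − y||² = 2 − 2⟨x, y⟩, so a larger inner product means a smaller Euclidean distance, and the two similarity measures agree. A small check with two made-up patterns:

```python
import math

def inner(x, y):                         # inner product <x, y>
    return sum(a * b for a, b in zip(x, y))

def euclid(x, y):                        # Euclidean distance ||x - y||
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def unit(x):                             # normalize a pattern to unit length
    n = math.sqrt(inner(x, x))
    return [a / n for a in x]

# Two arbitrary patterns, normalized to unit length.
x = unit([1.0, 2.0, 2.0])
y = unit([2.0, 1.0, 2.0])

# For unit vectors: ||x - y||^2 = 2 - 2 <x, y>, so the two similarity
# measures are monotonically related (large inner product <=> small distance).
assert abs(euclid(x, y) ** 2 - (2.0 - 2.0 * inner(x, y))) < 1e-12
```

This is why either measure can back Rule 1: similar patterns have a large inner product and, equivalently, a small Euclidean distance.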
  • 40. Knowledge Representation - How to build prior information into the neural network design? - Currently, there are no well-defined rules, but rather some ad hoc procedures: - Restricting the network architecture, which is achieved through the use of local connections (receptive fields). - Constraining the choice of synaptic weights, which is implemented through the use of weight sharing. - NNs that make use of these two techniques are called convolutional networks. Figure 20: Illustrating the combined use of a receptive field and weight sharing. All four hidden neurons share exactly the same set of weights for their six synaptic connections.
  • 41. Knowledge Representation - How to build invariance into the neural network design? - Invariance by structure - For example, enforcing invariance to in-plane rotation by making w_ij = w_ji. - Invariance by training - Including different transformed examples of each class. - Invariance by an invariant feature space - Transforming the data into an invariant feature space (Fig. 21). Figure 21: Block diagram of an invariant-feature-space type of system.
  • 42. Neural Networks vs. Pattern Classifiers - Pattern classifiers: we must have prior knowledge about the data in order to model it explicitly (data → explicit model building). - Neural networks: the data set speaks for itself; the NN learns the implicit model in the data (training data → training).
  • 43. Learning Processes - Supervised Learning: learning with a teacher. Figure 24: Block diagram of learning with a teacher; the part of the figure printed in red constitutes a feedback loop.
  • 44. Learning Processes - Unsupervised Learning: learning without a teacher. Figure 26: Block diagram of unsupervised learning.
  • 45. Learning Processes - Reinforcement Learning: learning without a teacher. Figure 25: Block diagram of reinforcement learning; the learning system and the environment are both inside the feedback loop.
  • 46. Learning Tasks - Pattern Association. Figure 27: Input-output relation of a pattern associator.
  • 47. Learning Tasks - Pattern Recognition. Figure 28: Illustration of the classical approach to pattern classification.
  • 48. Learning Tasks - Function Approximation: System Identification. Figure 29: Block diagram of system identification: the neural network, doing the identification, is part of the feedback loop.
  • 49. Learning Tasks - Function Approximation: Inverse Modeling. Figure 30: Block diagram of inverse system modeling. The neural network, acting as the inverse model, is part of the feedback loop.
  • 50. Learning Tasks - Control. Figure 31: Block diagram of a feedback control system.