Bayesian Decision Theory
Prof. Dr. Mostafa Gadal-Haqq
Faculty of Computer & Information Sciences
Computer Science Department
AIN SHAMS UNIVERSITY
ASU-CSC446 : Pattern Recognition. Prof. Dr. Mostafa Gadal-Haqq slide - 1
CSC446 : Pattern Recognition
(Pattern Classifications, Ch2: Sec. 2.1 to Sec. 2.3)
2.1 Bayesian Decision Theory
• Bayesian decision theory quantifies the trade-offs among the various
classification decisions using probabilities
and the costs that accompany such decisions.
• It assumes that the decision problem is posed
in probabilistic terms and that all of the
relevant probability values are known.
• Back to the fish sorting machine:
– ω = a random variable (the state of nature) = {ω1, ω2}
• For example: ω1 = sea bass, and ω2 = salmon.
• P(ω1) = the prior (a priori) probability that the
coming fish is a sea bass.
• P(ω2) = the prior (a priori) probability that the
coming fish is a salmon.
– The priors give us the knowledge of how likely
we are to get a salmon or a sea bass before the fish
actually appears.
• Decision rule using priors only:
– To make a decision about the fish that will
appear using only the priors, P(ω1) and P(ω2),
we use the following decision rule, which minimizes
the probability of error:
Decide ω1 if P(ω1) > P(ω2),
and ω2 if P(ω1) < P(ω2).
Probability of error = min[P(ω1), P(ω2)]
• That is:
– If P(ω1) >> P(ω2), we will be right most of the
time when we decide that the fish belongs to ω1.
– If P(ω1) = P(ω2), we have only a fifty-fifty chance
of being right.
– Under these conditions, no other decision rule
can yield a larger probability of being right.
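As a minimal sketch, the priors-only rule is just a comparison of the priors; the numeric values below are the example priors assumed for the fish problem:

```python
# Priors-only decision: pick the class with the larger prior probability.
priors = {"sea bass": 0.67, "salmon": 0.33}

decision = max(priors, key=priors.get)   # decide the most probable class a priori
p_error = min(priors.values())           # P(error) = min[P(w1), P(w2)]

print(decision, p_error)
```

Note that the decision is the same for every incoming fish, since no observation is used yet.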
• Improving the decision using an observation:
• If we know the class-conditional probability,
P(x | ωj), of an observation x, we can
improve our decision.
• For example: x describes
the observed lightness of
the sea bass or salmon.
(Figure: the class-conditional densities P(x | ω1) and P(x | ω2) as functions of x.)
• We can improve our decision by using this
observed feature and the Bayes rule:
– Posterior = (Likelihood x Prior) / Evidence
– That is:
P(ωj | x) = P(x | ωj) P(ωj) / P(x)
– where, for C categories, the evidence is:
P(x) = Σ_{j=1}^{C} P(x | ωj) P(ωj)
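A small Python sketch of the rule above; the likelihoods and priors are hypothetical values for C = 3 categories at some fixed observation x:

```python
def posteriors(likelihoods, priors):
    """Bayes rule: P(w_j|x) = P(x|w_j) P(w_j) / P(x),
    with the evidence P(x) = sum over j of P(x|w_j) P(w_j)."""
    evidence = sum(l * p for l, p in zip(likelihoods, priors))
    return [l * p / evidence for l, p in zip(likelihoods, priors)]

# Hypothetical P(x|w_j) and P(w_j) for three categories
post = posteriors([0.2, 0.5, 0.1], [0.5, 0.3, 0.2])
# The posteriors sum to 1; the most probable class maximizes P(w_j|x).
```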
• The Bayesian decision is based on minimizing the
probability of error; i.e., for a given feature
value x:
Decide ω1 if P(ω1 | x) > P(ω2 | x),
and ω2 if P(ω1 | x) < P(ω2 | x).
• The probability of error for a particular x is:
P(error | x) = min[P(ω1 | x), P(ω2 | x)]
2.1 Bayesian Decision Theory: Numerical Example
Suppose P(ω1) = 2/3 = 0.67 and P(ω2) = 1/3 = 0.33; from the
priors alone we would decide fish ∈ ω1.
If x = 11.5, then reading the densities off the curves gives
P(x | ω1) = 0.15 and P(x | ω2) = 0.36, so
P(x) = 0.15*0.67 + 0.36*0.33 = 0.22
P(ω1 | x) = 0.15*0.67 / 0.22 = 0.46
P(ω2 | x) = 0.36*0.33 / 0.22 = 0.54
and we decide fish(x) ∈ ω2.
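The same computation in Python, reproducing the numbers from the worked example (and also extracting the resulting error probability):

```python
# Worked example: priors 0.67 / 0.33, likelihoods read off at x = 11.5
p_w = [0.67, 0.33]        # P(w1), P(w2)
p_x_w = [0.15, 0.36]      # P(x|w1), P(x|w2)

evidence = sum(l * p for l, p in zip(p_x_w, p_w))       # P(x), about 0.22
post = [l * p / evidence for l, p in zip(p_x_w, p_w)]   # about [0.46, 0.54]

decision = 1 if post[0] > post[1] else 2                # here: decide w2
p_error = min(post)                                     # P(error|x), about 0.46
```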
Computing the posteriors for all values
of x gives the decision regions (rules):
• if x ∈ R1, decide ω1
• if x ∈ R2, decide ω2
(Figure: the posteriors P(ω1 | x) and P(ω2 | x), with the resulting regions R1 and R2.)
Assignment 2.1
• Draw the probability densities and find the
decision regions for the following classes:
Ω = {ω1, ω2},
P(x | ω1) ~ N(20, 4),
P(x | ω2) ~ N(15, 2),
P(ω1) = 1/3, and P(ω2) = 2/3.
– Then classify a sample with feature value x = 17.
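A starting point for the assignment, assuming the second parameter of N(., .) is the variance (an assumed reading; the sketch works the same way if it is the standard deviation):

```python
import math

def normal_pdf(x, mu, var):
    """Gaussian density N(mu, var) evaluated at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# (mean, variance, prior) for each class, from the assignment data
params = {1: (20.0, 4.0, 1 / 3), 2: (15.0, 2.0, 2 / 3)}

def decide(x):
    # Pick the class with the larger P(x|w_j) P(w_j); the evidence P(x)
    # is common to both posteriors, so it can be dropped.
    scores = {j: normal_pdf(x, m, v) * p for j, (m, v, p) in params.items()}
    return max(scores, key=scores.get)

print(decide(17.0))
```

Evaluating `decide` over a grid of x values traces out the decision regions R1 and R2.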
2.2 General Bayesian Decision Theory
• Bayesian decision theory is generalized by
allowing the following:
– Having more than one feature.
– Having more than two states of nature.
– Allowing actions, not only deciding on the
state of nature.
– Introducing a loss function, which is more
general than the probability of error.
• Allowing actions other than classification
primarily allows the possibility of rejection.
• Rejection is refusing to make a decision in close
or doubtful cases.
• The loss function states how costly each
action taken is.
• Suppose we have c states of nature (categories):
Ω = {ω1, ω2, …, ωc},
• a feature vector:
x = {x1, x2, …, xd},
• the possible actions:
A = {α1, α2, …, αa},
• and the loss, λ(αi | ωj), incurred for taking
action αi when the state of nature is ωj.
• The conditional risk, R(αi | x), for selecting action
αi is given by:
R(αi | x) = Σ_{j=1}^{c} λ(αi | ωj) P(ωj | x)
• The overall risk, R, is the sum of the conditional
risks R(αi | x) for i = 1, …, a:
R = Σ_{i=1}^{a} R(αi | x)
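The conditional-risk computation can be sketched as follows; the loss matrix and the posteriors are hypothetical illustrative numbers:

```python
# lambda(a_i | w_j): loss[i][j] is the loss for taking action a_i
# when the true state of nature is w_j (hypothetical values).
loss = [[0.0, 1.0],
        [1.0, 0.0]]
posteriors = [0.46, 0.54]   # P(w1|x), P(w2|x) at some observation x

def conditional_risk(i):
    # R(a_i|x) = sum_j lambda(a_i|w_j) P(w_j|x)
    return sum(loss[i][j] * p for j, p in enumerate(posteriors))

risks = [conditional_risk(i) for i in range(len(loss))]
best = min(range(len(risks)), key=risks.__getitem__)   # minimum-risk action
```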
The Bayesian decision rule becomes: select
the action αi for which the conditional risk,
R(αi | x), is minimum. That is:
Take action αi (i.e., decide ωi)
if R(αi | x) < R(αj | x) for all j ≠ i.
• Minimizing R(αi | x) for every x, over all actions
αi, i = 1, …, a, minimizes the overall risk R.
• The overall risk R is the “expected loss
associated with a given decision rule”.
• The resulting minimum overall risk is called the
Bayes risk, and it defines the best performance
that can be achieved.
• Two-category classification example:
Suppose we have two categories {ω1, ω2} and two
actions {α1, α2}, where
α1: deciding ω1, and α2: deciding ω2,
and for simplicity we write λij = λ(αi | ωj).
The conditional risks for taking α1 and α2 are:
R(α1 | x) = λ11 P(ω1 | x) + λ12 P(ω2 | x)
R(α2 | x) = λ21 P(ω1 | x) + λ22 P(ω2 | x)
There are a variety of ways to express the
minimum-risk rule, each with its own advantages.
1- The fundamental rule is:
decide ω1 (i.e., take α1) if R(α1 | x) < R(α2 | x),
and ω2 (i.e., α2) if R(α1 | x) > R(α2 | x).
2- The rule in terms of the posteriors is:
decide ω1 if (λ21 - λ11) P(ω1 | x) > (λ12 - λ22) P(ω2 | x);
decide ω2 otherwise.
3- The rule in terms of the priors and conditional
densities is:
decide ω1 if (λ21 - λ11) P(x | ω1) P(ω1) > (λ12 - λ22) P(x | ω2) P(ω2);
decide ω2 otherwise.
4- The rule in terms of the likelihood ratio:
decide ω1 if p(x | ω1) / p(x | ω2) > [(λ12 - λ22) / (λ21 - λ11)] · [P(ω2) / P(ω1)];
decide ω2 otherwise.
That is, the Bayes (optimal) decision can be
interpreted as:
“One can take an optimal decision if the
likelihood ratio exceeds a threshold value
that is independent of the observation x.”
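The likelihood-ratio form, sketched with hypothetical losses (zero-one here), priors, and likelihood values:

```python
# lambda_ij = lambda(a_i | w_j); zero-one losses assumed for illustration
l11, l12, l21, l22 = 0.0, 1.0, 1.0, 0.0
P1, P2 = 0.67, 0.33              # priors P(w1), P(w2)
px1, px2 = 0.15, 0.36            # p(x|w1), p(x|w2) at the observed x

threshold = (l12 - l22) / (l21 - l11) * (P2 / P1)   # independent of x
ratio = px1 / px2                                   # the likelihood ratio

decision = 1 if ratio > threshold else 2
```

With zero-one losses the threshold reduces to P(ω2)/P(ω1), so the decision agrees with the posterior comparison in the earlier numerical example.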
• The decision regions depend on the values of the loss
function.
• Let θλ = [(λ12 - λ22) / (λ21 - λ11)] · [P(ω2) / P(ω1)];
then decide ω1 if p(x | ω1) / p(x | ω2) > θλ.
• For different loss functions λ we have:
(a) if λ = [0 1; 1 0], then θa = P(ω2) / P(ω1);
(b) if λ = [0 2; 1 0], then θb = 2 P(ω2) / P(ω1).
2.3 Minimum-Error Rate Classification
• Consider the zero-one (or symmetrical) loss
function:
λ(αi | ωj) = 0 if i = j, and 1 if i ≠ j, for i, j = 1, …, c.
• Therefore, the conditional risk is:
R(αi | x) = Σ_{j=1}^{c} λ(αi | ωj) P(ωj | x) = Σ_{j≠i} P(ωj | x) = 1 - P(ωi | x)
• In other words, for the symmetric loss function, the
conditional risk is the probability of error.
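A quick numerical check of the identity above; the posteriors are hypothetical:

```python
# With zero-one loss, R(a_i|x) = sum over j != i of P(w_j|x) = 1 - P(w_i|x)
posteriors = [0.46, 0.54]

risks_sum = [sum(p for j, p in enumerate(posteriors) if j != i)
             for i in range(len(posteriors))]
risks_one_minus = [1.0 - p for p in posteriors]

# Both forms agree, so minimizing the risk maximizes the posterior.
```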
The Minimax Criterion
• Sometimes we need to design our classifier to
perform well over a range of prior probabilities, or
when we do not know the priors at all.
• A reasonable approach is to design the classifier so
that the worst overall risk for any value of the
priors is as small as possible.
• Minimax criterion:
“minimize the maximum possible overall
risk”
• The overall risk turns out to be linear in P(ωj).
When the constant of proportionality (the
slope) is zero, the risk is independent of the priors;
this condition gives the minimax risk, Rmm.
The Neyman-Pearson Criterion
• The Neyman-Pearson criterion:
“minimize the overall risk subject to a
constraint”, e.g.:
∫ R(αi | x) dx < constant
• Generally, the Neyman-Pearson criterion is satisfied by
adjusting the decision boundaries numerically.
However, for Gaussian and some other
distributions, the solution can be found analytically.
Assignment 2.2
• Computer exercises:
– Find the optimal decision for the following data:
Ω = {ω1, ω2},
p(x | ω1) ~ N(20, 4),
p(x | ω2) ~ N(15, 2),
P(ω1) = 2/3, and P(ω2) = 1/3,
– with the loss function:
λ = [1 1.5; 2 1]
– Then classify the samples: x = 12, 17, 18, and 20.
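One way to set up the exercise; note that the loss matrix λ = [1 1.5; 2 1] is reconstructed from a garbled slide and should be double-checked, and N(μ, v) is read with v as the variance (an assumption):

```python
import math

def normal_pdf(x, mu, var):
    """Gaussian density N(mu, var) evaluated at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

params = [(20.0, 4.0, 2 / 3), (15.0, 2.0, 1 / 3)]  # (mean, var, prior) for w1, w2
loss = [[1.0, 1.5],                                 # lambda(a_i|w_j), reconstructed
        [2.0, 1.0]]

def classify(x):
    joint = [normal_pdf(x, m, v) * p for m, v, p in params]
    post = [j / sum(joint) for j in joint]          # posteriors P(w_i|x)
    risks = [sum(loss[i][j] * post[j] for j in range(2)) for i in range(2)]
    return 1 + risks.index(min(risks))              # minimum-risk decision

print([classify(x) for x in (12, 17, 18, 20)])
```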
Best VIP Call Girls Noida Sector 22 Call Me: 8448380779
 
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptx
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptxBPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptx
BPAC WITH UFSBI GENERAL PRESENTATION 18_05_2017-1.pptx
 
Introduction-to-Machine-Learning (1).pptx
Introduction-to-Machine-Learning (1).pptxIntroduction-to-Machine-Learning (1).pptx
Introduction-to-Machine-Learning (1).pptx
 
Data-Analysis for Chicago Crime Data 2023
Data-Analysis for Chicago Crime Data  2023Data-Analysis for Chicago Crime Data  2023
Data-Analysis for Chicago Crime Data 2023
 
Halmar dropshipping via API with DroFx
Halmar  dropshipping  via API with DroFxHalmar  dropshipping  via API with DroFx
Halmar dropshipping via API with DroFx
 
Edukaciniai dropshipping via API with DroFx
Edukaciniai dropshipping via API with DroFxEdukaciniai dropshipping via API with DroFx
Edukaciniai dropshipping via API with DroFx
 
Vip Model Call Girls (Delhi) Karol Bagh 9711199171✔️Body to body massage wit...
Vip Model  Call Girls (Delhi) Karol Bagh 9711199171✔️Body to body massage wit...Vip Model  Call Girls (Delhi) Karol Bagh 9711199171✔️Body to body massage wit...
Vip Model Call Girls (Delhi) Karol Bagh 9711199171✔️Body to body massage wit...
 
100-Concepts-of-AI by Anupama Kate .pptx
100-Concepts-of-AI by Anupama Kate .pptx100-Concepts-of-AI by Anupama Kate .pptx
100-Concepts-of-AI by Anupama Kate .pptx
 
Log Analysis using OSSEC sasoasasasas.pptx
Log Analysis using OSSEC sasoasasasas.pptxLog Analysis using OSSEC sasoasasasas.pptx
Log Analysis using OSSEC sasoasasasas.pptx
 
CHEAP Call Girls in Saket (-DELHI )🔝 9953056974🔝(=)/CALL GIRLS SERVICE
CHEAP Call Girls in Saket (-DELHI )🔝 9953056974🔝(=)/CALL GIRLS SERVICECHEAP Call Girls in Saket (-DELHI )🔝 9953056974🔝(=)/CALL GIRLS SERVICE
CHEAP Call Girls in Saket (-DELHI )🔝 9953056974🔝(=)/CALL GIRLS SERVICE
 

Bay Area Real Estate Guide

  • 1. Bayesian Decision Theory Prof. Dr. Mostafa Gadal-Haqq Faculty of Computer & Information Sciences Computer Science Department AIN SHAMS UNIVERSITY ASU-CSC446 : Pattern Recognition. Prof. Dr. Mostafa Gadal-Haqq slide - 1 CSC446 : Pattern Recognition (Pattern Classifications, Ch2: Sec. 2.1 to Sec. 2.3)
  • 2. 2.1 Bayesian Decision Theory • Bayesian decision theory is based on quantifying the trade-offs between various classification decisions using probabilities and the costs that accompany such decisions. • It assumes that the decision problem is posed in probabilistic terms and that all of the relevant probability values are known.
  • 3. 2.1 Bayesian Decision Theory • Back to the fish-sorting machine: – ω = a random variable (the state of nature) = {ω1, ω2}. • For example: ω1 = sea bass and ω2 = salmon. • P(ω1) = the prior (a priori) probability that the next fish is sea bass. • P(ω2) = the prior probability that the next fish is salmon. – The priors give us the knowledge of how likely we are to get sea bass or salmon before the fish actually appears.
  • 4. 2.1 Bayesian Decision Theory • Decision rule using priors only: – To decide about the fish that will appear using only the priors P(ω1) and P(ω2), we use the following rule: decide ω1 if P(ω1) > P(ω2), and ω2 if P(ω1) < P(ω2), which minimizes the error: probability of error = min[P(ω1), P(ω2)].
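The priors-only rule above can be written as a minimal sketch (the class names and prior values below are illustrative assumptions, not from the slide):

```python
# Priors-only decision: always pick the class with the larger prior.
priors = {"sea_bass": 2/3, "salmon": 1/3}   # assumed P(w1), P(w2)

def decide_by_prior(priors):
    best = max(priors, key=priors.get)      # decide w1 if P(w1) > P(w2)
    p_error = min(priors.values())          # P(error) = min[P(w1), P(w2)]
    return best, p_error

label, p_err = decide_by_prior(priors)
print(label, round(p_err, 3))   # sea_bass 0.333
```

Note that every fish receives the same label, which is exactly why an observed feature is needed to do better.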
  • 5. 2.1 Bayesian Decision Theory • That is: – If P(ω1) >> P(ω2), we will be right most of the time when we decide that the fish belongs to ω1. – If P(ω1) = P(ω2), we have only a fifty-fifty chance of being right. – Under these conditions, no other decision rule can yield a larger probability of being right.
  • 6. 2.1 Bayesian Decision Theory • Improving the decision using an observation: • If we know the class-conditional probability density p(x | ωj) of an observation x, we can improve our decision. • For example: x describes the observed lightness of the sea bass or salmon. [Figure: class-conditional densities p(x | ω1) and p(x | ω2) versus lightness x.]
  • 7. 2.1 Bayesian Decision Theory • We can improve our decision by using this observed feature and the Bayes rule: P(ωj | x) = p(x | ωj) P(ωj) / p(x) – i.e., Posterior = (Likelihood × Prior) / Evidence – where, for C categories, the evidence is p(x) = Σj=1..C p(x | ωj) P(ωj).
  • 8. 2.1 Bayesian Decision Theory • The Bayesian decision rule is based on minimizing the probability of error; i.e., for a given feature value x: decide ω1 if P(ω1 | x) > P(ω2 | x), and ω2 if P(ω1 | x) < P(ω2 | x). • The probability of error for a particular x is: P(error | x) = min[P(ω1 | x), P(ω2 | x)].
  • 9. 2.1 Bayesian Decision Theory: Numerical Example Suppose P(ω1) = 2/3 ≈ 0.67 and P(ω2) = 1/3 ≈ 0.33; using the priors alone we would decide ω1. If we observe x = 11.5 with p(x | ω1) = 0.15 and p(x | ω2) = 0.36, then p(x) = 0.15 × 0.67 + 0.36 × 0.33 ≈ 0.22, so P(ω1 | x) = 0.15 × 0.67 / 0.22 ≈ 0.46 and P(ω2 | x) = 0.36 × 0.33 / 0.22 ≈ 0.54, and we decide ω2.
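The example's arithmetic can be checked in a few lines (values taken from the slide; slight differences from the slide's 0.46/0.54 come from using the exact priors rather than the rounded 0.67/0.33):

```python
# Bayes' rule on the slide's numbers: priors 2/3 and 1/3, likelihoods at x = 11.5.
priors = [2/3, 1/3]            # P(w1), P(w2)
likelihoods = [0.15, 0.36]     # p(x|w1), p(x|w2)

evidence = sum(l * p for l, p in zip(likelihoods, priors))            # p(x)
posteriors = [l * p / evidence for l, p in zip(likelihoods, priors)]  # P(wj|x)

decision = max(range(2), key=posteriors.__getitem__)
print(round(evidence, 2), [round(q, 2) for q in posteriors], decision)
# -> 0.22 [0.45, 0.55] 1, i.e. decide w2 despite the larger prior on w1
```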
  • 10. 2.1 Bayesian Decision Theory Computing the posteriors for all values of x gives the decision regions (rule): • if x ∈ R1, decide ω1 • if x ∈ R2, decide ω2.
  • 11. Assignment 2.1 • Draw the probability densities and find the decision regions for the following classes: Ω = {ω1, ω2}, p(x | ω1) ~ N(20, 4), p(x | ω2) ~ N(15, 2), P(ω1) = 1/3, and P(ω2) = 2/3. – Then classify a sample with feature value x = 17.
  • 12. 2.2 General Bayesian Decision Theory • Bayesian decision theory is generalized by allowing the following: – Having more than one feature. – Having more than two states of nature. – Allowing actions, and not only deciding on the state of nature. – Introducing a loss function, which is more general than the probability of error.
  • 13. 2.2 General Bayesian Decision Theory • Allowing actions other than classification primarily allows the possibility of rejection. • Rejection is refusing to make a decision in close or bad cases. • The loss function states how costly each action is.
  • 14. 2.2 General Bayesian Decision Theory • Suppose we have c states of nature (categories) Ω = {ω1, ω2, …, ωc}, • a feature vector x = (x1, x2, …, xd), • the possible actions A = {α1, α2, …, αa}, • and the loss λ(αi | ωj) incurred for taking action αi when the state of nature is ωj.
  • 15. 2.2 General Bayesian Decision Theory • The conditional risk R(αi | x) for selecting the action αi is given by: R(αi | x) = Σj=1..c λ(αi | ωj) P(ωj | x). • The overall risk R is the expected loss over all observations: R = ∫ R(α(x) | x) p(x) dx, where α(x) denotes the action taken when x is observed.
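The conditional-risk sum can be sketched directly; the 2×2 loss matrix below is an invented example, not taken from the slides:

```python
# R(a_i | x) = sum_j lam[i][j] * P(w_j | x), for an assumed loss matrix lam.
lam = [[0.0, 1.0],    # lam[0][j]: losses for action a1 (decide w1)
       [2.0, 0.0]]    # lam[1][j]: losses for action a2 (decide w2)
post = [0.46, 0.54]   # posteriors P(w1|x), P(w2|x) from the earlier example

def conditional_risk(i):
    return sum(lam[i][j] * post[j] for j in range(len(post)))

risks = [conditional_risk(i) for i in range(len(lam))]
best = min(range(len(risks)), key=risks.__getitem__)
print(risks, best)    # [0.54, 0.92] 0 -> take a1, i.e. decide w1
```

Even though P(ω2 | x) is larger, the heavier penalty λ21 = 2 flips the decision to ω1; this is how the loss function generalizes minimum-error classification.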
  • 16. 2.2 General Bayesian Decision Theory The Bayesian decision rule becomes: select the action αi for which the conditional risk R(αi | x) is minimum. That is: take action αi (i.e., decide ωi) if R(αi | x) < R(αj | x) for all j ≠ i.
  • 17. 2.2 General Bayesian Decision Theory • Minimizing R(αi | x) for every x minimizes the overall risk R. • The overall risk R is the “expected loss associated with a given decision rule.” • The resulting minimum overall risk is called the Bayes risk, which defines the best performance that can be achieved.
  • 18. 2.2 General Bayesian Decision Theory • Two-category classification example: suppose we have two categories {ω1, ω2} and two actions {α1, α2}, where α1 means deciding ω1 and α2 means deciding ω2, and for simplicity we write λij = λ(αi | ωj). The conditional risks for taking α1 and α2 are: R(α1 | x) = λ11 P(ω1 | x) + λ12 P(ω2 | x), and R(α2 | x) = λ21 P(ω1 | x) + λ22 P(ω2 | x).
  • 19. 2.2 General Bayesian Decision Theory There are a variety of ways to express the minimum-risk rule, each with its own advantages. 1- The fundamental rule is: decide ω1 (i.e., take α1) if R(α1 | x) < R(α2 | x), and ω2 (i.e., take α2) if R(α1 | x) > R(α2 | x).
  • 20. 2.2 General Bayesian Decision Theory 2- The rule in terms of the posteriors: decide ω1 if (λ21 − λ11) P(ω1 | x) > (λ12 − λ22) P(ω2 | x); decide ω2 otherwise. 3- The rule in terms of the priors and conditional densities: decide ω1 if (λ21 − λ11) p(x | ω1) P(ω1) > (λ12 − λ22) p(x | ω2) P(ω2); decide ω2 otherwise.
  • 21. 2.2 General Bayesian Decision Theory 4- The rule in terms of the likelihood ratio: decide ω1 if p(x | ω1) / p(x | ω2) > [(λ12 − λ22) / (λ21 − λ11)] · [P(ω2) / P(ω1)]; decide ω2 otherwise. That is, the Bayes (optimal) decision can be interpreted as: “one can take an optimal decision if the likelihood ratio exceeds a threshold value that is independent of the observation x.”
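Rule 4 can be sketched as a one-function likelihood-ratio test (the default loss values below are the zero-one losses, an assumption chosen for illustration):

```python
# Likelihood-ratio test: decide w1 when p(x|w1)/p(x|w2) exceeds a fixed threshold.
def decide(px_w1, px_w2, p1, p2, lam11=0.0, lam12=1.0, lam21=1.0, lam22=0.0):
    threshold = (lam12 - lam22) / (lam21 - lam11) * (p2 / p1)  # independent of x
    return "w1" if px_w1 / px_w2 > threshold else "w2"

# With zero-one loss the threshold reduces to P(w2)/P(w1):
result = decide(0.15, 0.36, 2/3, 1/3)
print(result)   # ratio 0.417 < threshold 0.5 -> w2
```

Swapping the likelihoods (`decide(0.36, 0.15, 2/3, 1/3)`) gives ratio 2.4 > 0.5 and the decision flips to ω1, matching the posterior-based rule from slide 9.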
  • 22. 2.2 General Bayesian Decision Theory • The decision regions depend on the values of the loss function. • Let θλ = [(λ12 − λ22) / (λ21 − λ11)] · P(ω2) / P(ω1); then decide ω1 if p(x | ω1) / p(x | ω2) > θλ. • For different loss functions λ we get different thresholds: if λ = (0 1; 1 0), then θa = P(ω2) / P(ω1); if λ = (0 2; 1 0), then θb = 2 P(ω2) / P(ω1).
  • 23. 2.2 General Bayesian Decision Theory
  • 24. 2.3 Minimum-Error-Rate Classification • Consider the zero-one (symmetric) loss function: λ(αi | ωj) = 0 if i = j, and 1 if i ≠ j, for i, j = 1, …, c. • The conditional risk is then: R(αi | x) = Σj≠i P(ωj | x) = 1 − P(ωi | x). • In other words, for the symmetric loss function the conditional risk is the probability of error.
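Under zero-one loss the risk computation collapses to one line, which is why minimum risk equals maximum posterior:

```python
# With lam(a_i|w_j) = 0 for i == j and 1 otherwise, R(a_i|x) = 1 - P(w_i|x).
post = [0.46, 0.54]                 # example posteriors P(w1|x), P(w2|x)

risks = [1.0 - p for p in post]     # zero-one conditional risks
decision = min(range(len(post)), key=risks.__getitem__)
print([round(r, 2) for r in risks], decision)   # [0.54, 0.46] 1
```

Minimizing 1 − P(ωi | x) and maximizing P(ωi | x) pick the same class, so the minimum-risk rule reduces to the familiar maximum-a-posteriori rule.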
  • 25. The Minimax Criterion • Sometimes we need to design our classifier to perform well over a range of prior probabilities, or when we do not know the priors. • A reasonable approach is to design the classifier so that the worst overall risk for any value of the priors is as small as possible. • Minimax criterion: “minimize the maximum possible overall risk.”
  • 26. The Minimax Criterion • For fixed decision regions, the overall risk is linear in P(ω1). When the constant of proportionality (the slope) is zero, the risk is independent of the priors. This condition gives the minimax risk Rmm as: Rmm = λ22 + (λ12 − λ22) ∫R1 p(x | ω2) dx = λ11 + (λ21 − λ11) ∫R2 p(x | ω1) dx.
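A numerical illustration of the minimax condition: with zero-one loss (λ11 = λ22 = 0, λ12 = λ21 = 1) the condition above reduces to equal conditional errors, ∫R1 p(x | ω2) dx = ∫R2 p(x | ω1) dx. The Gaussian means and standard deviations below are assumed values for the sketch:

```python
import math

def norm_cdf(x, mu, sigma):
    """CDF of a Gaussian N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Assumed class-conditional densities (illustrative values only):
mu1, s1 = 20.0, 2.0    # p(x | w1)
mu2, s2 = 15.0, 1.5    # p(x | w2)

# Zero-one-loss minimax boundary t satisfies P(x < t | w1) == P(x > t | w2),
# so the overall risk no longer depends on the priors.  Since the first error
# grows and the second shrinks as t increases, bisection between the means works.
lo, hi = mu2, mu1
for _ in range(80):
    t = 0.5 * (lo + hi)
    err1 = norm_cdf(t, mu1, s1)          # w1 mass falling in R2 = {x < t}
    err2 = 1.0 - norm_cdf(t, mu2, s2)    # w2 mass falling in R1 = {x > t}
    if err1 > err2:
        hi = t
    else:
        lo = t

print(round(t, 3), round(err1, 4))   # boundary ~17.143, equalized error ~0.0766
```

Because the two conditional errors are equal at this boundary, the risk R = P(ω1)·err1 + (1 − P(ω1))·err2 is the same for every prior, which is exactly the flat-slope condition the slide describes.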
  • 27. The Minimax Criterion
  • 28. The Neyman-Pearson Criterion • The Neyman-Pearson criterion: “minimize the overall risk subject to a constraint,” e.g., ∫ R(αi | x) dx < constant. • Generally the Neyman-Pearson criterion is satisfied by adjusting the decision boundaries numerically; however, for Gaussian and some other distributions, its solution can be found analytically.
  • 29. Assignment 2.2 • Computer exercises: – Find the optimal decision for the following data: Ω = {ω1, ω2}, p(x | ω1) ~ N(20, 4), p(x | ω2) ~ N(15, 2), P(ω1) = 2/3, and P(ω2) = 1/3, – with the loss function λ = (λ11 = 1, λ12 = 2; λ21 = 0.5, λ22 = 1). – Then classify the samples x = 12, 17, 18, and 20.