Naïve Bayes Classifier
Ke Chen
http://intranet.cs.man.ac.uk/mlo/comp20411/
Modified and extended by Longin Jan Latecki
latecki@temple.edu
2
Outline
• Background
• Probability Basics
• Probabilistic Classification
• Naïve Bayes
• Example: Play Tennis
• Relevant Issues
• Conclusions
3
Background
• There are three methods to establish a classifier
a) Model a classification rule directly
Examples: k-NN, decision trees, perceptron, SVM
b) Model the probability of class memberships given input data
Example: multi-layered perceptron with the cross-entropy cost
c) Make a probabilistic model of data within each class
Examples: naive Bayes, model based classifiers
• a) and b) are examples of discriminative classification
• c) is an example of generative classification
• b) and c) are both examples of probabilistic classification
4
Probability Basics
• Prior, conditional and joint probability
– Prior probability: $P(X)$
– Conditional probability: $P(X_1 \mid X_2)$, $P(X_2 \mid X_1)$
– Joint probability: $\mathbf{X} = (X_1, X_2)$, $P(\mathbf{X}) = P(X_1, X_2)$
– Relationship: $P(X_1, X_2) = P(X_2 \mid X_1)\,P(X_1) = P(X_1 \mid X_2)\,P(X_2)$
– Independence: $P(X_2 \mid X_1) = P(X_2)$, $P(X_1 \mid X_2) = P(X_1)$, so $P(X_1, X_2) = P(X_1)\,P(X_2)$
• Bayesian Rule
$$P(C \mid \mathbf{X}) = \frac{P(\mathbf{X} \mid C)\,P(C)}{P(\mathbf{X})}, \qquad \text{Posterior} = \frac{\text{Likelihood} \times \text{Prior}}{\text{Evidence}}$$
Example by Dieter Fox
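The numeric example by Dieter Fox is not reproduced here, but a minimal Python sketch of the rule itself, with made-up numbers for the prior and likelihoods (my own, not from the slides), looks like this:

```python
# Bayes' rule: P(C|X) = P(X|C) * P(C) / P(X), where the evidence P(X)
# comes from the law of total probability over both classes.
# All numbers below are hypothetical placeholders.
p_c = 0.3                 # prior P(C)
p_x_given_c = 0.8         # likelihood P(X|C)
p_x_given_not_c = 0.1     # likelihood P(X|not C)

evidence = p_x_given_c * p_c + p_x_given_not_c * (1 - p_c)  # P(X) = 0.31
posterior = p_x_given_c * p_c / evidence                    # P(C|X) ~ 0.774
print(posterior)
```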
8
Probabilistic Classification
• Establishing a probabilistic model for classification
– Discriminative model: $P(C \mid \mathbf{X})$, $C = c_1, \ldots, c_L$, $\mathbf{X} = (X_1, \ldots, X_n)$
– Generative model: $P(\mathbf{X} \mid C)$, $C = c_1, \ldots, c_L$, $\mathbf{X} = (X_1, \ldots, X_n)$
• MAP classification rule
– MAP: Maximum A Posteriori
– Assign $\mathbf{x}$ to $c^*$ if $P(C = c^* \mid \mathbf{X} = \mathbf{x}) > P(C = c \mid \mathbf{X} = \mathbf{x})$, $c \ne c^*$, $c = c_1, \ldots, c_L$
• Generative classification with the MAP rule
– Apply the Bayesian rule to convert the generative model into a posterior:
$$P(C \mid \mathbf{X}) = \frac{P(\mathbf{X} \mid C)\,P(C)}{P(\mathbf{X})} \propto P(\mathbf{X} \mid C)\,P(C)$$
[Figure: feature histograms P(x) for two classes C1 and C2 over attribute x; slide by Stephen Marsland]
[Figure: posterior probability P(C|x), ranging from 0 to 1, as a function of x; slide by Stephen Marsland]
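As an illustration of the MAP rule in code (a sketch with hypothetical likelihoods and priors, not taken from the slides), note that the evidence $P(\mathbf{x})$ is identical for every class and can be dropped from the comparison:

```python
# MAP decision: pick argmax_c P(x|c) * P(c); P(x) cancels across classes.
likelihood = {"c1": 0.05, "c2": 0.20}   # hypothetical P(x|c) values
prior = {"c1": 0.70, "c2": 0.30}        # hypothetical P(c) values

scores = {c: likelihood[c] * prior[c] for c in prior}
c_star = max(scores, key=scores.get)
print(c_star, scores)   # "c2": 0.20 * 0.30 = 0.06 beats 0.05 * 0.70 = 0.035
```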
11
Naïve Bayes
• Bayes classification
$$P(C \mid \mathbf{X}) \propto P(\mathbf{X} \mid C)\,P(C) = P(X_1, \ldots, X_n \mid C)\,P(C)$$
Difficulty: learning the joint probability $P(X_1, \ldots, X_n \mid C)$
• Naïve Bayes classification
– Making the assumption that all input attributes are conditionally independent given the class:
$$P(X_1, X_2, \ldots, X_n \mid C) = P(X_1 \mid X_2, \ldots, X_n; C)\,P(X_2, \ldots, X_n \mid C) = P(X_1 \mid C)\,P(X_2 \mid C) \cdots P(X_n \mid C)$$
– MAP classification rule: assign $\mathbf{x} = (x_1, \ldots, x_n)$ to $c^*$ if
$$[P(x_1 \mid c^*) \cdots P(x_n \mid c^*)]\,P(c^*) > [P(x_1 \mid c) \cdots P(x_n \mid c)]\,P(c), \quad c \ne c^*, \quad c = c_1, \ldots, c_L$$
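To see why learning the full joint is hard while the factored form is easy, a rough parameter count helps (an illustrative calculation, not from the slides): with $n$ binary attributes, specifying $P(X_1, \ldots, X_n \mid C)$ takes $2^n - 1$ free parameters per class, while the naïve factorization needs only the $n$ values $P(X_j = 1 \mid C)$. For $n = 20$ that is $2^{20} - 1 = 1{,}048{,}575$ parameters per class versus $20$.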
12
Naïve Bayes
• Naïve Bayes Algorithm (for discrete input attributes)
– Learning Phase: Given a training set S,
For each target value $c_i$ $(c_i = c_1, \ldots, c_L)$:
$\hat P(C = c_i) \leftarrow$ estimate $P(C = c_i)$ with examples in S
For every attribute value $a_{jk}$ of each attribute $x_j$ $(j = 1, \ldots, n;\ k = 1, \ldots, N_j)$:
$\hat P(X_j = a_{jk} \mid C = c_i) \leftarrow$ estimate $P(X_j = a_{jk} \mid C = c_i)$ with examples in S
Output: conditional probability tables; for $x_j$, $N_j \times L$ elements
– Test Phase: Given an unknown instance $\mathbf{X}' = (a'_1, \ldots, a'_n)$,
Look up tables to assign the label $c^*$ to $\mathbf{X}'$ if
$$[\hat P(a'_1 \mid c^*) \cdots \hat P(a'_n \mid c^*)]\,\hat P(c^*) > [\hat P(a'_1 \mid c) \cdots \hat P(a'_n \mid c)]\,\hat P(c), \quad c \ne c^*, \quad c = c_1, \ldots, c_L$$
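The two phases translate almost line-for-line into code. Below is a minimal sketch (function and variable names are my own, not from the slides) that estimates both tables by frequency counting and classifies with the product rule:

```python
from collections import Counter, defaultdict

def learn(examples):
    """Learning phase: estimate P(C = c_i) and P(X_j = a_jk | C = c_i)
    by relative frequencies. `examples` is a list of (attributes, label)
    pairs, where `attributes` is a tuple of discrete values."""
    n = len(examples)
    class_count = Counter(label for _, label in examples)
    value_count = defaultdict(Counter)  # (j, label) -> counts of x_j values
    for attrs, label in examples:
        for j, a in enumerate(attrs):
            value_count[(j, label)][a] += 1
    prior = {c: class_count[c] / n for c in class_count}
    cond = {(j, c): {a: k / class_count[c] for a, k in ctr.items()}
            for (j, c), ctr in value_count.items()}
    return prior, cond

def classify(attrs, prior, cond):
    """Test phase: return the label maximizing P(c) * prod_j P(a_j | c)."""
    scores = {}
    for c in prior:
        score = prior[c]
        for j, a in enumerate(attrs):
            score *= cond[(j, c)].get(a, 0.0)  # unseen value -> 0 (see slide 16)
        scores[c] = score
    return max(scores, key=scores.get)
```

Running `learn` on the 14 Play Tennis examples would reproduce the tables on the next slide; the zero probability assigned to unseen attribute values is exactly the problem addressed on slide 16.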
13
Example
• Example: Play Tennis
14
Learning Phase
Outlook Play=Yes Play=No
Sunny 2/9 3/5
Overcast 4/9 0/5
Rain 3/9 2/5
Temperature Play=Yes Play=No
Hot 2/9 2/5
Mild 4/9 2/5
Cool 3/9 1/5
Humidity Play=Yes Play=No
High 3/9 4/5
Normal 6/9 1/5
Wind Play=Yes Play=No
Strong 3/9 3/5
Weak 6/9 2/5
P(Play=Yes) = 9/14 P(Play=No) = 5/14
P(Outlook=o|Play=b) P(Temperature=t|Play=b)
P(Humidity=h|Play=b) P(Wind=w|Play=b)
15
Example
• Test Phase
– Given a new instance,
x’=(Outlook=Sunny, Temperature=Cool, Humidity=High, Wind=Strong)
– Look up tables
– MAP rule
P(Outlook=Sunny|Play=No) = 3/5
P(Temperature=Cool|Play=No) = 1/5
P(Humidity=High|Play=No) = 4/5
P(Wind=Strong|Play=No) = 3/5
P(Play=No) = 5/14
P(Outlook=Sunny|Play=Yes) = 2/9
P(Temperature=Cool|Play=Yes) = 3/9
P(Humidity=High|Play=Yes) = 3/9
P(Wind=Strong|Play=Yes) = 3/9
P(Play=Yes) = 9/14
P(Yes|x’): [P(Sunny|Yes)P(Cool|Yes)P(High|Yes)P(Strong|Yes)]P(Play=Yes) = 0.0053
P(No|x’): [P(Sunny|No) P(Cool|No)P(High|No)P(Strong|No)]P(Play=No) = 0.0206
Since P(Yes|x’) < P(No|x’), we label x’ as “No”.
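As a quick check of the arithmetic (a sketch; the fractions are the table entries from the Learning Phase slide):

```python
# Unnormalized posteriors for x' = (Sunny, Cool, High, Strong).
p_yes = (2/9) * (3/9) * (3/9) * (3/9) * (9/14)
p_no = (3/5) * (1/5) * (4/5) * (3/5) * (5/14)
print(round(p_yes, 4), round(p_no, 4))  # 0.0053 0.0206 -> label "No"
```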
16
Relevant Issues
• Violation of Independence Assumption
– For many real world tasks, $P(X_1, \ldots, X_n \mid C) \ne P(X_1 \mid C) \cdots P(X_n \mid C)$
– Nevertheless, naïve Bayes works surprisingly well anyway!
• Zero Conditional Probability Problem
– If no example contains the attribute value $X_j = a_{jk}$, then $\hat P(X_j = a_{jk} \mid C = c_i) = 0$
– In this circumstance, during test $\hat P(x_1 \mid c_i) \cdots \hat P(a_{jk} \mid c_i) \cdots \hat P(x_n \mid c_i) = 0$, so a single zero factor wipes out all other evidence
– For a remedy, conditional probabilities are estimated with Laplace smoothing:
$$\hat P(X_j = a_{jk} \mid C = c_i) = \frac{n_c + m p}{n + m}$$
$n$: number of training examples for which $C = c_i$
$n_c$: number of training examples for which $X_j = a_{jk}$ and $C = c_i$
$p$: prior estimate (usually $p = 1/t$ for $t$ possible values of $X_j$)
$m$: weight given to prior (number of "virtual" examples, $m \ge 1$)
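A minimal sketch of this estimate in code (names are my own):

```python
def laplace_estimate(n_c, n, t, m=1):
    """Smoothed estimate of P(X_j = a_jk | C = c_i).
    n_c: training examples with X_j = a_jk and C = c_i
    n:   training examples with C = c_i
    t:   number of possible values of X_j, so the prior is p = 1/t
    m:   weight given to the prior (number of "virtual" examples)"""
    p = 1.0 / t
    return (n_c + m * p) / (n + m)

# A zero count no longer produces a zero probability:
print(laplace_estimate(0, 9, 3))  # (0 + 1/3) / 10 ~ 0.033
```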
Homework
• Redo the test on Slide 15 using the formula on Slide 16
with m=1.
• Compute P(Play=Yes|x’) and P(Play=No|x’) with m=0
and with m=1 for
x’=(Outlook=Overcast, Temperature=Cool, Humidity=High, Wind=Strong)
Does the result change?
17
18
Relevant Issues
• Continuous-valued Input Attributes
– An attribute can take infinitely many values, so frequency tables no longer apply
– Conditional probability is modeled with the normal distribution:
$$\hat P(X_j \mid C = c_i) = \frac{1}{\sqrt{2\pi}\,\sigma_{ji}} \exp\!\left(-\frac{(X_j - \mu_{ji})^2}{2\sigma_{ji}^2}\right)$$
$\mu_{ji}$: mean (average) of attribute values $X_j$ of examples for which $C = c_i$
$\sigma_{ji}$: standard deviation of attribute values $X_j$ of examples for which $C = c_i$
– Learning Phase: for $\mathbf{X} = (X_1, \ldots, X_n)$, $C = c_1, \ldots, c_L$
Output: $n \times L$ normal distributions and $P(C = c_i)$, $i = 1, \ldots, L$
– Test Phase: Given an unknown instance $\mathbf{X}' = (X'_1, \ldots, X'_n)$,
• Calculate conditional probabilities with all the normal distributions
• Apply the MAP rule to make a decision
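A sketch of both phases for one (attribute, class) pair (hypothetical names and data, not from the slides):

```python
import math

def estimate_params(values):
    """Learning phase: mean and standard deviation of the attribute values
    X_j over the training examples of one class c_i."""
    mu = sum(values) / len(values)
    var = sum((v - mu) ** 2 for v in values) / len(values)
    return mu, math.sqrt(var)

def gaussian_pdf(x, mu, sigma):
    """Normal density used as the estimated conditional P(X_j | C = c_i)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

mu, sigma = estimate_params([6.1, 5.9, 6.4, 6.0])  # made-up training values
print(gaussian_pdf(6.2, mu, sigma))                 # likelihood of a new value
```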
19
Conclusions
• Naïve Bayes is based on the independence assumption
– Training is very easy and fast; it only requires considering each
attribute in each class separately
– Testing is straightforward; it only requires looking up tables or calculating
conditional probabilities with normal distributions
• A popular generative model
– Performance is competitive with most state-of-the-art classifiers
even when the independence assumption is violated
– Many successful applications, e.g., spam mail filtering
– Apart from classification, naïve Bayes can do more…