Implementing Minimum Error Rate Classifier
Dipesh Shome
Department of Computer Science and Engineering, AUST
Ahsanullah University of Science and Technology
Dhaka, Bangladesh
160204045@aust.edu
Abstract—In this experiment, I implemented a minimum error rate classifier based on posterior probabilities, using normal distributions to compute the likelihoods and classify the given sample points. The objective of the minimum error rate classifier is to minimize the probability of error during classification. The implementation follows three steps: compute the likelihood of each sample using the normal distribution, apply the decision rule to assign a class label, and draw the decision boundary. This classifier is also known as the Bayes classifier with minimum error rate.
Index Terms—minimum error rate classifier, Bayesian classifier, posterior probabilities, normal distribution, decision boundary.
I. INTRODUCTION
The minimum error rate classifier seeks a decision rule that minimizes the probability of error, i.e., the error rate. It assigns each sample to the class with the largest posterior probability. In this experiment, a set of test sample points has been given to classify. The likelihood of a sample is given by a normal distribution, which is defined by two parameters, the mean and the covariance (sigma), both of which are given. The detailed work procedure is explained in the methodology section.
Since the Bayesian classifier works with posterior probabilities, the decision rule is as follows:
if P(w1|x) > P(w2|x) then x ∈ w1
if P(w1|x) < P(w2|x) then x ∈ w2
The posterior probabilities can be calculated from the likelihoods and the priors; the evidence P(x) is the same for every class, so it can be dropped when comparing posteriors:
P(wi|x) ∝ P(x|wi)P(wi)
ln P(wi|x) = ln P(x|wi) + ln P(wi) + const.
Since our data points are 2D, a multivariate normal distribution must be used for the likelihood. Taking the logarithm of the (unnormalized) posterior gives the discriminant function
gi(x) = ln P(x|wi) + ln P(wi)
      = −(D/2) ln 2π − (1/2) ln|Σi| − (1/2)(x − µi)ᵀ Σi⁻¹ (x − µi) + ln P(wi),
where D = 2, and µi and Σi are the mean vector and covariance matrix of class wi. When the covariance matrices are equal, gi(x) reduces to a linear function gi(x) = wᵀx + w0; here Σ1 ≠ Σ2, so the resulting decision boundary is quadratic.
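The discriminant above maps directly to a few lines of NumPy. The following is only a minimal sketch, not the full implementation of Section V; the function name and arguments are illustrative.

import numpy as np

def discriminant(x, mu, sigma, prior):
    """g_i(x) = ln p(x | w_i) + ln P(w_i) for a D-dimensional normal likelihood."""
    D = mu.shape[0]
    d = x - mu
    return (-0.5 * D * np.log(2 * np.pi)
            - 0.5 * np.log(np.linalg.det(sigma))
            - 0.5 * d @ np.linalg.inv(sigma) @ d
            + np.log(prior))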
II. EXPERIMENTAL DESIGN / METHODOLOGY
A. Description of the different tasks:
A file named test.txt containing a set of 2D sample points has been given.
Task 1: Classify the sample points from “test.txt”.
Task 2: Classified samples should have different colored
markers according to the assigned class label.
Task 3: Draw a figure that includes these points and the corresponding probability distribution functions along with their contours.
Task 4: Draw decision boundary.
Given Normal Distribution Formula:
Nk(xi | µk, Σk) = (1 / √((2π)^D |Σk|)) exp(−(1/2)(xi − µk)ᵀ Σk⁻¹ (xi − µk))
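For cross-checking a hand-written density, the same value can be computed with scipy.stats.multivariate_normal. This is only an optional verification sketch and is not part of the code in Section V; the parameters below are the ones given later in the listing, and the test point x is arbitrary.

import numpy as np
from scipy.stats import multivariate_normal as mvn

mu1 = np.array([0.0, 0.0])
sigma1 = np.array([[0.25, 0.3], [0.3, 1.0]])

x = np.array([1.0, 1.0])                 # arbitrary test point
print(mvn(mean=mu1, cov=sigma1).pdf(x))  # likelihood p(x | w1)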
B. Implementation:
1) Plotting of classified sample data with different markers: Using the given normal distribution formula, I calculated the value of gi(x) for each data point under the two given normal distributions and checked the condition g1(x) > g2(x). If the condition is true, the sample point x belongs to the region of the first distribution; otherwise it belongs to the second.
A value of g1(x) greater than g2(x) means the likelihood, and hence the posterior, of the sample point is higher under the first normal distribution, so the point is assigned to that region and plotted with the corresponding class color. The output is shown in Fig. 1.
Fig. 1. Classified Sample Point Plotting
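A compact, self-contained sketch of this step reuses the discriminant function from the introduction. The sample points below are placeholders, since test.txt is not reproduced here; the distribution parameters are the ones given in Section V.

import numpy as np

# Given parameters (taken from the listing in Section V).
mu1, sigma1, pw1 = np.array([0, 0]), np.array([[0.25, 0.3], [0.3, 1.0]]), 0.5
mu2, sigma2, pw2 = np.array([2, 2]), np.array([[0.5, 0.0], [0.0, 0.5]]), 0.5

def discriminant(x, mu, sigma, prior):
    D = mu.shape[0]
    d = x - mu
    return (-0.5 * D * np.log(2 * np.pi) - 0.5 * np.log(np.linalg.det(sigma))
            - 0.5 * d @ np.linalg.inv(sigma) @ d + np.log(prior))

samples = np.array([[1.0, 1.0], [3.0, 2.5]])   # placeholder points, not the real test.txt data
labels = [1 if discriminant(x, mu1, sigma1, pw1) > discriminant(x, mu2, sigma2, pw2) else 2
          for x in samples]
print(labels)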
2) Decision boundary drawing: To draw the decision boundary we need its equation, which comes from g1(x) − g2(x) = 0. Using the defined multivariate normal function, the difference of the two densities is evaluated on a grid and its zero level set gives the decision boundary, which is then plotted. For better visualization, the 3D graph is rotated (Fig. 2). The surface plots and contour plots of both multivariate normal distributions are drawn in the same figure.
Fig. 2. Decision Boundary
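An alternative, purely 2-D way to visualize the same boundary is to plot only the zero level set of Z1 − Z2 with a single contour call. This sketch assumes the grid and parameters of Section V and uses scipy only for brevity; it is not the plotting code used for Fig. 2.

import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import multivariate_normal as mvn

# Parameters and grid as in Section V (repeated here so the sketch is self-contained).
mu1, sigma1 = np.array([0, 0]), np.array([[0.25, 0.3], [0.3, 1.0]])
mu2, sigma2 = np.array([2, 2]), np.array([[0.5, 0.0], [0.0, 0.5]])
X, Y = np.meshgrid(np.linspace(-6, 6, 200), np.linspace(-6, 6, 200))
pos = np.dstack((X, Y))

Z1 = mvn(mean=mu1, cov=sigma1).pdf(pos)
Z2 = mvn(mean=mu2, cov=sigma2).pdf(pos)

plt.contour(X, Y, Z1, alpha=0.5)
plt.contour(X, Y, Z2, alpha=0.5)
plt.contour(X, Y, Z1 - Z2, levels=[0], colors='k')  # boundary: equal posteriors (equal priors)
plt.gca().set_aspect('equal')
plt.show()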
III. RESULT ANALYSIS
Using the given normal distribution formula, I classified the sample points from test.txt and found that three data points were assigned to class 1 and three data points to class 2. The 3D graph also shows that each distribution is parameterized by its mean (µ) and covariance (Σ). The upper plots are the surface plots and the lower plots are the contour plots together with the decision boundary.
IV. CONCLUSION
In this experiment we learned how a minimum error rate classifier works and what the mean and covariance (sigma) of a normal distribution represent. However, this classifier has some limitations: it depends entirely on probability estimates, and the class-conditional normal distributions must be known in advance.
V. ALGORITHM IMPLEMENTATION / CODE
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (needed for the 3D projection on old Matplotlib)

# Load the given 2D sample points (referred to as test.txt in the report).
p_train = pd.read_csv('assignment3.txt', header=None, sep=',', dtype='float64')
p_train = np.array(p_train)

# Given parameters of the two class-conditional normal distributions.
sigma1 = np.array([[.25, .3], [.3, 1]])
sigma2 = np.array([[.5, 0], [0, .5]])
mu1 = np.array([0, 0])
mu2 = np.array([2, 2])

# Equal prior probabilities.
pw1 = 0.5
pw2 = 0.5

n_samples = len(p_train)
print(n_samples)

# Classify each sample with the discriminant g_i(x) = ln p(x|w_i) + ln P(w_i).
classified = []
for i in range(n_samples):
    d1 = p_train[i] - mu1
    d2 = p_train[i] - mu2
    g1 = (-0.5 * d1 @ np.linalg.inv(sigma1) @ d1
          - np.log(2 * np.pi) - 0.5 * np.log(np.linalg.det(sigma1))
          + np.log(pw1))   # ln P(w1)
    g2 = (-0.5 * d2 @ np.linalg.inv(sigma2) @ d2
          - np.log(2 * np.pi) - 0.5 * np.log(np.linalg.det(sigma2))
          + np.log(pw2))   # ln P(w2)
    label = 1 if g1 > g2 else 2
    classified.append([p_train[i][0], p_train[i][1], label])

# Split the classified points by class for plotting.
x1 = [p[0] for p in classified if p[2] == 1]
y1 = [p[1] for p in classified if p[2] == 1]
x2 = [p[0] for p in classified if p[2] == 2]
y2 = [p[1] for p in classified if p[2] == 2]

print(x1, y1)
print(x2, y2)

# Grid on which the two densities are evaluated.
N = 60
X = np.linspace(-6, 6, N)
Y = np.linspace(-6, 6, N)
X, Y = np.meshgrid(X, Y)

pos = np.empty(X.shape + (2,))
pos[:, :, 0] = X
pos[:, :, 1] = Y


def multivariate_normal(pos, mu, Sigma):
    """Evaluate the multivariate normal pdf N(x | mu, Sigma) on a grid."""
    n = mu.shape[0]
    Sigma_det = np.linalg.det(Sigma)
    Sigma_inv = np.linalg.inv(Sigma)
    norm_const = np.sqrt((2 * np.pi) ** n * Sigma_det)
    # (x - mu)^T Sigma^{-1} (x - mu) for every grid point.
    fac = np.einsum('...k,kl,...l->...', pos - mu, Sigma_inv, pos - mu)
    return np.exp(-fac / 2) / norm_const


Z1 = multivariate_normal(pos, mu1, sigma1)
Z2 = multivariate_normal(pos, mu2, sigma2)
db = Z1 - Z2  # the zero level set of db is the decision boundary (equal priors)

fig = plt.figure(figsize=(15, 10))
ax = fig.add_subplot(projection='3d')

# Surface plots of the two densities.
ax.plot_surface(X, Y, Z1, rstride=3, cstride=3, linewidth=0, alpha=0.4,
                antialiased=True, cmap='viridis')
ax.plot_surface(X, Y, Z2, rstride=3, cstride=3, linewidth=0, alpha=0.4,
                antialiased=True, cmap='viridis')

# Contour plots projected below the surfaces.
cset1 = ax.contourf(X, Y, Z1, zdir='z', offset=-0.5, alpha=0.3, cmap='viridis')
cset2 = ax.contourf(X, Y, Z2, zdir='z', offset=-0.5, alpha=0.3, cmap='viridis')

# Decision boundary Z1 - Z2 = 0 projected onto the same plane.
db2 = ax.contour(X, Y, db, zdir='z', offset=-0.5, alpha=1, cmap='Greens')

# Adjust the limits, ticks and view angle.
ax.set_zlim(-0.5, 0.4)
ax.set_zticks(np.linspace(0, 0.3, 7))
ax.view_init(30, -110)

# Classified sample points.
ax.scatter(x1, y1, color='r', marker='.', alpha=0.8, s=40, label='Train class 1')
ax.scatter(x2, y2, color='b', marker='*', alpha=0.8, s=40, label='Train class 2')

ax.legend()
plt.show()
