Data Mining With A Simulated Annealing Based Fuzzy Classification System
Pattern Recognition 41 (2008) 1824-1833

Presenter: Chia-Ming Wang
Goal and Contribution
• Construct a fuzzy classifier (Goal)
 • map the attributes to predefined fuzzy sets
 • rules with a confidence grade and a target class label (How?)
 • a combinatorial optimization problem (6^n candidate antecedents)
• Use SA to find a set of fuzzy rules (Contribution)
 • the authors' claimed contribution
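To get a feel for why this is a hard combinatorial problem: with six linguistic values per attribute, a rule antecedent over n attributes has 6^n possible configurations. A quick back-of-envelope check (the attribute counts come from the UCI table later in the deck):

```python
# Each of n attributes takes one of 6 linguistic values (small, medium
# small, medium, medium large, large, don't care), so there are 6**n
# candidate antecedents per rule.
def rule_space(n: int) -> int:
    return 6 ** n

print(rule_space(4))   # iris has 4 attributes  -> 1296
print(rule_space(13))  # wine has 13 attributes -> 13060694016
```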
The used antecedent fuzzy sets
[Figure: five triangular membership functions S, MS, M, ML, L over the normalized attribute value range [0.0, 1.0], plus a "don't care" (DC) set whose membership is 1.0 everywhere.]
 1. small
 2. medium small
 3. medium
 4. medium large
 5. large
 6. don't care
Example: if x1 is small and x2 is medium and x3 is don't care → Encode: 136
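A minimal sketch of these fuzzy sets in Python. The triangular shapes and evenly spaced peaks are assumptions consistent with the figure, not the authors' exact definitions:

```python
# Sketch of the five triangular antecedent fuzzy sets plus "don't care".
def triangular(x, a, b, c):
    """Triangular membership with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

PEAKS = [0.0, 0.25, 0.5, 0.75, 1.0]  # S, MS, M, ML, L (assumed spacing)

def membership(label, x):
    """label 1..5 = S..L, label 6 = don't care (always fully compatible)."""
    if label == 6:
        return 1.0
    b = PEAKS[label - 1]
    return triangular(x, b - 0.25, b, b + 0.25)

# Rule encoded "136": x1 is small, x2 is medium, x3 is don't care.
rule = [1, 3, 6]
x = [0.1, 0.5, 0.9]
grade = 1.0
for label, xi in zip(rule, x):
    grade *= membership(label, xi)  # product = compatibility with the rule
```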
Determination of Cj and CFj
1. Calculate the compatibility of each training pattern xp with the rule Rj:
   µj(xp) = µj1(xp1) × ⋯ × µjn(xpn),   p = 1, …, m
2. For each class, calculate the relative sum of compatibility grades of the training patterns in class h with the rule Rj:
   βClass h(Rj) = Σ_{xp ∈ Class h} µj(xp) / NClass h
3. Find the class ĥj with the maximum grade (if the maximum is 0 or two classes tie, set Cj = φ):
   βClass ĥj(Rj) = max{ βClass 1(Rj), …, βClass c(Rj) }
4. If Cj = φ, set CFj of rule Rj to 0. Otherwise:
   CFj = ( βClass ĥj(Rj) − β̄ ) / Σ_{h=1}^{c} βClass h(Rj),   where β̄ = Σ_{h ≠ ĥj} βClass h(Rj) / (c − 1)
5. Classify the sample xp with rule set S (reject if µj(xp) = 0 for all Rj ∈ S):
   µj*(xp) · CFj* = max{ µj(xp) · CFj | Rj ∈ S }
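Steps 1–4 on a tiny worked example; the compatibility grades and class labels below are made up for illustration, not data from the paper:

```python
# Determination of the consequent class h_hat and confidence CF for one rule.
mu  = [0.9, 0.6, 0.0, 0.2, 0.1, 0.0]   # mu_j(x_p): compatibility with Rj
cls = [0,   0,   0,   1,   1,   1]     # class label of each pattern
c = 2
N = [cls.count(h) for h in range(c)]   # patterns per class

# Step 2: relative sum of compatibility grades per class
beta = [sum(m for m, y in zip(mu, cls) if y == h) / N[h] for h in range(c)]

# Step 3: consequent class = the class with the maximum grade
h_hat = max(range(c), key=lambda h: beta[h])

# Step 4: confidence CF (beta_bar = mean grade of the other classes)
beta_bar = sum(beta[h] for h in range(c) if h != h_hat) / (c - 1)
CF = (beta[h_hat] - beta_bar) / sum(beta)
print(h_hat, round(CF, 3))   # -> 0 0.667
```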
Structure of the goal classifier
[Figure: the test dataset feeds c parallel classifiers; classifier #h (h = 1, …, c) holds the set of rules for class #h. A decision-fusion stage combines the c outputs into the detected class.]

Test Dataset → { Classifier #1 (rules for class #1), …, Classifier #c (rules for class #c) } → Decision Fusion → Detected Class
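The fusion stage can be sketched as a winner-takes-all over all per-class rule sets, using the µ·CF score from the previous slide. The interfaces below (a rule as a membership function paired with its CF) are assumptions for illustration:

```python
# Decision fusion: the detected class is the one whose best rule maximizes
# mu_j(x) * CF_j; if every score is 0, the sample is rejected (None).
def fuse(rule_sets, x):
    """rule_sets: {class_label: [(mu_fn, CF), ...]}; returns class or None."""
    best_score, best_class = 0.0, None
    for label, rules in rule_sets.items():
        for mu_fn, cf in rules:
            score = mu_fn(x) * cf
            if score > best_score:
                best_score, best_class = score, label
    return best_class

# Toy 1-D example with one triangular rule per class (assumed shapes).
rule_sets = {
    0: [(lambda x: max(0.0, 1 - 2 * abs(x - 0.2)), 0.8)],
    1: [(lambda x: max(0.0, 1 - 2 * abs(x - 0.8)), 0.9)],
}
print(fuse(rule_sets, 0.75))   # -> 1
```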
Procedure of SAFCS
T = Tmax
Scurrent = Sinit
Sbest = Scurrent
EFcurrent = NNCP(Scurrent)
EFbest = NNCP(Sbest)
Time = 0
While (T ≥ Tmin)
    For i = 1 to k
        Call Metropolis(Scurrent, EFcurrent, Sbest, EFbest, T)
    Time = Time + k
    k = β × k
    T = α × T
Return(Sbest)
Procedure of SAFCS
T = Tmax                    # Tmax = −ΔEF̄b / ln(Pinit), where ΔEF̄b = (1/Mb) Σ_{i=1}^{Mb} ΔEFb
                            # (mean uphill cost over the Mb bad moves among M = Mg + Mb trial moves)
Scurrent = Sinit            # Ninit initial rules
Sbest = Scurrent
EFcurrent = NNCP(Scurrent)  # NNCP(S) = m − Σ_{j=1}^{N} NCP(Rj)
EFbest = NNCP(Sbest)
While (T ≥ Tmin)            # Tmin = 0.01
    For i = 1 to k          # k is the number of Metropolis calls per temperature
        Call Metropolis(Scurrent, EFcurrent, Sbest, EFbest, T)
    Time = Time + k         # Time is the iteration count so far
    k = β × k               # β is a constant (set to 1)
    T = α × T               # α is the cooling rate (set to 0.9)
Return(Sbest)
Metropolis Procedure
Snew = Perturb(Scurrent)               # generate a new rule set
EFnew = NNCP(Snew)
ΔEF = EFnew − EFcurrent
IF (ΔEF < 0), Then                     # better rule set
    Scurrent = Snew
    IF (EFnew < EFbest), Then          # better than the best so far
        Sbest = Snew
ELSEIF (rand(0,1) < exp(−ΔEF/T)), Then # accept a worse rule set, too
    Scurrent = Snew
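The two procedures above combine into a generic SA loop. This sketch uses the slide's parameter defaults but a toy energy function in place of NNCP, so it is a stand-in, not the authors' implementation:

```python
import math
import random

def simulated_annealing(s_init, energy, perturb,
                        t_max=100.0, t_min=0.01, alpha=0.9, k=40):
    """Generic SA driver mirroring the SAFCS/Metropolis pseudocode."""
    s_cur = s_best = s_init
    ef_cur = ef_best = energy(s_init)
    t = t_max
    while t >= t_min:
        for _ in range(k):                       # Metropolis step
            s_new = perturb(s_cur)
            ef_new = energy(s_new)
            d_ef = ef_new - ef_cur
            if d_ef < 0 or random.random() < math.exp(-d_ef / t):
                s_cur, ef_cur = s_new, ef_new    # accept (better, or lucky)
                if ef_cur < ef_best:
                    s_best, ef_best = s_cur, ef_cur
        t *= alpha                               # cool down
    return s_best

# Toy run: minimize (x - 3)^2 over the integers by +/-1 moves.
random.seed(0)
best = simulated_annealing(0, lambda x: (x - 3) ** 2,
                           lambda x: x + random.choice([-1, 1]))
```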
Perturbation (3 functions)
1. Modify
   • select a rule from S at random
   • modify one or more of its antecedents
   • if the consequent class is unchanged, replace the old rule; otherwise, repeat
2. Delete
   select the rule to delete with probability
   P(R) = ( fitnessmax(SClass h) − fitness(R) ) / ( fitnessmax(SClass h) − fitnessmin(SClass h) )
3. Create
   the same as Modify, but adds a new rule
   (NB: changes more linguistic values than "Modify"; the authors say this helps the search jump out of local optima)
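The Delete rule can be made concrete with a few assumed fitness values (not from the paper): the best rule of a class is never deleted, the worst is always a candidate:

```python
# Deletion probability from the slide's formula: removal probability grows
# as a rule's fitness falls below the best rule of its class.
def delete_prob(fitness, f_max, f_min):
    if f_max == f_min:          # all rules equally fit: treat as uniform
        return 1.0
    return (f_max - fitness) / (f_max - f_min)

fitnesses = {"R1": 0.9, "R2": 0.5, "R3": 0.1}   # assumed values
f_max, f_min = max(fitnesses.values()), min(fitnesses.values())
probs = {r: delete_prob(f, f_max, f_min) for r, f in fitnesses.items()}
print(probs)   # R1 (best) -> 0.0, R2 -> 0.5, R3 (worst) -> 1.0
```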
Experiments
Parameters
Parameter                               Value
Initial rule set size (Ninit)           50
Initial temperature (Tmax)              100
Final temperature (Tmin)                0.01
Cooling rate (α)                        0.90
# Iterations at each temperature (k)    40
Iteration increment rate (β)            1

Estimate: 88 temperature levels × 40 = 3520 iterations (keep in mind)
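The 3520-iteration estimate follows directly from the cooling schedule and is easy to verify:

```python
# Count temperature levels from Tmax = 100 down to Tmin = 0.01
# at cooling rate alpha = 0.9, then multiply by k = 40 Metropolis calls.
t, steps = 100.0, 0
while t >= 0.01:
    steps += 1
    t *= 0.9
print(steps, steps * 40)   # -> 88 3520
```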
Competitors
• C4.5: decision-tree learner
• IBk: nearest neighbor, k = 3
• Naïve Bayes
• LIBSVM
• XCS: Michigan-approach learning classifier system
• GAssist: Pittsburgh-approach learning classifier system
Dataset (UCI)
H. Mohamadi et al. / Pattern Recognition 41 (2008) 1824–1833

Table 1
Features of the data sets used in computational experiments

Name    #Instance   #Attribute   #Real   #Nominal   #Class   Dev. cla. (%)   Maj. cla. (%)   Min. cla. (%)
bswd      625          4           4       –          3        18.03           46.08            7.84
cra       690         15           6       9          2         5.51           55.51           44.49
ion       351         34          34       –          2        14.10           64.10           35.90
iris      150          4           4       –          3         –              33.33           33.33
lab        57         16           8       8          2        14.91           64.91           35.09
pima      768          8           8       –          2        15.10           65.10           34.90
wave     5000         40          40       –          3         0.36           33.84           33.06
wine      178         13          13       –          3         5.28           39.89           26.97

Dev. cla., deviation of class distribution; Maj. cla., percentage of majority class instances; Min. cla., percentage of minority class instances.

Table 2
Parameter specification in computer simulations for the SAFCS (10-fold cross validation)

Parameter                               Value
Initial rule set size (Ninit)           50
Initial temperature (Tmax)              100
Final temperature (Tmin)                0.01
Cooling rate (α)                        0.90
# Iterations at each temperature (k)    40
Iteration increment rate (β)            1

IBk [31] is the nearest neighbor classifier technique. It uses the whole training set as the core of the classifier and Euclidean distance to select the k nearest instances. The class prediction provided by the system is the majority class in these k examples. Here, k is set equal to 3.
Naïve Bayes [32] is a very simple Bayesian network approach that assumes that the predictive attributes are conditionally independent given the class and also that no hidden or latent attributes influence the prediction process. These assumptions
Progress 1/2
Progress 2/2
Accuracies

Table 3
Train set and test set accuracies of different algorithms on eight UCI data sets (mean ± standard deviation)

Data set   Accuracy   C4.5            IBk            Naïve Bayes     SVM            GAssist        XCS             SAFCS
bswd       Train %    89.93 ± 0.68    90.53 ± 0.54   91.92 ± 0.25    91.01 ± 0.19   92.14 ± 0.28   95.19 ± 1.28    94.63 ± 0.46
           Test %     77.66 ± 2.91    86.09 ± 2.72   91.43 ± 1.25    90.90 ± 1.43   89.62 ± 2.22   81.10 ± 3.80    90.47 ± 1.36
cra        Train %    90.31 ± 0.86    91.05 ± 0.52   82.58 ± 0.82    55.51 ± 0.08   91.07 ± 0.73   98.90 ± 0.73    94.25 ± 0.54
           Test %     85.55 ± 3.45    84.73 ± 4.04   81.07 ± 5.32    55.51 ± 0.70   85.62 ± 4.00   85.60 ± 3.5     85.77 ± 3.27
ion        Train %    98.68 ± 0.54    90.94 ± 0.59   93.00 ± 0.42    94.19 ± 0.64   96.90 ± 0.74   99.86 ± 0.24    99.66 ± 0.34
           Test %     88.97 ± 5.91    85.66 ± 4.66   91.50 ± 4.70    92.14 ± 4.62   92.71 ± 5.01   90.10 ± 4.70    91.89 ± 4.65
iris       Train %    98.00 ± 0.61    96.59 ± 0.49   96.67 ± 0.53    97.11 ± 0.64   98.33 ± 0.79   99.10 ± 1.19    99.85 ± 0.19
           Test %     94.22 ± 5.37    94.89 ± 6.37   96.22 ± 5.36    96.22 ± 4.77   95.20 ± 5.87   94.70 ± 5.10    96.66 ± 3.09
lab        Train %    91.58 ± 4.00    98.77 ± 1.55   95.92 ± 1.60    96.04 ± 0.93   100 ± 0.00     99.92 ± 0.24    99.96 ± 0.08
           Test %     80.31 ± 17.44   95.38 ± 7.75   93.76 ± 10.50   93.35 ± 8.32   97.77 ± 5.98   83.50 ± 14.80   97.83 ± 5.33
pima       Train %    84.43 ± 2.41    85.67 ± 0.65   77.07 ± 0.61    78.27 ± 0.53   83.11 ± 0.82   98.90 ± 0.67    87.55 ± 0.59
           Test %     75.44 ± 4.79    74.52 ± 3.91   75.30 ± 4.45    77.32 ± 4.70   74.46 ± 5.19   72.40 ± 5.30    75.71 ± 4.41
wave       Train %    97.29 ± 0.61    N/A            81.59 ± 0.21    N/A            78.28 ± 0.60   N/A             85.02 ± 0.18
           Test %     75.93 ± 2.10    N/A            79.89 ± 1.40    N/A            76.01 ± 1.97   N/A             80.00 ± 1.16
wine       Train %    98.86 ± 0.54    97.27 ± 0.53   98.67 ± 0.45    99.33 ± 0.32   100 ± 0.00     100 ± 0.00      99.98 ± 0.04
           Test %     94.24 ± 6.44    96.61 ± 4.02   97.20 ± 3.43    98.10 ± 3.40   96.33 ± 4.13   95.60 ± 4.90    97.63 ± 3.02

The best values are in bold.

Data Mining With A Simulated Annealing Based Fuzzy Classification System

  • 1. Pattern Recognition 41 (2008) 1824-1833 Presenter: Chia-Ming Wang
  • 3. Goal and Contribution • Construct a fuzzy classifier (Goal)
  • 4. Goal and Contribution • Construct a fuzzy classifier (Goal) • map the attrs. to predefined fuzzy sets
  • 5. Goal and Contribution • Construct a fuzzy classifier (Goal) • map the attrs. to predefined fuzzy sets • rules with conf. and target label (How?)
  • 6. Goal and Contribution • Construct a fuzzy classifier (Goal) • map the attrs. to predefined fuzzy sets • rules with conf. and target label (How?) • combination optimization problem (6 ) n
  • 7. Goal and Contribution • Construct a fuzzy classifier (Goal) • map the attrs. to predefined fuzzy sets • rules with conf. and target label (How?) • combination optimization problem (6 ) n • Use SA to find a set of fuzzy rules (Contribution)
  • 8. Goal and Contribution • Construct a fuzzy classifier (Goal) • map the attrs. to predefined fuzzy sets • rules with conf. and target label (How?) • combination optimization problem (6 ) n • Use SA to find a set of fuzzy rules (Contribution) • authors said
  • 9. Goal and Contribution • Construct a fuzzy classifier (Goal) • map the attrs. to predefined fuzzy sets • rules with conf. and target label (How?) • combination optimization problem (6 ) n • Use SA to find a set of fuzzy rules (Contribution) • authors said
  • 10. The used antecedent fuzzy sets 1.0 Membership 1.small 2.medium small S MS M ML L 3.medium 4.medium large Attribute Value 0.0 1.0 5.large 6.don’t care Membership 1.0 if x1 is small and x2 is medium and DC x3 is don’t care Encode: 136 Attribute Value 0.0 1.0
  • 11. Determination of Cj and CFj 1. calculate the compatibility of each training pattern xp with the rule Rj µ j (x p ) = µ j1 (x p1 ) ×L × µ jn (x pn ), p = 1,K , m 2. for each class, calculate the relative sum of compatibility grades of the training patterns in class h with the rule Rj ∑ βClass h (R j ) = µ j (x p ) / N Class h x p ∈Class h 3. Find class hj hat { } if 0 or conflict, set Cj be φ βClass h j (R j ) = max βClass1 (R j ),L , βClass c (R j ) ) 4. if Cj = φ, set CFj of rule Rj to 0. Otherwise c ( ) where β = ∑ βClass h (R j ) / (c − 1) CFj = βClass h (R j ) − β / ∑ βClass h (R j ) ) ) j h=1 h≠ h j 5. classify the sample xp with rule set S { } reject if µ j (x p ) = 0 ∀R j ∈S µ j* (x p ) ⋅ CFj* = max µ j (x p ) ⋅ CFj R j ∈S
  • 12. Structure of the goal classifier
  • 13. Structure of the goal classifier Classifier #1 Set of rules for class #1 Classifier #2 Set of rules Decision for class #2 Detected Test . Fusion Class Dataset . . Classifier #c Set of rules for class #c
• 14. Procedure of SAFCS
T = Tmax                     # Tmax = −ΔEF̄_b / ln(P_init), where ΔEF̄_b = (1/M_b) Σ_{i=1}^{M_b} ΔEF_b^{(i)} is averaged over the M_b cost-increasing ("bad") trial moves, and M = M_g + M_b
Scurrent = Sinit             # Ninit initial rules
Sbest = Scurrent
EFcurrent = NNCP(Scurrent)   # NNCP(S) = m − Σ_{j=1}^{N} NCP(R_j): the number of training patterns not classified correctly
EFbest = NNCP(Sbest)
While (T ≥ Tmin)             # Tmin = 0.01
    For i = 1 to k           # k = number of Metropolis calls per temperature
        Call Metropolis(Scurrent, EFcurrent, Sbest, EFbest, T)
    Time = Time + k          # Time = number of Metropolis calls so far
    k = β × k                # β is a constant (set to 1)
    T = α × T                # α is the cooling rate (set to 0.9)
Return(Sbest)
• 16. Metropolis procedure
Snew = Perturb(Scurrent)                  # generate a new rule set
EFnew = NNCP(Snew)
ΔEF = EFnew − EFcurrent
IF (ΔEF < 0) Then                         # better rule set: accept
    Scurrent = Snew
    IF (EFnew < EFbest) Then              # better than the best so far
        Sbest = Snew
ELSEIF (rand(0,1) < exp(−ΔEF/T)) Then     # accept a worse rule set, too
    Scurrent = Snew
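The outer loop of slide 15 and the Metropolis step of slide 16 combine into the standard SA skeleton below (a generic sketch rather than the authors' implementation; `energy` stands in for NNCP, `perturb` for the three perturbation functions, and β is fixed at 1 so k stays constant):

```python
import math
import random

def simulated_annealing(s_init, energy, perturb,
                        t_max=100.0, t_min=0.01, alpha=0.9, k=40):
    """Minimize `energy` by SA with geometric cooling, as on the slides."""
    current = best = s_init
    ef_current = ef_best = energy(s_init)
    t = t_max
    while t >= t_min:
        for _ in range(k):                   # k Metropolis calls per level
            new = perturb(current)
            delta = energy(new) - ef_current
            # Accept improvements, and worse moves with prob. exp(-delta/T).
            if delta < 0 or random.random() < math.exp(-delta / t):
                current, ef_current = new, ef_current + delta
                if ef_current < ef_best:     # track the best set seen so far
                    best, ef_best = current, ef_current
        t *= alpha                           # cooling
    return best, ef_best
```

A quick sanity check: minimizing x² over the integers by randomly stepping ±1 should improve on the starting energy.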
• 17. Perturbation (3 functions)
1. Modify: select a rule from S randomly and modify one or more of its antecedents; if the consequent is unchanged, replace the old rule with it, otherwise repeat
2. Delete: select the rule to delete with probability $P(R) = \dfrac{fitness_{max}(S_{\text{Class }h}) - fitness(R)}{fitness_{max}(S_{\text{Class }h}) - fitness_{min}(S_{\text{Class }h})}$
3. Create: the same as Modify, but the resulting rule is added (NB: it changes more linguistic values than "Modify"; the authors say this helps the search jump)
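The deletion probability on this slide can be turned into a roulette-wheel selection like this (an illustrative sketch; normalizing the weights so they sum to 1 is my assumption, since the slide's P(R) is not itself a distribution):

```python
import random

def delete_probabilities(fitnesses):
    """Weight each rule by (f_max - f) / (f_max - f_min), then normalize."""
    f_max, f_min = max(fitnesses), min(fitnesses)
    if f_max == f_min:                  # all rules equally fit: uniform
        return [1.0 / len(fitnesses)] * len(fitnesses)
    weights = [(f_max - f) / (f_max - f_min) for f in fitnesses]
    total = sum(weights)
    return [w / total for w in weights]

def pick_rule_to_delete(rules, fitnesses):
    """Choose the rule to delete; worse-fitness rules are more likely."""
    return random.choices(rules, weights=delete_probabilities(fitnesses), k=1)[0]
```

Note that the fittest rule gets weight 0 and is never deleted, which matches the formula's intent of protecting good rules.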
• 19. Parameters
Initial set of rules size (Ninit): 50
Initial temperature (Tmax): 100
Final temperature (Tmin): 0.01
Cooling rate (α): 0.90
# Iterations at each temperature (k): 40
Iteration increment rate (β): 1
Estimate: 88 temperature levels × 40 = 3520 iterations (keep in mind)
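Where the 3520 estimate comes from: with α = 0.90, the temperature stays at or above Tmin = 0.01 for 88 levels (100 × 0.9^87 ≈ 0.0104), each running k = 40 Metropolis calls. A quick check:

```python
t_max, t_min, alpha, k = 100.0, 0.01, 0.90, 40

levels = 0
t = t_max
while t >= t_min:       # same stopping rule as the SAFCS outer loop
    levels += 1
    t *= alpha          # geometric cooling

total_iterations = levels * k   # 88 temperature levels x 40 calls
```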
• 20. Competitors • C4.5: decision tree • IBk: nearest neighbor, k = 3 • Naïve Bayes • LIBSVM • XCS: Michigan approach • GAssist: Pittsburgh approach
• 21. Dataset (UCI); Table 1 of H. Mohamadi et al. / Pattern Recognition 41 (2008) 1824-1833
Features of the data sets used in computational experiments:
Name | #Instance | #Attribute | #Real | #Nominal | #Class | Dev. cla. (%) | Mag. cla. (%) | Min. cla. (%)
bswd | 625 | 4 | 4 | – | 3 | 18.03 | 46.08 | 7.84
cra | 690 | 15 | 6 | 9 | 2 | 5.51 | 55.51 | 44.49
ion | 351 | 34 | 34 | – | 2 | 14.10 | 64.10 | 35.90
iris | 150 | 4 | 4 | – | 3 | – | 33.33 | 33.33
lab | 57 | 16 | 8 | 8 | 2 | 14.91 | 64.91 | 35.09
pima | 768 | 8 | 8 | – | 2 | 15.10 | 65.10 | 34.90
wave | 5000 | 40 | 40 | – | 3 | 0.36 | 33.84 | 33.06
wine | 178 | 13 | 13 | – | 3 | 5.28 | 39.89 | 26.97
Dev. cla., deviation of class distribution; Mag. cla., percentage of majority-class instances; Min. cla., percentage of minority-class instances. Results use 10-fold cross validation.
• 24. Accuracies; Table 3 of H. Mohamadi et al. / Pattern Recognition 41 (2008) 1824-1833
Train and test set accuracies (%) of different algorithms on eight UCI data sets (mean ± standard deviation; best values were set in bold in the original):
Data set | Row | C4.5 | IBk | Naïve Bayes | SVM | GAssist | XCS | SAFCS
bswd | train | 95.19 ± 1.28 | 89.93 ± 0.68 | 90.53 ± 0.54 | 91.92 ± 0.25 | 91.01 ± 0.19 | 92.14 ± 0.28 | 94.63 ± 0.46
bswd | test | 91.43 ± 1.25 | 77.66 ± 2.91 | 86.09 ± 2.72 | 90.90 ± 1.43 | 89.62 ± 2.22 | 81.10 ± 3.80 | 90.47 ± 1.36
cra | train | 98.90 ± 0.73 | 90.31 ± 0.86 | 91.05 ± 0.52 | 82.58 ± 0.82 | 55.51 ± 0.08 | 91.07 ± 0.73 | 94.25 ± 0.54
cra | test | 85.77 ± 3.27 | 85.55 ± 3.45 | 84.73 ± 4.04 | 81.07 ± 5.32 | 55.51 ± 0.70 | 85.62 ± 4.00 | 85.60 ± 3.50
ion | train | 99.86 ± 0.24 | 98.68 ± 0.54 | 90.94 ± 0.59 | 93.00 ± 0.42 | 94.19 ± 0.64 | 96.90 ± 0.74 | 99.66 ± 0.34
ion | test | 92.71 ± 5.01 | 88.97 ± 5.91 | 85.66 ± 4.66 | 91.50 ± 4.70 | 92.14 ± 4.62 | 90.10 ± 4.70 | 91.89 ± 4.65
iris | train | 99.85 ± 0.19 | 98.00 ± 0.61 | 96.59 ± 0.49 | 96.67 ± 0.53 | 97.11 ± 0.64 | 98.33 ± 0.79 | 99.10 ± 1.19
iris | test | 96.66 ± 3.09 | 94.22 ± 5.37 | 94.89 ± 6.37 | 96.22 ± 5.36 | 96.22 ± 4.77 | 95.20 ± 5.87 | 94.70 ± 5.10
lab | train | 100 ± 0.00 | 91.58 ± 4.00 | 98.77 ± 1.55 | 95.92 ± 1.60 | 96.04 ± 0.93 | 99.92 ± 0.24 | 99.96 ± 0.08
lab | test | 97.83 ± 5.33 | 80.31 ± 17.44 | 95.38 ± 7.75 | 93.76 ± 10.50 | 93.35 ± 8.32 | 97.77 ± 5.98 | 83.50 ± 14.80
pima | train | 98.90 ± 0.67 | 84.43 ± 2.41 | 85.67 ± 0.65 | 77.07 ± 0.61 | 78.27 ± 0.53 | 83.11 ± 0.82 | 87.55 ± 0.59
pima | test | 77.32 ± 4.70 | 75.44 ± 4.79 | 74.52 ± 3.91 | 75.30 ± 4.45 | 74.46 ± 5.19 | 72.40 ± 5.30 | 75.71 ± 4.41
wave | train | N/A | N/A | N/A | 97.29 ± 0.61 | 81.59 ± 0.21 | 78.28 ± 0.60 | 85.02 ± 0.18
wave | test | N/A | N/A | N/A | 80.00 ± 1.16 | 75.93 ± 2.10 | 79.89 ± 1.40 | 76.01 ± 1.97
wine | train | 100 ± 0.00 | 100 ± 0.00 | 98.86 ± 0.54 | 97.27 ± 0.53 | 98.67 ± 0.45 | 99.33 ± 0.32 | 99.98 ± 0.04
wine | test | 98.10 ± 3.40 | 94.24 ± 6.44 | 96.61 ± 4.02 | 97.20 ± 3.43 | 96.33 ± 4.13 | 95.60 ± 4.90 | 97.63 ± 3.02
[Chart legend: C4.5, IBk, NB, LIBSVM, GAssist, XCS, SAFCS]
