Module 12: Machine Learning

Lesson 36: Rule Induction and Decision Tree - II
Splitting Functions
What attribute is the best one to split the data on? Let us recall some definitions from
information theory.

The uncertainty, or entropy, associated with a random variable X is defined as

      H(X) = - Σi pi log2 pi

where the logarithm is in base 2 and pi is the probability of the i-th value of X.

This is the “average amount of information or entropy of a finite complete probability
scheme”.

We will use an entropy-based splitting function.
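As a concrete illustration (not part of the original lecture notes), the entropy of a class
distribution given as counts can be computed with a small Python helper; the function name
and interface below are our own choices:

```python
import math

def entropy(counts):
    """Entropy (base 2) of a class distribution given as a list of counts,
    e.g. [6, 0] for a subset with 6 positive and 0 negative examples."""
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c > 0:                 # by convention, 0 * log2(0) is taken as 0
            p = c / total
            h -= p * math.log2(p)
    return h

print(entropy([1, 1]))            # 1.0: two equally likely classes, maximal uncertainty
```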

Consider the previous example:



[Figure: the training sample S, with split points t1, t2, t3 marked; the attribute Size
partitions S into two subsets.]

The attribute Size divides the sample into two subsets:

S1 = {6P, 0NP}
S2 = {3P, 5NP}

H(S1) = 0
H(S2) = -(3/8) log2(3/8) - (5/8) log2(5/8) ≈ 0.954
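These values can be checked with the entropy helper sketched above (the counts are the
[P, NP] pairs just given):

```python
print(entropy([6, 0]))   # H(S1) = 0.0
print(entropy([3, 5]))   # H(S2) ≈ 0.954
```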




[Figure: the training sample, with split points t1, t2, t3 marked; the attribute humidity
partitions it into three subsets S1, S2, S3.]
The attribute humidity divides the sample into three subsets:

S1 = {2P, 2NP}
S2 = {5P, 0NP}
S3 = {2P, 3NP}

H(S1) = 1
H(S2) = 0
H(S3) = -(2/5) log2(2/5) - (3/5) log2(3/5) ≈ 0.971
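The same check for the three humidity subsets, again using the entropy helper defined
earlier:

```python
print(entropy([2, 2]))   # H(S1) = 1.0
print(entropy([5, 0]))   # H(S2) = 0.0
print(entropy([2, 3]))   # H(S3) ≈ 0.971
```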


Let us define information gain as follows.

Information gain of attribute A, IG(A):

      IG(A) = H(S) - Σv (|Sv|/|S|) H(Sv)

H(S) is the entropy of the full set of examples S. H(Sv) is the entropy of the subset Sv
obtained by partitioning S on value v of attribute A; the sum runs over all possible values
v of A.

Consider the previous example:


[Figure: the training sample, with split points t1, t2, t3 marked; the Size split partitions
it into the subsets S1 and S2.]
We have

H(S1) = 0
H(S2) = -(3/8) log2(3/8) - (5/8) log2(5/8) ≈ 0.954

H(S) = -(9/14) log2(9/14) - (5/14) log2(5/14) ≈ 0.940

|S1|/|S| = 6/14
|S2|/|S| = 8/14

so IG(Size) = H(S) - (6/14) H(S1) - (8/14) H(S2) ≈ 0.940 - 0 - 0.545 ≈ 0.395.
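The same computation in code, reusing the entropy helper from earlier (the ≈ 0.395 value is
our own arithmetic, implied by the numbers above but not stated in the notes):

```python
H_S  = entropy([9, 5])                     # ≈ 0.940
H_S1 = entropy([6, 0])                     # = 0.0
H_S2 = entropy([3, 5])                     # ≈ 0.954

IG_size = H_S - (6 / 14) * H_S1 - (8 / 14) * H_S2
print(round(IG_size, 3))                   # 0.395
```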

The principle for decision tree construction may be stated as follows:

Order the candidate splits (attribute and value of the attribute) in decreasing order of
information gain, and choose the split with the highest gain at each node.

12.3.4 Decision Tree Pruning

Practical issues in building a decision tree can be enumerated as follows:

    1) How deep should the tree be?
    2) How do we handle continuous attributes?
    3) What is a good splitting function?
    4) What happens when attribute values are missing?
    5) How do we improve the computational efficiency?

The depth of the tree is related to its generalization capability. If the depth is not chosen
carefully, the tree may overfit.

A tree overfits the data if we let it grow deep enough that it begins to capture
“aberrations” in the data that harm its predictive power on unseen examples:



[Figure: a deep split isolating a few examples that are possibly just noise, but the tree is
grown larger to capture them.]
There are two main solutions to overfitting in a decision tree:

    1) Stop growing the tree early, before it begins to overfit the data.

       + In practice this solution is hard to implement because it is not clear what a good
         stopping point is.

    2) Grow the tree fully, even if overfitting shows up, and then prune the tree as a
       post-processing step.

       + This method has found great popularity in the machine learning community.


A common decision tree pruning algorithm is described below.

1. Consider all internal nodes of the tree.
2. For each such node, check whether replacing it (and the subtree below it) with a leaf
   labelled with the node's most common class harms accuracy on a validation set.
3. Pick the node n* whose removal yields the best validation performance and prune its
   subtree.
4. Go back to step 2 until no more improvements are possible.
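The procedure above amounts to reduced-error pruning. Below is a minimal Python sketch of the
loop, assuming a hypothetical Node class with a `children` dict, a `majority_class` attribute,
and a `predict(x)` method that returns `majority_class` once the node has no children, plus a
validation set of (features, label) pairs; it illustrates the steps above and is not the
lecture's own code:

```python
def accuracy(tree, validation):
    """Fraction of (features, label) validation examples the tree classifies correctly."""
    return sum(tree.predict(x) == y for x, y in validation) / len(validation)

def internal_nodes(node):
    """Collect every non-leaf node of the (hypothetical) tree."""
    if not node.children:
        return []
    nodes = [node]
    for child in node.children.values():
        nodes += internal_nodes(child)
    return nodes

def reduced_error_prune(tree, validation):
    """Repeatedly turn into a leaf the internal node whose removal best preserves
    (or improves) validation accuracy, until no candidate prune remains."""
    while True:
        base = accuracy(tree, validation)
        best_node, best_acc = None, base
        for node in internal_nodes(tree):
            saved, node.children = node.children, {}   # temporary prune: predict() now
            acc = accuracy(tree, validation)           # falls back to majority_class
            node.children = saved                      # undo the temporary prune
            if acc >= best_acc:                        # pruning must not harm accuracy
                best_node, best_acc = node, acc
        if best_node is None:                          # every candidate prune hurt accuracy
            return tree
        best_node.children = {}                        # prune n* permanently
```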


Decision trees are appropriate for problems where:

   • Attributes are both numeric and nominal.
   • The target function takes on a discrete set of values.
   • A DNF representation is effective in representing the target concept.
   • The data may contain errors.



