Decision Tree
HANSAM CHO
GROOT SEMINAR
Definition
Decision tree learning is a method commonly used in data mining. The goal is to
create a model that predicts the value of a target variable based on several input
variables.
Issues
1. How to split the training records → Impurity measure, Algorithm
2. When to stop splitting → Stopping condition, Pruning
Impurity Measure
• A measure of how good the result of a split is (homogeneity)
• Misclassification error
• Gini impurity
• Information gain
• Variance reduction
Misclassification error
Misclassification error
Gini impurity
• Used by the CART (classification and regression tree) algorithm for classification trees
• Gini impurity is a measure of how often a randomly chosen element from the set
would be incorrectly labeled if it was randomly labeled according to the distribution
of labels in the subset.
Probability that the drawn element belongs to a given class × probability that it is then misclassified
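The definition above can be sketched in a few lines (an illustrative sketch, not code from the slides): Gini impurity is Σ p_k(1 − p_k) = 1 − Σ p_k².

```python
from collections import Counter

def gini_impurity(labels):
    """Gini impurity: probability that a randomly drawn element, randomly
    relabeled according to the class distribution, is mislabeled."""
    n = len(labels)
    # 1 minus the sum of squared class probabilities
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

print(gini_impurity(["A", "A", "B", "B"]))  # balanced two classes -> 0.5 (maximal)
print(gini_impurity(["A", "A", "A", "A"]))  # pure node -> 0.0
```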
Gini impurity
Information gain
• Used by the ID3, C4.5 and C5.0 algorithms
• Information (the lower the probability of an event, the more information it carries / lottery example)
• Entropy (expectation of information) / Deviance
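Entropy and the resulting information gain can be sketched as follows (an illustrative sketch; function names are mine, not from the slides):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Entropy H = -sum p_k log2(p_k): the expectation of information."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(parent, children):
    """Parent entropy minus the size-weighted entropy of the child subsets."""
    n = len(parent)
    return entropy(parent) - sum(len(ch) / n * entropy(ch) for ch in children)

labels = ["yes", "yes", "no", "no"]
split = [["yes", "yes"], ["no", "no"]]
print(information_gain(labels, split))  # a perfect split recovers the full 1 bit
```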
Information gain
• Information gain
Information gain
Variance reduction
• Introduced in CART, variance reduction is often employed in cases where the target
variable is continuous (regression tree)
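For regression trees, the analogous score replaces entropy with variance (a sketch under the same weighting scheme as above):

```python
def variance(values):
    """Population variance of a list of target values."""
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

def variance_reduction(parent, children):
    """Parent variance minus the size-weighted variance of the child subsets."""
    n = len(parent)
    return variance(parent) - sum(len(ch) / n * variance(ch) for ch in children)

y = [1.0, 1.0, 5.0, 5.0]
print(variance_reduction(y, [[1.0, 1.0], [5.0, 5.0]]))  # splitting removes all variance -> 4.0
```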
Algorithm
•A measure of how good the result of a split is – Impurity measure
•How to split – Algorithm
• ID3
• C 4.5
• C 5.0
• CART
ID3 - algorithm
• Calculate the entropy of every attribute a of the data set S.
• Partition ("split") the set S into subsets using the attribute for which the resulting
entropy after splitting is minimized; or, equivalently, information gain is maximum
• Make a decision tree node containing that attribute.
• Recurse on subsets using the remaining attributes.
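The four steps above can be sketched as a recursive function (an illustrative sketch with categorical attributes stored in dicts; the attribute names `outlook`, `windy`, `play` are hypothetical):

```python
from collections import Counter
from math import log2

def entropy(rows, target):
    n = len(rows)
    return -sum((c / n) * log2(c / n)
                for c in Counter(r[target] for r in rows).values())

def id3(rows, attributes, target):
    """Recursive ID3 sketch: pick the attribute minimizing post-split entropy
    (equivalently, maximizing information gain)."""
    classes = {r[target] for r in rows}
    if len(classes) == 1:                      # pure subset -> leaf
        return classes.pop()
    if not attributes:                         # no attributes left -> majority leaf
        return Counter(r[target] for r in rows).most_common(1)[0][0]
    def split_entropy(a):
        groups = {}
        for r in rows:
            groups.setdefault(r[a], []).append(r)
        return sum(len(g) / len(rows) * entropy(g, target) for g in groups.values())
    best = min(attributes, key=split_entropy)
    node = {}
    for value in {r[best] for r in rows}:      # recurse on the remaining attributes
        subset = [r for r in rows if r[best] == value]
        node[value] = id3(subset, [a for a in attributes if a != best], target)
    return {best: node}

data = [
    {"outlook": "sunny", "windy": "no", "play": "yes"},
    {"outlook": "sunny", "windy": "yes", "play": "no"},
    {"outlook": "rain", "windy": "no", "play": "yes"},
    {"outlook": "rain", "windy": "yes", "play": "yes"},
]
print(id3(data, ["outlook", "windy"], "play"))
```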
ID3 - example
ID3 – stopping condition
• Every element in the subset belongs to the same class; in which case the node is
turned into a leaf node and labelled with the class of the examples.
• There are no more attributes to be selected, but the examples still do not belong to
the same class. In this case, the node is made a leaf node and labelled with the most
common class of the examples in the subset.
• There are no examples in the subset, which happens when no example in the parent
set was found to match a specific value of the selected attribute. An example could be
the absence of a person among the population with age over 100 years. Then a leaf
node is created and labelled with the most common class of the examples in the
parent node's set.
C 4.5 – Information gain ratio
• A notable problem occurs when information gain is applied to attributes that can take
on a large number of distinct values. (e.g. a customer ID / overfitting)
•Information gain ratio
• Intrinsic value (a penalty for splitting into many subsets: the entropy of the split itself)
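The gain ratio divides information gain by the intrinsic value, the entropy of the split sizes themselves, so a customer-ID-like attribute is penalized (an illustrative sketch; function names are mine):

```python
from collections import Counter
from math import log2

def entropy_of_counts(counts):
    n = sum(counts)
    return -sum((c / n) * log2(c / n) for c in counts if c)

def gain_ratio(parent_labels, children_labels):
    """C4.5 gain ratio: information gain divided by the intrinsic value
    (the entropy of the subset sizes produced by the split)."""
    n = len(parent_labels)
    gain = entropy_of_counts(Counter(parent_labels).values()) - sum(
        len(ch) / n * entropy_of_counts(Counter(ch).values())
        for ch in children_labels)
    intrinsic = entropy_of_counts([len(ch) for ch in children_labels])
    return gain / intrinsic if intrinsic else 0.0

labels = ["a", "a", "b", "b"]
# a customer-ID-like attribute: every record gets its own branch
print(gain_ratio(labels, [["a"], ["a"], ["b"], ["b"]]))  # gain 1 bit / intrinsic 2 bits -> 0.5
# a sensible binary split achieves the same gain with less splitting
print(gain_ratio(labels, [["a", "a"], ["b", "b"]]))      # gain 1 bit / intrinsic 1 bit -> 1.0
```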
C 4.5 – Improvements from the ID3 algorithm
C 4.5 – Handling continuous attributes (mid-point)
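The mid-point idea: candidate thresholds for a continuous attribute are the midpoints between consecutive distinct sorted values (a sketch, not from the slides):

```python
def midpoint_candidates(values):
    """C4.5-style candidate thresholds for a continuous attribute:
    midpoints between consecutive distinct sorted values."""
    distinct = sorted(set(values))
    return [(a + b) / 2 for a, b in zip(distinct, distinct[1:])]

print(midpoint_candidates([3.0, 1.0, 2.0, 2.0]))  # -> [1.5, 2.5]
```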
Pruning
•Pre-pruning / Post-pruning
•Reduced error pruning
• Prune a subtree if removing it makes no difference in performance
•Cost complexity pruning
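Cost complexity pruning scores each candidate subtree by training error plus a per-leaf penalty, R_α(T) = R(T) + α·|leaves| (an illustrative sketch of the criterion, with made-up numbers):

```python
def cost_complexity(misclassified, n_samples, n_leaves, alpha):
    """Penalized cost R_alpha(T) = R(T) + alpha * |leaves|:
    training error plus a complexity penalty per leaf."""
    return misclassified / n_samples + alpha * n_leaves

# a big tree that fits training data perfectly vs. a pruned one
big = cost_complexity(misclassified=0, n_samples=100, n_leaves=20, alpha=0.01)
small = cost_complexity(misclassified=5, n_samples=100, n_leaves=4, alpha=0.01)
print(big, small)  # the pruned tree wins under the penalty
```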
C 5.0
CART
• Basic concept is similar to C 4.5
• Supports regression
• Binary split
• Choose attribute recursively
• Classification – Gini impurity / Regression – Variance reduction
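One CART step can be sketched by combining the midpoint candidates with the Gini criterion: scan thresholds of a numeric feature and keep the binary split with the lowest weighted impurity (an illustrative sketch; names are mine):

```python
from collections import Counter

def gini(labels):
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_binary_split(xs, ys):
    """One CART step: scan midpoint thresholds of a numeric feature and
    return (threshold, score) minimizing the weighted Gini impurity."""
    best = (None, float("inf"))
    distinct = sorted(set(xs))
    for a, b in zip(distinct, distinct[1:]):
        t = (a + b) / 2
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

xs = [1.0, 2.0, 3.0, 4.0]
ys = ["a", "a", "b", "b"]
print(best_binary_split(xs, ys))  # threshold 2.5 separates the classes -> (2.5, 0.0)
```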
Ensemble
Bootstrap Aggregation (Bagging)
◦ Random Forest
Boosting
◦ AdaBoost
◦ Gradient Boosting
Bootstrap Aggregation (Bagging)
• Bootstrapping
• Given a standard training set D of size n, bagging generates m new training sets Di, each of
size n′, by sampling from D uniformly and with replacement. By sampling with replacement,
some observations may be repeated in each Di. If n′=n, then for large n the set Di is
expected to have the fraction (1 - 1/e) (≈63.2%) of the unique examples of D, the rest being
duplicates.
• Aggregation
• This kind of sample is known as a bootstrap sample. Then, m models are fitted using the
above m bootstrap samples and combined by averaging the output (for regression) or
voting (for classification)
lim_{n→∞} 1 − (1 − 1/n)^n = 1 − 1/e ≈ 0.632
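The ≈63.2% unique fraction is easy to check empirically by drawing one bootstrap sample with replacement (a quick simulation, not from the slides):

```python
import random

random.seed(0)
n = 100_000
# one bootstrap sample of size n drawn uniformly with replacement from n items
sample = [random.randrange(n) for _ in range(n)]
unique_fraction = len(set(sample)) / n
print(unique_fraction)  # close to 1 - 1/e ~= 0.632
```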
Random Forest – random subspace
• Random forests differ in only one way from this general scheme: they use a modified
tree learning algorithm that selects, at each candidate split in the learning process, a
random subset of the features. This process is sometimes called "feature bagging".
• The reason for doing this is the correlation of the trees in an ordinary bootstrap
sample: if one or a few features are very strong predictors for the response variable
(target output), these features will be selected in many of the B trees, causing them to
become correlated.
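Feature bagging itself is a one-liner: at every candidate split, only a random subset of the features is considered (a sketch; the feature names and subset size are illustrative, with m ≈ √p being a common default for classification):

```python
import random

def candidate_features(all_features, m, rng):
    """Feature bagging: at each candidate split, consider only a random
    subset of m features, decorrelating the trees in the forest."""
    return rng.sample(all_features, m)

rng = random.Random(42)
features = ["f1", "f2", "f3", "f4", "f5", "f6", "f7", "f8", "f9"]
for _ in range(3):  # a different subset at every split
    print(candidate_features(features, 3, rng))
```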
Extra-Trees
• Its two main differences with other tree based ensemble methods are that it splits
nodes by choosing cut-points fully at random and that it uses the whole learning
sample (rather than a bootstrap replica) to grow the trees.
• n_min: the minimum sample size for splitting a node
• Reduces computational cost while keeping comparable performance
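The cost saving comes from replacing the threshold search with a single random draw per feature (a sketch of the random cut-point step only):

```python
import random

def random_cut_point(values, rng):
    """Extra-Trees split: draw the threshold uniformly at random between the
    feature's min and max, instead of searching all midpoint candidates."""
    lo, hi = min(values), max(values)
    return rng.uniform(lo, hi)

print(random_cut_point([1.0, 4.0, 2.5, 3.0], random.Random(7)))  # some value in [1.0, 4.0]
```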
Random Forest – Variable importance
Boosting
Boosting algorithms consist of iteratively learning weak classifiers with respect to a
distribution and adding them to a final strong classifier. When they are added, they
are typically weighted in some way that is usually related to the weak learners'
accuracy
AdaBoost
AdaBoost Algorithm
AdaBoost
https://www.youtube.com/watch?v=LsK-xG1cLYA
Decision Stump (Weak learner)
AdaBoost
error↓ → α↑
Logit function
Degenerate when the error is 0 or 1
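The learner weight is a scaled log-odds of the weighted error, which is why it diverges at error 0 or 1. A sketch using the original AdaBoost form α = ½ ln((1 − ε)/ε); PRML's Eq. 14.17, referenced later in these slides, drops the ½, which only rescales α:

```python
from math import log

def adaboost_alpha(error):
    """AdaBoost learner weight: alpha = 0.5 * ln((1 - error) / error).
    Lower error -> larger alpha; undefined at error = 0 or 1."""
    return 0.5 * log((1.0 - error) / error)

for eps in (0.4, 0.25, 0.1):
    print(eps, adaboost_alpha(eps))  # lower error -> larger weight
```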
AdaBoost
AdaBoost
AdaBoost
Weighted Gini Index
Bootstrapping
AdaBoost
Exponential error
AdaBoost
Assume the first m−1 weak learners are already built and the m-th weak learner is being added
Optimization is carried out only over α_m and y_m
T_m: points correctly classified by y_m / M_m: points misclassified by y_m
AdaBoost
Minimize 14.23 with respect to y_m, treating α_m as a constant / derives Eq. 14.15
AdaBoost
Minimize 14.23 with respect to α_m / derives Eq. 14.17
https://en.wikipedia.org/wiki/AdaBoost
AdaBoost
The factor exp(−α_m/2) is independent of n and can therefore be dropped
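One reweighting round of the derivation above can be sketched as code, using the PRML form α = ln((1 − ε)/ε) and the update w ← w·exp(α·I(misclassified)) of Eq. 14.18 (an illustrative sketch with made-up weights):

```python
from math import exp, log

def adaboost_round(weights, mispredicted):
    """One AdaBoost reweighting step: compute the weighted error, the learner
    weight alpha, scale up the misclassified points, and renormalize."""
    total = sum(weights)
    err = sum(w for w, m in zip(weights, mispredicted) if m) / total
    alpha = log((1 - err) / err)
    new = [w * exp(alpha) if m else w for w, m in zip(weights, mispredicted)]
    s = sum(new)
    return alpha, [w / s for w in new]

w0 = [0.25] * 4
alpha, w1 = adaboost_round(w0, [False, False, False, True])
print(alpha, w1)  # the single misclassified point ends up with half the total weight
```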
Gradient Boosting
Gradient Boosting Algorithm
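For squared loss, the algorithm reduces to repeatedly fitting a weak learner to the current residuals (the negative gradient) and adding it with a learning rate. A self-contained sketch with regression stumps (all names are mine, not from the slides):

```python
def fit_stump(xs, residuals):
    """Best single-threshold regression stump on the residuals (squared error)."""
    best = None
    distinct = sorted(set(xs))
    for a, b in zip(distinct, distinct[1:]):
        t = (a + b) / 2
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        lv, rv = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - lv) ** 2 for r in left) + sum((r - rv) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, lv, rv)
    _, t, lv, rv = best
    return lambda x: lv if x <= t else rv

def predict(x, f0, stumps, lr=0.5):
    return f0 + lr * sum(s(x) for s in stumps)

def gradient_boost(xs, ys, rounds=20, lr=0.5):
    """Gradient boosting for squared loss: each round fits a stump to the
    current residuals and adds it to the ensemble, scaled by lr."""
    f0 = sum(ys) / len(ys)          # initial constant model
    stumps = []
    for _ in range(rounds):
        preds = [predict(x, f0, stumps, lr) for x in xs]
        residuals = [y - p for y, p in zip(ys, preds)]  # negative gradient
        stumps.append(fit_stump(xs, residuals))
    return f0, stumps

xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.0, 1.0, 3.0, 3.0]
f0, stumps = gradient_boost(xs, ys)
print([round(predict(x, f0, stumps), 2) for x in xs])  # approaches [1, 1, 3, 3]
```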
Beyond this…
Xgboost
LightGBM
Catboost
Optimal decision tree