2. Contents
•Introduction to decision trees
•What is a decision tree stump?
•CART vs. CHAID
•Criterion for splitting
•Building a decision tree stump macro
•Linking the tree up
•Conclusion
5. CART vs. CHAID
•Easier-to-understand splits
oBinary splits are simpler to interpret
oEach split can be phrased as an either/or statement
•Able to handle different data types
oCART can handle nominal, continuous, and
missing values simultaneously, unlike CHAID.
6. CART vs. CHAID
•More robust statistics
oCHAID relies on the chi-square test, which is sample-size
dependent and suffers from the multiple-comparisons problem.
oThe Bonferroni adjustment does not fully compensate for this
deficiency.
•Fewer dispersion effects
oMultiple splits at a single node result in smaller
subsequent nodes, which may cause severe skewness in
validation.
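The sample-size dependence of the chi-square statistic can be sketched in a few lines of Python (the numbers are illustrative, not from the deck): the same class proportions produce a statistic ten times larger when the sample is ten times bigger, so the same split looks far "more significant" on larger nodes.

```python
# Illustrative sketch: Pearson chi-square statistic for a 2x2 table,
# showing that identical proportions at 10x the sample size yield
# 10x the statistic (size dependence of CHAID's splitting test).

def chi_square(table):
    """Pearson chi-square statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

small = [[30, 70], [50, 50]]        # n = 200
large = [[300, 700], [500, 500]]    # n = 2000, same proportions
print(chi_square(small))            # 8.333...
print(chi_square(large))            # 83.333..., ten times larger
```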
7. Splitting criterion
•Gini impurity measures how often a randomly
chosen element from a set would be incorrectly
labeled if it were labeled randomly according
to the distribution of labels in the subset.
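The definition above translates directly to the formula G = 1 − Σ p_k², where p_k is the fraction of elements with label k. A minimal sketch in Python (function name and sample labels are illustrative):

```python
from collections import Counter

def gini_impurity(labels):
    """Gini impurity: the probability that a randomly drawn element is
    mislabeled when labels are assigned at random according to the
    label distribution, i.e. G = 1 - sum(p_k ** 2)."""
    n = len(labels)
    counts = Counter(labels)
    return 1.0 - sum((count / n) ** 2 for count in counts.values())

print(gini_impurity(["a"] * 10))              # 0.0 — pure node
print(gini_impurity(["a"] * 5 + ["b"] * 5))   # 0.5 — maximally mixed two-class node
```

A pure node scores 0, and a two-class node split 50/50 scores 0.5, the maximum for two labels; a splitting criterion based on Gini impurity picks the split that reduces this value the most.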