18. What about
Boosted Trees?
Instead of averaging contributions
across trees, we just have to
sum them.
Available with:
● ELI5 (supports e.g. XGBoost, LightGBM)
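The difference can be sketched in a few lines of plain Python. This is an illustrative toy, not ELI5's or treeinterpreter's actual API: each tree is assumed to yield a `(bias, {feature: contribution})` pair for one sample, and we combine them by averaging (random forest) or summing (boosted trees).

```python
# Toy sketch of combining per-tree feature contributions.
# Each tree's explanation is a hypothetical (bias, {feature: value}) pair
# describing one sample's decision path.

def combine_contributions(tree_explanations, boosted=False):
    """Average contributions for a random forest, sum them for boosting."""
    n = len(tree_explanations)
    total_bias = sum(bias for bias, _ in tree_explanations)
    totals = {}
    for _, contribs in tree_explanations:
        for feat, val in contribs.items():
            totals[feat] = totals.get(feat, 0.0) + val
    if not boosted:
        # Random forest prediction is the mean of per-tree predictions,
        # so the bias and each contribution get divided by the tree count.
        total_bias /= n
        totals = {f: v / n for f, v in totals.items()}
    return total_bias, totals

# Two toy trees explaining the same sample.
trees = [(0.5, {"x1": 0.2, "x2": -0.1}),
         (0.5, {"x1": 0.4, "x2": 0.0})]

rf_bias, rf_contribs = combine_contributions(trees, boosted=False)
gb_bias, gb_contribs = combine_contributions(trees, boosted=True)
```

In both cases the property that makes the decomposition useful still holds: bias plus the summed contributions equals the ensemble's prediction for that sample.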
27. More use cases
● Learn the right features (dogs
vs. wolves)
● Understand whether the model
overfits a particular feature
● Identify data leakage
● Detect dataset shift (training data
differs from test data)
● Pneumonia/asthma case
Amazon, Netflix
28. ● Not only useful when things aren't working
● Different costs for making mistakes
29. References
Interpreting Random Forests
Random forest interpretation with scikit-learn
Random forest interpretation – conditional feature contributions
Interpreting Decision Trees and Random Forests
XGBoost Decision Paths
Explaining XGBoost predictions on the Titanic dataset
“Why Should I Trust You?” Explaining the Predictions of Any Classifier