Random Forests
A single decision tree can overfit. The solution? Use many trees and let them vote. That's the idea behind Random Forests.
The Wisdom of Crowds
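The "wisdom of crowds" effect can be simulated directly: if each of many independent classifiers is right only 60% of the time, a majority vote among them is right far more often. A minimal sketch using only the standard library (the 60% accuracy, 101 voters, and trial count are illustrative choices, not values from this lesson):

```python
import random

random.seed(0)

N_CLASSIFIERS = 101   # odd, so a majority vote never ties
P_CORRECT = 0.6       # each weak classifier is right 60% of the time
N_TRIALS = 10_000

ensemble_correct = 0
single_correct = 0
for _ in range(N_TRIALS):
    # Each classifier votes independently; count the correct votes
    votes = sum(random.random() < P_CORRECT for _ in range(N_CLASSIFIERS))
    if votes > N_CLASSIFIERS // 2:
        ensemble_correct += 1
    # Compare against one classifier acting alone
    if random.random() < P_CORRECT:
        single_correct += 1

print(f"single classifier: {single_correct / N_TRIALS:.3f}")
print(f"majority vote:     {ensemble_correct / N_TRIALS:.3f}")
```

The vote only helps because the classifiers err independently; the same idea is why Random Forests work hard to make their trees different from one another.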
How Random Forests Work
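A Random Forest trains each tree on a bootstrap sample of the rows, restricts each split to a random subset of the features, and combines the trees by majority vote. A rough from-scratch sketch of that loop, assuming scikit-learn and NumPy are available (the dataset, tree count, and seeds are arbitrary):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Hand-rolled forest: each tree sees a bootstrap sample of the rows
# and considers only a random subset of features at each split.
trees = []
for _ in range(25):
    idx = rng.integers(0, len(X), size=len(X))   # bootstrap: sample rows with replacement
    tree = DecisionTreeClassifier(max_features="sqrt",
                                  random_state=int(rng.integers(10**6)))
    tree.fit(X[idx], y[idx])
    trees.append(tree)

# Predict by majority vote across the trees
votes = np.stack([t.predict(X) for t in trees])
pred = (votes.mean(axis=0) > 0.5).astype(int)
print(f"accuracy of the majority vote: {(pred == y).mean():.3f}")
```

In practice you would use `sklearn.ensemble.RandomForestClassifier`, which does exactly this (plus refinements) in one call.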
Bagging: Bootstrap Aggregating
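Bagging draws n points with replacement from an n-point dataset, so every tree trains on a different view of the data. On average only about 63.2% (roughly 1 - 1/e) of the original points appear in any one bootstrap sample; the rest are "out-of-bag". A quick standard-library check of that figure (dataset size is arbitrary):

```python
import random

random.seed(42)
n = 10_000
data = list(range(n))

# Bootstrap sample: draw n points WITH replacement
bootstrap = [random.choice(data) for _ in range(n)]

# Fraction of distinct original points that made it into the sample
unique_frac = len(set(bootstrap)) / n
print(f"unique fraction: {unique_frac:.3f}")
```

The left-out points are a free validation set for each tree, which is what "out-of-bag error" estimates exploit.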
Feature Randomness
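Row sampling alone is not enough: if one feature is very strong, every tree would still split on it first and the trees would look alike. So each split also considers only a random subset of the features; a common default for classification is the square root of the feature count. A toy illustration using the standard library (the feature names are made up):

```python
import math
import random

random.seed(1)
features = [f"f{i}" for i in range(16)]
k = int(math.sqrt(len(features)))   # common default: sqrt(n_features) candidates per split

# At each split, a fresh random subset of features is eligible
for depth in range(3):
    candidates = random.sample(features, k)
    print(f"split {depth}: choose the best of {candidates}")
```

Because different trees are forced to split on different features, their errors decorrelate, which is what makes the vote effective.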
Random Forest vs Single Tree
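One way to see the difference is to fit both models on the same synthetic dataset and compare held-out accuracy. This sketch assumes scikit-learn is installed; the dataset parameters, split, and tree count are arbitrary choices for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Noisy dataset: 20 features, only 5 of them informative
X, y = make_classification(n_samples=1000, n_features=20, n_informative=5,
                           random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

tree = DecisionTreeClassifier(random_state=42).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_tr, y_tr)

print(f"single tree:   {tree.score(X_te, y_te):.3f}")
print(f"random forest: {forest.score(X_te, y_te):.3f}")
```

On noisy data like this, the forest's averaged vote typically generalizes better than a single fully grown tree, which memorizes noise in the training set.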
Feature Importance
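A trained forest exposes per-feature importance scores (in scikit-learn, the mean impurity decrease each feature contributes across all trees), which sum to 1. A short example on the classic Iris dataset, assuming scikit-learn is installed:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

iris = load_iris()
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(iris.data, iris.target)

# Rank features by their importance score, highest first
ranked = sorted(zip(iris.feature_names, forest.feature_importances_),
                key=lambda pair: pair[1], reverse=True)
for name, score in ranked:
    print(f"{name:20s} {score:.3f}")
```

Note that impurity-based importances can overstate high-cardinality features; scikit-learn's permutation importance is a more robust alternative when that matters.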
Key Takeaways
- Random Forests = many decision trees voting together
- Bagging: each tree trains on a random bootstrap sample of the data
- Feature randomness: each split considers a random subset of the features
- Diversity + aggregation = reduced overfitting
- Provides feature importance rankings
- Generally more accurate than single trees
- Trade-off: less interpretable than single tree
Next, we'll learn how to properly evaluate our models!