Section 2
Random Forests
6. Ensemble Methods: Machine Learning and Democracy
04:57 (Preview)
7. Random Forests: Decisions Don't Fall Far from the Decision Tree 📂
15:38
8. Random Forests out in the Wild 📂
36:15
9. Interpolation Through a Random Forest 📂
08:57
Section 3
Gradient Boosting
10. Give Yourself a Gradient Boost
07:01 (Preview)
11. Auto-Correction in a Forest of Stumps 📂
22:06
12. Gradient Boosting by Hand: Code Example 📂
15:55
13. XGBoost in the Wild 📂
14:41
14. Cross-Validate with the XGBoost API 📂
15:30
15. Conclusion, Certificate, and What Next?
05:07
1. Introducing Decision Trees
Decision Trees

In this course we shall climb the heights of a fundamental machine learning concept: the decision tree. Because it is the root stock of some of the most effective algorithms on the market, we take a look at the decision tree from the ground up. Applicable to both classification and regression problems, these simple tools build non-parametric models whose decisions can be inspected and interrogated directly.
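To make the "interrogating decisions" point concrete, here is a minimal sketch. It assumes scikit-learn, which is our choice of library for illustration rather than something this lesson prescribes: fit a shallow tree and print the human-readable rules it learned.

```python
# A minimal sketch, assuming scikit-learn is available; the course's own
# tooling may differ. Fit a shallow tree and print its learned rules.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(iris.data, iris.target)

# Every prediction can be traced back to a chain of if/else splits.
print(export_text(tree, feature_names=iris.feature_names))
```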

Tree Thinking

We shall cover:

  • A heuristic demonstration of decision tree logic.
  • Tree terminology: roots, leaves, and splitting.
  • The machine learning approach.
  • Simple examples: classification and regression (see the sketch below).
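As a preview of those simple examples, here is a minimal sketch, again assuming scikit-learn and using hypothetical toy data chosen purely for illustration: the same family of estimators handles both a classification and a regression problem.

```python
# A minimal sketch with made-up toy data; not the course's own example.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

# Classification: decide whether to play outside from temperature (°C) and humidity (%).
X_clf = np.array([[30, 85], [27, 90], [21, 70], [18, 65], [24, 75]])
y_clf = ["stay in", "stay in", "play", "play", "play"]
clf = DecisionTreeClassifier(max_depth=2).fit(X_clf, y_clf)
print(clf.predict([[22, 68]]))  # a cool, dry day falls in the 'play' leaf

# Regression: predict price (in thousands) from floor area (m^2).
X_reg = np.array([[50], [70], [90], [120], [150]])
y_reg = [150, 200, 260, 330, 400]
reg = DecisionTreeRegressor(max_depth=2).fit(X_reg, y_reg)
print(reg.predict([[100]]))  # a piecewise-constant estimate between training points
```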
Next Lesson
2. Decision Trees in Everyday Thinking