Section 2
Support Vector Machines
7. Introducing: Support Vector Machines
07:44 (Preview)
8. Support Vector Machines to Maximise Decision Margins 📂
25:06
9. A Code Walkthrough for SVMs 📂
32:55
10. Overlapping Classes and Kernel SVMs 📂
21:06
11. Experimenting with Overlapping Class Distributions 📂
25:33
12. Using Kernel SVMs for Non-Linear Predictions 📂
11:36
13. Support Vector Machines in the Wild 📂
17:16
14. Solving Regression Problems with SVMs
22:37
15. Comparing Least-Squares with SVM Regression 📂
56:07
Section 3
Decision Trees
16. Introducing: Decision Trees
09:19 (Preview)
17. Decision Trees in Everyday Thinking 📂
20:29
18. Machine-Designed Decision Trees 📂
27:44
19. Classification Problems with Decision Trees: A Code Walkthrough 📂
25:55
20. Regression Problems with Decision Trees: A Code Walkthrough 📂
18:16
Section 4
Random Forests
21. Ensemble Methods: Machine Learning and Democracy
04:57 (Preview)
22. Random Forests: Decisions Don't Fall Far from the Decision Tree 📂
15:38
23. Random Forests out in the Wild 📂
36:15
24. Interpolation Through a Random Forest 📂
08:57
Section 5
Gradient Boosting
25. Give Yourself a Gradient Boost
07:01 (Preview)
26. Auto-Correction in a Forest of Stumps 📂
22:06
27. Gradient Boosting by Hand: Code Example 📂
15:55
28. XGBoost in the Wild 📂
14:41
29. Cross-Validate with the XGBoost API 📂
15:30
30. Conclusion, Certificate, and What Next?
05:52
25. Give Yourself a Gradient Boost
Gradient Boosting

Welcome to the fifth and final section of our course 'Machine Thinking', in which we introduce our last professional ML tool: Gradient Boosted Trees. A close relative of the Random Forest, gradient boosting zooms in on the convergence of an ensemble, using simpler weak learners: trees stripped down right to the stump. Each new stump is fitted to the errors the ensemble has made so far, so the model steadily corrects itself. What does this mean? A lightweight, fast and effective algorithm.
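To make that concrete before the lessons proper, here is a minimal from-scratch sketch of the idea, assuming a squared-error loss and borrowing scikit-learn's DecisionTreeRegressor (capped at depth 1) as the stump. The data and settings are illustrative only; Lesson 27 codes the algorithm in full:

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    # Illustrative synthetic data: a noisy sine wave.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

    learning_rate = 0.1
    stumps = []

    # Start from the mean; each stump then fits the current residuals,
    # which are the negative gradient of the squared-error loss.
    prediction = np.full_like(y, y.mean())
    for _ in range(100):
        residuals = y - prediction
        stump = DecisionTreeRegressor(max_depth=1)  # a tree stump
        stump.fit(X, residuals)
        prediction += learning_rate * stump.predict(X)
        stumps.append(stump)

    print("training RMSE:", np.sqrt(np.mean((y - prediction) ** 2)))

Note how the learning rate shrinks each stump's contribution: no single stump does much on its own, but a hundred small corrections add up.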

We take a deep dive into a popular API for this technique: XGBoost, standing for 'eXtreme Gradient Boosting'. It is quite possibly the most commonly deployed machine learning algorithm of all, known for its predictive ability, rich functionality and fine-tuning options.
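As a taste of what is to come, here is a minimal sketch using XGBoost's scikit-learn-style XGBRegressor interface. The synthetic data and hyperparameter values are illustrative only, not tuned recommendations:

    import numpy as np
    from xgboost import XGBRegressor

    # Illustrative synthetic data: a noisy sine wave.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(500, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=500)

    model = XGBRegressor(
        n_estimators=200,   # number of boosted trees in the ensemble
        max_depth=1,        # depth-1 trees, i.e. stumps
        learning_rate=0.1,  # shrinkage applied to each tree's contribution
    )
    model.fit(X, y)
    print(model.predict(X[:5]))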

Shining a Light on the Stumps

In this section we explore gradient boosting and XGBoost through the following:

  • Introduction to the technique of Gradient Boosted Trees.
  • Key mathematical construction for intuition.
  • Worked data example: coding the algorithm from scratch.
  • XGBoost in the wild: how does it fare on our prototype data?
  • XGBoost functionality and cross validation (see the sketch after this list).
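And as a preview of the final coding lesson, a hedged sketch of cross validation with XGBoost's native xgb.cv routine; as before, the data and parameter values are purely illustrative:

    import numpy as np
    import xgboost as xgb

    # Illustrative synthetic data: a noisy sine wave.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(500, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=500)

    dtrain = xgb.DMatrix(X, label=y)
    params = {"max_depth": 1, "eta": 0.1, "objective": "reg:squarederror"}

    cv_results = xgb.cv(
        params,
        dtrain,
        num_boost_round=200,
        nfold=5,                   # 5-fold cross validation
        metrics="rmse",
        early_stopping_rounds=10,  # stop once validation RMSE stalls
        seed=0,
    )
    print(cv_results.tail())  # mean train/test RMSE per boosting round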