Section 2: Support Vector Machines

7. Introducing: Support Vector Machines (07:44, Preview)
8. Support Vector Machines to Maximise Decision Margins (25:06)
9. A Code Walkthrough for SVMs (32:55)
10. Overlapping Classes and Kernel SVMs (21:06)
11. Experimenting with Overlapping Class Distributions (25:33)
12. Using Kernel SVMs for Non-Linear Predictions (11:36)
13. Support Vector Machines in the Wild (17:16)
14. Solving Regression Problems with SVMs (22:37)
15. Comparing Least-Squares with SVM Regression (56:07)
16. Conclusion, Certificate, and What Next? (04:39)
1. Don't Be Fooled by the Kernel Trick
The Kernel Trick

Welcome to the first section of the course. Our goal for this section is to start off with a basic linear model and apply the kernel trick to derive an advanced non-parametric analogue.

We shall cover:

  • Projecting data into higher dimensions to obtain linear separability
  • Using the kernel trick to derive the Kernel Ridge Regression (KRR) model (see the sketch after this list)
  • Experimenting with KRR models on different data sets
  • Choosing suitable hyperparameters and deploying the model
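
As a taste of where the section is headed, here is a minimal sketch of a Kernel Ridge Regression fit using scikit-learn's KernelRidge. The toy data set and the hyperparameter values (alpha, gamma) are illustrative assumptions, not the course's own examples; the lessons that follow derive and tune the model properly.

```python
# A minimal Kernel Ridge Regression sketch. The toy data and the
# hyperparameter values below are illustrative assumptions, not the
# course's own examples.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(seed=42)

# Toy 1-D regression problem: a noisy sine wave.
X = rng.uniform(-3.0, 3.0, size=(100, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(100)

# The RBF kernel implicitly projects the inputs into a much
# higher-dimensional feature space; the kernel trick means we only
# ever compute pairwise similarities k(x, x'), never the projection.
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5)
model.fit(X, y)

# Predictions are kernel-weighted combinations of the training points,
# which is what makes the model non-parametric.
X_test = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)
y_pred = model.predict(X_test)
print(y_pred[:5])
```

Notice that the code never materialises the high-dimensional features: swapping the kernel argument swaps the implicit projection, which is exactly the flexibility the kernel trick buys us.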
Next Lesson: 2. Projecting Data Features into Higher Dimensions