3. Supervised Learning

The material on this page is now updated for 2021, and a summary of the zoom session is included below.

 

Learning Goals: You should understand the basics of bagging and how to train random forests. You should also be able to explain the kernel trick and how it can be used to generalize many algorithms, such as ridge regression. You should be able to run and modify basic MATLAB and Python (scikit-learn) code for these algorithms.
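As a minimal sketch of the first learning goal (this is an illustration, not the course's own code), a random forest can be trained and compared with a single decision tree in a few lines of scikit-learn; the synthetic dataset and all parameter choices here are assumptions for the example:

```python
# Illustrative sketch: random forest vs. a single decision tree.
# Dataset and hyperparameters are arbitrary choices for this example.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One tree, trained on the full training set.
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# A random forest: 100 trees, each trained on a bootstrap sample
# (bagging) with a random feature subset considered at every split.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

print("single tree accuracy:", tree.score(X_test, y_test))
print("random forest accuracy:", forest.score(X_test, y_test))
```

Averaging over many decorrelated trees typically reduces variance compared to the single tree, which is the point of bagging.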

Lecture 3 slides

Lecture 3 video:

Also available on YouTube: https://youtu.be/4nxBGiJpn_I

Reading Assignment (this refers to the book Supervised Machine Learning by Lindholm et al.):

  • Ch 7.1-7.2 Bagging, Random Forests
  • Ch 8 Nonlinear Input Transformations and Kernels
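To connect the Ch 8 reading to the kernel trick mentioned in the learning goals, here is a hedged sketch (not taken from the book or lecture code) of kernel ridge regression: the ridge solution is rewritten in its dual form so that inputs appear only through inner products, which are then replaced by an RBF kernel. The kernel length scale, regularization strength, and toy data are all assumptions for the example:

```python
# Illustrative sketch of the kernel trick applied to ridge regression.
# All hyperparameters and data below are arbitrary example choices.
import numpy as np

def rbf_kernel(A, B, ell=0.5):
    # k(x, x') = exp(-||x - x'||^2 / (2 ell^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ell ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))           # 40 noisy samples of sin(x)
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

# Dual (kernelized) ridge solution: alpha = (K + lambda I)^{-1} y.
lam = 1e-2
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

# Predictions need only kernel evaluations against the training points.
X_test = np.linspace(-3, 3, 100)[:, None]
y_pred = rbf_kernel(X_test, X) @ alpha
```

The same substitution of inner products by kernel evaluations is what generalizes many linear algorithms to nonlinear ones, which is the theme of the chapter.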

Code used in the lecture:

A correction: At the end of the lecture, it should say that you can start with Exercise 3 now (you need to wait with Exercise 4 until after you have studied Lecture 4).

Zoom lecture 2021-09-13 summary: Here are the Lecture2quiz and Lecture3quiz. If you were not at the meeting, you can see there what we discussed. We also ran and discussed the *.ipynb code available for Lecture 2 and Lecture 3. I also showed, and recommend you have a look at, these Data Science Cheat-sheets, which may be useful for your coding work. Finally, I briefly introduced Lab 1, the music taste prediction competition.

Boosting
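The heading above only names boosting; as a minimal illustration (an assumption on my part, not the course's code), here is AdaBoost with its default shallow-tree base learners in scikit-learn, where each successive learner is fit to data reweighted toward the previous learners' mistakes:

```python
# Illustrative sketch of boosting via AdaBoost.
# Dataset and parameters are arbitrary example choices.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# 50 weak learners, each trained on reweighted data that emphasizes
# the examples misclassified by the ensemble so far.
clf = AdaBoostClassifier(n_estimators=50, random_state=1).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```

In contrast to bagging, which trains learners independently and averages them to reduce variance, boosting trains them sequentially and mainly reduces bias.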

Additional material on supervised learning is available here.