2. Supervised Learning
2021: This page has now been updated.
Learning Goals: You should know the concepts and basic mathematics of logistic regression and how it can be used for classification. You should be familiar with different classification metrics, such as ROC curves and AUC. You should also know how to train decision trees. You should be able to run and modify basic MATLAB and Python (scikit-learn) code for such algorithms.
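As a concrete illustration of these goals, here is a minimal scikit-learn sketch (not one of the lecture notebooks; the breast cancer data set and all settings are chosen only for illustration) that trains a logistic regression classifier and evaluates it with ROC and AUC:

```python
# Minimal sketch: logistic regression with ROC/AUC evaluation in scikit-learn.
# The data set and hyperparameters are illustrative, not from the lecture.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)

# ROC/AUC are computed from the predicted probability of the positive class,
# not from the hard 0/1 predictions.
scores = clf.predict_proba(X_test)[:, 1]
fpr, tpr, thresholds = roc_curve(y_test, scores)  # fpr vs tpr traces out the ROC curve
print("AUC:", roc_auc_score(y_test, scores))
```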
Lecture 2 slides
Lecture 2 video:
Alternative YouTube link: https://youtu.be/zG4YiGi0HEQ
Reading Assignment (this refers to the book Supervised Machine Learning by Lindholm et al.):
- Ch 2.3 Decision Trees
- Ch 3.2 Classification and Logistic Regression
- Ch 4.5 Evaluation for Classification Problems
Code used in the lecture (I realise the file names are illogical; hope you can live with it...); a minimal sketch of the decision tree examples follows the list:
- iris_setosa.ipynb (logistic regression on Iris data, binary case)
- iris_multiclass.ipynb (logistic regression, multiclass case)
- decisiontree_iris.ipynb (decision trees on Iris data)
- decisiontree_cosinus.ipynb (decision tree regression)
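In the spirit of the two decision tree notebooks, here is a minimal scikit-learn sketch (it does not reproduce the notebooks; the depth limits and the noisy cosine data are assumptions for illustration) showing a tree classifier on Iris and a tree regressor on a cosine:

```python
# Minimal sketch: decision tree classification on Iris and decision tree
# regression on a noisy cosine. Hyperparameters are illustrative only.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

# Classification: Iris with a depth-limited tree.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
tree_clf = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print("Iris test accuracy:", tree_clf.score(X_test, y_test))

# Regression: the tree produces a piecewise-constant fit to a noisy cosine.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 200)).reshape(-1, 1)
t = np.cos(x).ravel() + 0.1 * rng.standard_normal(200)
tree_reg = DecisionTreeRegressor(max_depth=4).fit(x, t)
print("Predictions at the first five points:", tree_reg.predict(x[:5]))
```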
Zoom session: In the lecture Zoom session on Sep 7 (not recorded) we did a Kahoot quiz on the Lecture 1 material, which was won by Anton (congrats, your coffee mug can be picked up later), closely followed by Davida and Joey. We then went through the material that people had the most problems with according to the quiz.
To illustrate the linear regression method and the power of LASSO regularization, I then showed two additional notebooks (a simplified recovery sketch follows the list):
- One on compressed sensing, recovering a very sparsely sampled audio signal. Code: compressedsensing.ipynb
- One on face recognition of 20 persons through matching against a training set of 600 prerecorded images. Code: faces_compressedsensing.ipynb
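To show roughly how the LASSO does the recovery, here is a simplified compressed-sensing sketch (not the notebook itself; the test signal, the DCT basis, and the alpha value are assumptions for illustration): a signal that is sparse in the DCT basis is reconstructed from about 10% of its samples.

```python
# Simplified compressed-sensing sketch: recover a DCT-sparse signal from a
# small random subset of its samples by solving a LASSO problem.
# Signal, basis, and alpha are illustrative, not taken from the notebook.
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import Lasso

n = 1024
t = np.arange(n)
# Test signal: a sum of two cosines, which is (nearly) sparse in the DCT basis.
signal = np.cos(2 * np.pi * 5 * t / n) + np.cos(2 * np.pi * 37 * t / n)

# Keep only ~10% of the samples, at random positions.
rng = np.random.default_rng(0)
keep = np.sort(rng.choice(n, size=n // 10, replace=False))

# Measurement matrix: rows of the inverse-DCT basis at the kept positions,
# so that A @ coeffs approximates the observed samples.
A = idct(np.eye(n), axis=0, norm="ortho")[keep, :]

# The L1 penalty pushes most DCT coefficients to exactly zero (sparsity).
lasso = Lasso(alpha=1e-3, fit_intercept=False, max_iter=10000)
lasso.fit(A, signal[keep])
recovered = idct(lasso.coef_, norm="ortho")

print("Nonzero DCT coefficients:", np.count_nonzero(lasso.coef_))
print("Relative reconstruction error:",
      np.linalg.norm(recovered - signal) / np.linalg.norm(signal))
```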