Lecture 2. Supervised Learning 2
Learning Goals: You should know the concepts and basic mathematics of logistic regression and how it can be used for classification. You should be familiar with different classification metrics, such as ROC and AUC. You should also know how to train decision trees. You should be able to run and modify basic MATLAB and Python (scikit-learn) code for such algorithms.
Lecture 2 slides
Lecture 2 video:
Alternative YouTube link: https://youtu.be/zG4YiGi0HEQ
Reading Assignment (from the book Supervised Machine Learning by Lindholm et al.):
- Ch 2.3 Decision Trees
- Ch 3.2 Classification and Logistic Regression
- Ch 4.5 Evaluation for Classification Problems
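To connect the evaluation chapter with the code used in the course, here is a minimal sketch (an assumed setup, not the lecture's own code) of how ROC and AUC are computed with scikit-learn for a binary classifier on the Iris data:

```python
# Sketch: ROC curve and AUC for a binary logistic regression classifier.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
y = (y == 0).astype(int)  # binary task: setosa vs. the rest

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = LogisticRegression().fit(X_train, y_train)
scores = clf.predict_proba(X_test)[:, 1]  # probability of the positive class

# ROC sweeps the decision threshold; AUC summarizes the curve in one number
fpr, tpr, thresholds = roc_curve(y_test, scores)
auc = roc_auc_score(y_test, scores)
print(f"AUC = {auc:.3f}")
```

Note that the ROC curve is built from the predicted probabilities (scores), not from the hard 0/1 predictions, since it describes performance across all decision thresholds.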
Code used in the lecture: (I realise the file names are illogical; hope you can live with it...)
- iris_setosa.ipynb (logistic regression on Iris data, binary case)
- iris_multiclass.ipynb (logistic regression, multiclass case)
- decisiontree_iris.ipynb (decision trees on Iris data)
- decisiontree_cosinus.ipynb (decision tree regression)
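For reference, a minimal sketch in the spirit of these notebooks (assumed code, not the notebooks' exact contents), covering multiclass logistic regression, decision tree classification, and decision tree regression:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

# Classification on the Iris data
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Multiclass logistic regression (softmax over the three species)
logreg = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("logistic regression accuracy:", logreg.score(X_test, y_test))

# Decision tree; limiting max_depth keeps the tree small and reduces overfitting
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print("decision tree accuracy:", tree.score(X_test, y_test))

# Decision tree regression on a noisy cosine (as in the regression notebook):
# the tree fits a piecewise-constant approximation of the curve
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
t = np.cos(x).ravel() + 0.1 * rng.standard_normal(200)
reg = DecisionTreeRegressor(max_depth=4).fit(x, t)
```

Try varying `max_depth` in the two tree models: a deeper tree fits the training data more closely but generalizes worse, which is the central trade-off discussed in Ch 2.3.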
-------------------------------------------------------------------------------------------
Lecture session plan:
Will be updated after the lecture!
Mentimeter quiz on Lecture 1
Q&A + recap of the muddiest points from Lecture 1.
To illustrate the linear regression method and the power of LASSO regularization, I show two additional notebooks:
- One on face recognition of 20 persons, matching against a training set of 600 prerecorded images. Code: faces_compressedsensing.ipynb
- One on compressed sensing, recovering a very sparsely sampled audio signal. Code: compressedsensing.ipynb
Possible bonus: Initial recap of Lecture 2 (if time permits)