4. Supervised Learning (and some unsupervised)

2021: This page has been updated and is ready to go.

Learning Goals: You should have heard about support vector machines. You should understand Bayesian learning and be able to use Bayes' formula. You should understand the idea of Gaussian mixture models and how they are used in the LDA and QDA algorithms. You should be able to describe how one can learn from unlabeled data and how the EM and k-means algorithms work. You should be able to explain the singular value decomposition (SVD) and how it relates to principal component analysis (PCA).
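As a small illustration of the last learning goal, here is a minimal NumPy sketch (not part of the course code; the toy data is made up) showing the SVD/PCA connection: the principal directions of a centered data matrix are its right singular vectors, and the explained variances are the squared singular values divided by n - 1.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))   # toy data: 200 samples, 5 features

# PCA is defined on the centered data matrix.
Xc = X - X.mean(axis=0)
n = Xc.shape[0]

# SVD of the centered data: Xc = U @ diag(S) @ Vt.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# PCA via the eigendecomposition of the sample covariance matrix.
C = Xc.T @ Xc / (n - 1)
eigvals, eigvecs = np.linalg.eigh(C)                # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # make descending

# The principal directions are the rows of Vt (up to sign), and the
# variances along them are S**2 / (n - 1).
print(np.allclose(S**2 / (n - 1), eigvals))         # True
print(all(np.allclose(abs(Vt[i]), abs(eigvecs[:, i]))
          for i in range(5)))                       # True
```

This is also why PCA is usually computed via the SVD in practice: it avoids forming the covariance matrix explicitly.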

 

You can find the Lecture 4 slides here.

You can find the Lecture 4 video here,

or on YouTube here: https://youtu.be/C2nhmFWbNAY

Code used in the lecture

bayes.m

pcaMNIST.ipynb

kmeans.ipynb

mixedmodel.m

mixedmodelunsupervised.m

pcacancer.m

 

An additional video

What are 'A priori & A posteriori?' - Gentleman Thinker