Lecture 4. Supervised Learning 4 (and some unsupervised learning)
Learning Goals: After this lecture you should
- have heard about support vector machines,
- understand Bayesian learning and be able to use Bayes' formula,
- understand the idea of Gaussian mixture models and how they are used in the LDA and QDA algorithms,
- be able to describe how one can learn from unlabeled data and how the EM and k-means algorithms work,
- be able to explain the singular value decomposition (SVD) and how it relates to principal component analysis (PCA).
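As a small warm-up for the Bayesian-learning goal, here is a minimal numeric sketch of Bayes' formula in Python (the test/condition numbers below are invented purely for illustration):

```python
# Bayes' formula: P(A|B) = P(B|A) * P(A) / P(B)
# Hypothetical example: a test with 99% sensitivity and a 5% false-positive
# rate for a condition with 1% prior prevalence (all numbers invented).
p_a = 0.01                # prior P(A)
p_b_given_a = 0.99        # likelihood P(B|A)
p_b_given_not_a = 0.05    # false-positive rate P(B|not A)

# law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# posterior via Bayes' formula
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))
```

Note that even with a quite accurate test, the posterior is only about 1/6, because the prior is so small.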
You can find the Lecture 4 slides here
You can find the Lecture 4 video here
or on YouTube here: https://youtu.be/C2nhmFWbNAY
Code used in the lecture
svc_iris.ipynb
pcaMNIST.ipynb
kmeans.ipynb
mixedmodel.m
mixedmodelunsupervised.m
pcacancer.m
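If you just want the core idea behind k-means without opening the notebook, a minimal toy implementation (a sketch, not the code from kmeans.ipynb) alternates an assignment step and a centroid-update step:

```python
import numpy as np

def kmeans(X, k, n_iter=20, seed=0):
    """Minimal k-means: alternate assignment and centroid update."""
    rng = np.random.default_rng(seed)
    # initialize centroids as k randomly chosen data points
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assignment step: each point goes to its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # update step: each centroid becomes the mean of its points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# two well-separated toy blobs (synthetic data for illustration)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, size=(50, 2)),
               rng.normal(5.0, 0.3, size=(50, 2))])
labels, centers = kmeans(X, k=2)
```

On data like this, the two recovered centroids end up near the two blob means; real implementations add smarter initialization (e.g. k-means++) and a convergence check.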
The following is a very nice illustration of using SVD to find structure in data (in this case images).
https://colah.github.io/posts/2014-10-Visualizing-MNIST/
For a further illustration of the use of the SVD, I recommend the material on "eigenfaces" available under 1-4 Additional Material - Supervised Learning
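To make the connection between SVD and PCA concrete, here is a small NumPy sketch on toy data (an assumption-laden illustration, not the eigenfaces material): center the data, take the SVD of the centered matrix, and the right singular vectors are the principal directions.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy 2-D data, stretched much more along the x-axis than the y-axis
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0],
                                          [0.0, 0.5]])

# PCA via SVD: center the data, then X_c = U @ diag(S) @ Vt
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# rows of Vt are the principal directions;
# S**2 / (n - 1) are the variances along them
explained_var = S**2 / (len(X) - 1)

# project the data onto the first principal component
scores = Xc @ Vt[0]
```

Here the first principal direction comes out (up to sign) close to the x-axis, since that is where most of the variance was put, which is exactly the structure PCA is meant to find.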