Course syllabus

Welcome to the course pages for EDAN95 - Applied Machine Learning!

General information

The course started on Monday, November 1st. All places will be filled according to the rules regarding prerequisites and ranking (see the official course plan in Swedish or English).

Details about the course in HT2021 will be added here. For previous course runs, have a look at the course home page or the HT2020 syllabus page here in Canvas.

Master thesis projects

Master thesis projects can be found in connection with the RSS (Robotics and Semantic Systems) division, which also includes the ML research group.

Remote attendance:

Some lectures (note: NOT lectures 7-10) are streamed via Zoom; the link can be found HERE.

Otherwise: Lectures will be held on campus. Contact with teachers and TAs in exercise / lab sessions will mostly take place in the computer rooms (the default), with an online option in the respective Discord server for those of you who need to avoid public spaces (not only Covid-19 is nasty). The Discord server will also be available 24/7 for discussions with your peers, and will be used for the online-only "office hours / extra periods".

In addition, we will use this platform (Canvas) as the overall course platform to distribute information and course material - and of course it also offers opportunities for discussion with your peers.

Syllabus

Sessions and assignments: The course has 14 lectures, 7 programming assignments (labs) that should be worked on in pairs, and 4 individual homework assignments (reports).

Modules: The course has 15 Canvas modules, one for each lecture and its related assignments, plus one for general information and preparation / recap material: an intro to Python (book chapter), a Python tutorial (Jupyter notebook), a "Lab 0" exercise, and a pre-recorded video "Lecture 1.5" with a recap on linear algebra.

Please go to https://sam.cs.lth.se/LabsSelectSession?occasionId=731 to register with your lab partner for a lab session by Wednesday, November 3rd, 15:00. If you register alone, the system will pair you up automatically.

Lectures will be held by four teachers: Elin A. Topp (EAT, course responsible), Luigi Nardi (LN), Pierre Nugues (PN), and Volker Krueger (VK).

Teaching assistants: For the lab sessions, and to support you in general with the course material, we have several TAs, all of them at least PhD students and some at postdoc level: Carl Hvarfner, Dennis Medved, Erik Gärtner, Faseeh Ahmad, Hampus Åström, Konstantin Malysh, Leonard Papenmeier, Simon Kristoffersson Lind.

Contact information and office hours for teachers and TAs can be found here.

Examination: If you are registered or re-registered for course instance HT2019 or later, fulfilling all assignments (programming assignments in pairs, homework assignments individually) gives the full course credits (7.5 ECTS) and the passing grade "3", and also allows you to take the optional exam, with which it is possible to get a higher grade ("4" or "5").
If you are re-registered for course instance HT2018, fulfilling all programming assignments (in pairs) gives part of the course credits (4.5 ECTS). To pass the entire course (7.5 ECTS) you also need to pass the exam. The grade achieved in the exam then applies to the entire course (grade scale U/3/4/5).
If you are a PhD student registered for the course, fulfilling all assignments (programming assignments in small groups, including discussing them in a seminar session (upon agreement); homework assignments individually) gives the full course credits (7.5 ECTS) and a passing grade (G for "godkänd" / passed) on the U/G scale.

Overview of the course events per course / calendar week (will be updated soon):

Week 1 / cw 45
Lectures (topics): 1/11: Introduction (EAT); 3/11: Probability, Likelihood, Conjugacy (LN)
Programming assignments (Labs): 4+5/11: Evaluation tools, SKLearn
Homework announced: 1/11: Report on Evaluation tools

Week 2 / cw 46
Lectures (topics): 8/11: Regression, MLE, MAP, Overfitting, Model selection (LN); 10/11: Intro to Information Theory, Entropy, KL Divergence, Information Gain (LN)
Programming assignments (Labs): 11+12/11: Probability distributions, MLE

Week 3 / cw 47
Lectures (topics): 15/11: Bayesian Learning, NCC / NBC (EAT); 17/11: Bayesian Learning 2, EM (EAT)
Programming assignments (Labs): 18+19/11: Information Gain / Gini
Homework announced: 16/11: Report on Bayesian methods
Homework due: 18/11: Report on Evaluation tools

Week 4 / cw 48
Lectures (topics): 22/11: Perceptron, feed forward networks, Keras (PN); 24/11: Convolutional NNs (PN)
Programming assignments (Labs): 25+26/11: Bayesian Classifiers

Week 5 / cw 49
Lectures (topics): 29/11: Recurrent NNs (PN); 1/12: LSTMs, GRU (PN)
Programming assignments (Labs): 2+3/12: Image Classification with CNNs
Homework announced: 29/11: Report on image classification; 1/12: Report on Sequence tagging

Week 6 / cw 50
Lectures (topics): 6/12: MDPs, Intro to RL (VK); 8/12: Reinforcement Learning (VK)
Programming assignments (Labs): 9+10/12: Sequence tagging
Homework due: 6/12: Report on Bayesian methods

Week 7 / cw 51
Lectures (topics): 13/12: Decision trees, ensemble methods (random forests) (EAT); 15/12: ML in systems, summary and outlook (EAT)
Programming assignments (Labs): 16+17/12: Decision tree implementation OR Reinforcement Learning
Homework due: 13/12: Report on image classification

Week 8 / cw 52
Homework due: 20/12: Report on Sequence tagging
