Optimization for Learning
Course program
The course program contains the schedule and relevant information regarding lectures, exercises, deadlines, etc. All relevant information will also be posted on this Canvas page.
Covid-19
We will follow the department's general Covid-19 teaching policy. Lectures will be online, and we will have two online and two on-site exercise sessions per week. See the course program for more information.
Before the course
Mathematical prerequisites. The course is fairly mathematical. In the teaching, we will assume that you are reasonably comfortable with the content of this mathematical prerequisites document.
Coding environments. We will program in both Julia and Python.
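As a quick sanity check of your setup, here is a minimal Julia sketch (assuming only the LinearAlgebra standard library; the packages actually used in the exercises are specified in the exercise material) that solves a small least-squares problem, a problem class treated later in the course:

```julia
# Minimal sanity check for a working Julia installation.
# Only the LinearAlgebra standard library is assumed; the exercise
# material may use its own package environment.
using LinearAlgebra

A = randn(100, 10)   # random data matrix
b = randn(100)       # random right-hand side
x = A \ b            # least-squares solution (via QR factorization)
println("residual norm: ", norm(A * x - b))
```

A similar one-line check in Python (e.g., with numpy.linalg.lstsq) is useful before the Python-based exercise sessions.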
Weekly schedule
- Lectures (Pontus Giselsson):
- Mondays 13:15-15:00 and Wednesdays 13:15-15:00 - https://lu-se.zoom.us/j/2834723313
- Exercise sessions:
- Tuesdays and Thursdays 8:15-10:00 (Hamed Sadeghi)
- Tuesdays and Thursdays 15:15-17:00 (Manu Upadhyaya): KC:M M1
Lecture videos
Some lecture videos (short flipped-classroom videos, not lecture recordings) can be found here. The videos are intended for active listening: pause, skip 15 s back (or forward) within a video to repeat, e.g., an argument, skip between videos to recall concepts, change the playback speed, and perhaps take notes and verify calculations while watching.
Lecture videos for the remaining lectures are recorded during live Zoom presentations and can be found here.
Lecture slides
- L0 - Intro
- L1 - Convex sets (videos)
- L2 - Convex functions (videos)
- L3 - Subdifferentials and the proximal operator (videos/videos)
- L4 - Conjugate functions and duality (videos/videos)
- L5 - Proximal gradient method - Basics (videos)
- L6 - Least squares (lecture video from last year)
- L7 - Logistic regression (lecture video from last year)
- L8 - Support vector machines (lecture video from last year)
- L9 - Deep learning
- L10 - Algorithms and convergence (videos)
- L11 - Proximal gradient method - Theory (videos)
- L12 - SGD - Qualitative convergence
- L13 - SGD - Implicit regularization
- Recap (video part 2, unfortunately part 1 was not recorded due to technical issues)
- Previous exam (2019-10-28) (video part 2; unfortunately, we forgot to record part 1)
Extra material (not in course this year)
Lecture recordings
Recordings of some discussion sessions:
- DS2 - convex functions, subdifferentials
Recording of the algorithm overview lecture (not part of the course this year)
Exercise material
Suggested exercises
| Session | Exercise set |
| --- | --- |
| Before | Mathematical prerequisites |
| E1-E2 | Introduction to Julia and Ch 1: 1-9, 12-20 |
| E3-E4 | Ch 1: 21-26, 31-32, 36, Ch 2: 1-6, 15-16, 8-10 |
| E5-E6 | Ch 2: 11-13, Ch 3: 1-2, 4-7, 10-15, 17 |
| E7-E8 | Ch 3: 16, 18-21, Ch 4: 1-9 |
| E9-E10 | Introduction to Python and Ch 5: 1-9 |
| E11-E12 | Ch 6: 1-10 |
| E13-E14 | Ch 7: 1-7 |
Assignments
If your submission does not pass: two resubmissions are allowed on the first assignment, and only one resubmission is allowed on the second assignment. The resubmission deadlines (one week after we are done grading the assignment in question) will be posted as notifications here in Canvas.
Exam
- 2021-10-26 (with solutions)
- 2020-10-26 (with solutions)
- 2019-10-28 (with solutions)
- Test exam (with solutions)
Contact information
| Name | Role | Email |
| --- | --- | --- |
| Mika Nishimura | Ladok administrator | mika.nishimura@control.lth.se |
| Pontus Giselsson | Course responsible | pontusg@control.lth.se |
| Manu Upadhyaya | Teaching assistant | manu.upadhyaya@control.lth.se |
| Hamed Sadeghi | Teaching assistant | hamed.sadeghi@control.lth.se |
Course representatives: Henrik Paldan and Andre Rath