Lecture 6 - Linear Mappings and Least Squares Problems
Lecture 6 slides
The last lecture treats bounded linear maps, Hilbert spaces, and some applications to control theory. In an attempt to keep the videos light I have tried to introduce the concepts in a more intuitive manner, with few definitions and not so much rigour. You should definitely consult the slides for a more principled approach. Be warned, there is quite a steep learning curve here, but hopefully a rewarding one if you put in the time! No reading suggestions this time, though you can find some more info about scalar products and Hilbert spaces on Wikipedia.
The covered material is treated in hand-in exercise 2.3.
Thank you for taking the course, and don't forget to fill out the CEQ when you get it. Your responses are more important than ever in these chaotic remote teaching times!
Controllable and Observable Subspaces
We begin by reviewing the key subspaces related to the matrix equation y=Ax (column space, row space, null space and left null space). We then relate these to the unreachable and/or unobservable parts of the state space of a state-space model that is uncontrollable and/or unobservable.
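As a small numerical illustration of these ideas, the sketch below builds the controllability matrix of a (made-up) two-state, single-input system and checks whether its column space is the whole state space; the matrices A and B are hypothetical examples, not taken from the lecture.

```python
import numpy as np

# A hypothetical single-input state-space model (A, B chosen purely for illustration).
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[0.0],
              [1.0]])

# Controllability matrix [B, AB] for n = 2 states.
ctrb = np.hstack([B, A @ B])

# The reachable subspace is the column space of the controllability matrix;
# its dimension equals the rank. Full rank means the system is controllable.
rank = np.linalg.matrix_rank(ctrb)
print(rank)  # 2, so this pair (A, B) is controllable
```

The same rank test on the observability matrix (stacking C, CA, ...) identifies the unobservable part of the state space as its null space.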
Least Norm Problems
We now shift focus to optimisation and optimal control. The central object of study is still the linear equation y=Ax, but we now ask how we should solve it when this equation has many solutions (i.e., which x should we pick). One choice is to pick the smallest x. We show how to do this for a natural (and convenient) notion of size.
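For a fat, full-rank A the minimum-norm solution can be written in closed form as x = Aᵀ(AAᵀ)⁻¹y, which is exactly what the Moore–Penrose pseudoinverse computes. A minimal sketch, with a made-up one-equation, two-unknown system:

```python
import numpy as np

# Underdetermined system y = A x: one equation, two unknowns, so many solutions.
A = np.array([[1.0, 1.0]])
y = np.array([2.0])

# Minimum-norm solution x = A^T (A A^T)^{-1} y, computed via the pseudoinverse.
x = np.linalg.pinv(A) @ y
print(x)  # [1. 1.]: among all x with x1 + x2 = 2, this one has the smallest norm
```

Any x with x1 + x2 = 2 solves the equation; the pseudoinverse picks the one closest to the origin in the Euclidean norm.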
We now connect this problem to an optimal control problem. In particular, when studying controllability we saw that there might be many control inputs that drive a state-space system from an initial condition to the origin. But which input uses the least energy? Since our system dynamics are linear, this is completely analogous to the matrix least-norm problem we just solved! We show how to extend the method (our input is now a function rather than a vector) to solve this problem.
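In discrete time the analogy is especially direct: stacking the dynamics over N steps turns "drive the state to the origin" into an underdetermined linear equation in the input sequence, and the minimum-energy input is its minimum-norm solution. A sketch with hypothetical example matrices:

```python
import numpy as np

# Discrete-time system x_{k+1} = A x_k + B u_k (A, B, x0 are made-up examples).
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[0.0],
              [1.0]])
x0 = np.array([1.0, 0.0])
N = 3  # number of control steps

# Stacked dynamics: x_N = A^N x0 + [A^{N-1}B, ..., AB, B] [u_0; ...; u_{N-1}].
M = np.hstack([np.linalg.matrix_power(A, N - 1 - k) @ B for k in range(N)])

# Requiring x_N = 0 gives M u = -A^N x0. Among the many input sequences that
# achieve this, the pseudoinverse selects the minimum-energy (minimum-norm) one.
u = np.linalg.pinv(M) @ (-np.linalg.matrix_power(A, N) @ x0)

x_final = np.linalg.matrix_power(A, N) @ x0 + M @ u
print(x_final)  # ≈ [0, 0]: the state reaches the origin
```

In continuous time the same reasoning goes through with the sum replaced by an integral and the input living in a function space, which is exactly where the Hilbert-space machinery pays off.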
More Least Squares
We now return once more to the equation y=Ax, but instead study the situation when this equation has no solutions. This time we investigate how to pick x to make Ax as close as possible to y.
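The best-fit x minimises the residual norm ||Ax − y|| and satisfies the normal equations AᵀAx = Aᵀy. A minimal sketch on a made-up overdetermined system:

```python
import numpy as np

# Overdetermined system: three equations, two unknowns, so generally no exact solution.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
y = np.array([1.0, 1.0, 0.0])

# Least-squares solution: minimises ||A x - y||, i.e. solves A^T A x = A^T y.
x, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
print(x)  # [1/3, 1/3]: Ax is the projection of y onto the column space of A
```

Geometrically, Ax is the orthogonal projection of y onto the column space of A, so the residual Ax − y is orthogonal to that subspace.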
This type of least squares problem also has strong connections to optimal control. This time problems related to observer design are more natural (estimating the state that best fits the observed output). We study an example, again showing that the same ideas hold for more general linear mappings than just the matrix equation y=Ax.
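As a toy version of the estimation idea, the sketch below recovers an initial state from a few noisy output samples of an autonomous system by least squares on the stacked observability matrix; the system matrices, true state, and noise values are all invented for illustration.

```python
import numpy as np

# Hypothetical autonomous system x_{k+1} = A x_k with output y_k = C x_k.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
C = np.array([[1.0, 0.0]])

# Stacked observability matrix: y_k = C A^k x0 for k = 0, ..., 3.
O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(4)])

# Outputs generated from a (made-up) true initial state, plus small noise.
x0_true = np.array([1.0, -0.5])
y = O @ x0_true + np.array([0.01, -0.02, 0.005, 0.01])

# The initial state that best explains the data minimises ||O x0 - y||:
# an overdetermined least squares problem.
x0_hat, *_ = np.linalg.lstsq(O, y, rcond=None)
print(x0_hat)  # close to the true initial state [1.0, -0.5]
```

The noise keeps the stacked equation from having an exact solution, which is precisely the "no solutions" regime of this section.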