I’m taking a Statistics course on the theory of linear models, which covers Gauss-Markov models and various extensions of them. When dealing with partitioned matrices, as commonly arise with multivariate normal distributions, we often need to invert matrices in a blockwise manner. This has come up often enough during the course (it was, coincidentally, necessary knowledge for a midterm question) that I figured I should document some of the inversion lemmas.
Let’s define our partitioned matrix as

$$M = \begin{bmatrix} A & B \\ C & D \end{bmatrix}$$

where $A$ is $n \times n$, $D$ is $m \times m$, and $B$ and $C$ are $n \times m$ and $m \times n$, respectively.
We are specifically interested in finding the inverse

$$M^{-1} = \begin{bmatrix} W & X \\ Y & Z \end{bmatrix}$$
such that

$$\begin{bmatrix} A & B \\ C & D \end{bmatrix}\begin{bmatrix} W & X \\ Y & Z \end{bmatrix} = \begin{bmatrix} W & X \\ Y & Z \end{bmatrix}\begin{bmatrix} A & B \\ C & D \end{bmatrix} = \begin{bmatrix} I & 0 \\ 0 & I \end{bmatrix}$$
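Throughout, it helps to have a concrete example to check the algebra against. Below is a minimal NumPy sketch that assembles such a partitioned matrix; the block sizes $n$ and $m$ and the random entries are arbitrary choices for illustration, and the later snippets build on this one.

```python
import numpy as np
from numpy.linalg import inv

rng = np.random.default_rng(0)

# Arbitrary block sizes for illustration: A is n x n, D is m x m.
n, m = 3, 2
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
C = rng.standard_normal((m, n))
D = rng.standard_normal((m, m))

# Assemble the partitioned matrix M = [[A, B], [C, D]].
M = np.block([[A, B], [C, D]])
```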
Part 1: The right inverse
For the right inverse ($M M^{-1} = I$), we can define the system of equations

$$AW + BY = I, \qquad AX + BZ = 0$$
$$CW + DY = 0, \qquad CX + DZ = I$$
and, assuming $A$ and $D$ are invertible, solving the second and third equations gives

$$X = -A^{-1}BZ, \qquad Y = -D^{-1}CW$$
We can plug these identities back into the first and fourth equations as

$$AW - BD^{-1}CW = (A - BD^{-1}C)W = I$$
$$DZ - CA^{-1}BZ = (D - CA^{-1}B)Z = I$$
so that

$$W = (A - BD^{-1}C)^{-1}, \qquad Z = (D - CA^{-1}B)^{-1}$$
and finally

$$M^{-1} = \begin{bmatrix} (A - BD^{-1}C)^{-1} & -A^{-1}B(D - CA^{-1}B)^{-1} \\ -D^{-1}C(A - BD^{-1}C)^{-1} & (D - CA^{-1}B)^{-1} \end{bmatrix}$$
It is important to note that the above result only holds if $A$, $D$, $(A - BD^{-1}C)$, and $(D - CA^{-1}B)$ are invertible. The last two quantities are known as the Schur complements of $D$ and $A$, respectively.
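As a sanity check, we can verify the formula numerically, continuing the NumPy sketch from above (with Gaussian random blocks, all four quantities are invertible with probability one):

```python
# Schur complements of D and A, respectively.
S_D = A - B @ inv(D) @ C   # A - B D^{-1} C
S_A = D - C @ inv(A) @ B   # D - C A^{-1} B

# Blockwise inverse from the right-inverse derivation.
M_inv = np.block([
    [inv(S_D),               -inv(A) @ B @ inv(S_A)],
    [-inv(D) @ C @ inv(S_D),  inv(S_A)],
])

# Agrees with a direct inverse up to floating-point error.
assert np.allclose(M_inv, inv(M))
```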
Part 2: The left inverse
Following the same logic as above, we have the following system of equations for the left inverse ($M^{-1} M = I$)

$$WA + XC = I, \qquad WB + XD = 0$$
$$YA + ZC = 0, \qquad YB + ZD = I$$
so that

$$X = -WBD^{-1}, \qquad Y = -ZCA^{-1}$$

and, substituting back, $W = (A - BD^{-1}C)^{-1}$ and $Z = (D - CA^{-1}B)^{-1}$ as before,
which indicates that

$$M^{-1} = \begin{bmatrix} (A - BD^{-1}C)^{-1} & -(A - BD^{-1}C)^{-1}BD^{-1} \\ -(D - CA^{-1}B)^{-1}CA^{-1} & (D - CA^{-1}B)^{-1} \end{bmatrix}$$

Since the left and right inverses of an invertible matrix are equal, comparing the off-diagonal blocks of the two results also gives us the identities $A^{-1}B(D - CA^{-1}B)^{-1} = (A - BD^{-1}C)^{-1}BD^{-1}$ and $D^{-1}C(A - BD^{-1}C)^{-1} = (D - CA^{-1}B)^{-1}CA^{-1}$.
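Continuing the numerical sketch, we can confirm that the two derivations produce the same off-diagonal blocks, just written differently:

```python
# Off-diagonal blocks from the right inverse (Part 1)...
X_right = -inv(A) @ B @ inv(S_A)
Y_right = -inv(D) @ C @ inv(S_D)

# ...and from the left inverse (Part 2).
X_left = -inv(S_D) @ B @ inv(D)
Y_left = -inv(S_A) @ C @ inv(A)

# The left and right inverses coincide, so these must match.
assert np.allclose(X_right, X_left)
assert np.allclose(Y_right, Y_left)
```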
Importantly, blockwise matrix inversion lets us express the inverse of a large matrix in terms of its subcomponents. From here, we can go on to derive the Sherman-Morrison formula and the Woodbury theorem, which allow us to do all kinds of cool stuff, like rank-one matrix updates. In the next few posts, I’ll go over some examples of where blockwise matrix inversions are useful, and common scenarios where rank-one updates of matrices are applicable.
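As a small teaser for those posts, here is a sketch of the Sherman-Morrison rank-one update, reusing $A$ from the earlier snippets (the vectors $u$ and $v$ are arbitrary):

```python
# Sherman-Morrison:
# (A + u v^T)^{-1} = A^{-1} - (A^{-1} u v^T A^{-1}) / (1 + v^T A^{-1} u)
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

A_inv = inv(A)
denom = 1.0 + (v.T @ A_inv @ u).item()
updated_inv = A_inv - (A_inv @ u @ v.T @ A_inv) / denom

# Matches inverting the updated matrix directly.
assert np.allclose(updated_inv, inv(A + u @ v.T))
```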