Invert a Matrix Using the Woodbury Matrix Inverse Formula (Matrix Inversion Lemma)

Previously I wrote about the LDU decomposition and the Schur complement. These can be used to derive the Sherman–Morrison–Woodbury formula, otherwise known as the matrix inversion lemma, for inverting a matrix. As shown in the previous post, the UDL and LDU decompositions are two ways of factorizing a partitioned matrix:

\begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix} = \begin{pmatrix} 1 & V_{12}V_{22}^{-1}\\ 0 & 1 \end{pmatrix} \begin{pmatrix} V_{11.2} & 0\\ 0 & V_{22} \end{pmatrix} \begin{pmatrix} 1 & 0\\ V_{22}^{-1}V_{21} & 1 \end{pmatrix}

\begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix} = \begin{pmatrix} 1 & 0\\ V_{21}V_{11}^{-1} & 1 \end{pmatrix} \begin{pmatrix} V_{11} & 0\\ 0 & V_{22.1} \end{pmatrix} \begin{pmatrix} 1 & V_{11}^{-1}V_{12}\\ 0 & 1 \end{pmatrix}
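As a quick sanity check, both factorizations can be verified numerically. Here is a minimal NumPy sketch; the block sizes, random seed and test matrix are arbitrary choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q = 2, 3                                    # hypothetical block sizes for V11 and V22
A = rng.standard_normal((p + q, p + q))
V = A @ A.T + (p + q) * np.eye(p + q)          # symmetric positive-definite test matrix

V11, V12 = V[:p, :p], V[:p, p:]
V21, V22 = V[p:, :p], V[p:, p:]
I_p, I_q = np.eye(p), np.eye(q)

V11_2 = V11 - V12 @ np.linalg.solve(V22, V21)  # Schur complement V_{11.2}
V22_1 = V22 - V21 @ np.linalg.solve(V11, V12)  # Schur complement V_{22.1}

# UDL factorization: V = U D L
U = np.block([[I_p, V12 @ np.linalg.inv(V22)], [np.zeros((q, p)), I_q]])
D = np.block([[V11_2, np.zeros((p, q))], [np.zeros((q, p)), V22]])
L = np.block([[I_p, np.zeros((p, q))], [np.linalg.inv(V22) @ V21, I_q]])
print(np.allclose(U @ D @ L, V))               # True

# LDU factorization: V = L D U
L2 = np.block([[I_p, np.zeros((p, q))], [V21 @ np.linalg.inv(V11), I_q]])
D2 = np.block([[V11, np.zeros((p, q))], [np.zeros((q, p)), V22_1]])
U2 = np.block([[I_p, np.linalg.inv(V11) @ V12], [np.zeros((q, p)), I_q]])
print(np.allclose(L2 @ D2 @ U2, V))            # True
```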
Now take the inverse of both sides. The inverse of a product reverses the order of the factors, and the inverse of a unit triangular block matrix simply flips the sign of its off-diagonal block, yielding

\begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix}^{-1} = \begin{pmatrix} 1 & 0\\ -V_{22}^{-1}V_{21} & 1 \end{pmatrix} \begin{pmatrix} V_{11.2}^{-1} & 0\\ 0 & V_{22}^{-1} \end{pmatrix} \begin{pmatrix} 1 & -V_{12}V_{22}^{-1}\\ 0 & 1 \end{pmatrix}

\begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix}^{-1} = \begin{pmatrix} 1 & -V_{11}^{-1}V_{12}\\ 0 & 1 \end{pmatrix} \begin{pmatrix} V_{11}^{-1} & 0\\ 0 & V_{22.1}^{-1} \end{pmatrix} \begin{pmatrix} 1 & 0\\ -V_{21}V_{11}^{-1} & 1 \end{pmatrix}
Multiplying the matrices on the RHS yields
\begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix}^{-1} = \begin{pmatrix} V_{11.2}^{-1} & -V_{11.2}^{-1}V_{12}V_{22}^{-1}\\ -V_{22}^{-1}V_{21}V_{11.2}^{-1} & V_{22}^{-1}+V_{22}^{-1}V_{21}V_{11.2}^{-1}V_{12}V_{22}^{-1} \end{pmatrix}

\begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix}^{-1} = \begin{pmatrix} V_{11}^{-1}+V_{11}^{-1}V_{12}V_{22.1}^{-1}V_{21}V_{11}^{-1} & -V_{11}^{-1}V_{12}V_{22.1}^{-1}\\ -V_{22.1}^{-1}V_{21}V_{11}^{-1} & V_{22.1}^{-1} \end{pmatrix}
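Both block expressions can be checked against a direct numerical inverse. A minimal NumPy sketch (block sizes and test matrix are again made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
p, q = 2, 3                                              # hypothetical block sizes
A = rng.standard_normal((p + q, p + q))
V = A @ A.T + (p + q) * np.eye(p + q)                    # symmetric positive-definite test matrix

V11, V12, V21, V22 = V[:p, :p], V[:p, p:], V[p:, :p], V[p:, p:]
V11_inv, V22_inv = np.linalg.inv(V11), np.linalg.inv(V22)
V11_2_inv = np.linalg.inv(V11 - V12 @ V22_inv @ V21)     # (V_{11.2})^{-1}
V22_1_inv = np.linalg.inv(V22 - V21 @ V11_inv @ V12)     # (V_{22.1})^{-1}

# First block form of the inverse, in terms of V_{11.2}
inv1 = np.block([
    [V11_2_inv,                  -V11_2_inv @ V12 @ V22_inv],
    [-V22_inv @ V21 @ V11_2_inv,  V22_inv + V22_inv @ V21 @ V11_2_inv @ V12 @ V22_inv],
])

# Second block form of the inverse, in terms of V_{22.1}
inv2 = np.block([
    [V11_inv + V11_inv @ V12 @ V22_1_inv @ V21 @ V11_inv, -V11_inv @ V12 @ V22_1_inv],
    [-V22_1_inv @ V21 @ V11_inv,                            V22_1_inv],
])

print(np.allclose(inv1, np.linalg.inv(V)))  # True
print(np.allclose(inv2, np.linalg.inv(V)))  # True
```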
These are two equivalent expressions for the inverse of a partitioned block matrix. Equating them, it follows that
\begin{pmatrix} V_{11.2}^{-1} & -V_{11.2}^{-1}V_{12}V_{22}^{-1}\\ -V_{22}^{-1}V_{21}V_{11.2}^{-1} & V_{22}^{-1}+V_{22}^{-1}V_{21}V_{11.2}^{-1}V_{12}V_{22}^{-1} \end{pmatrix} = \begin{pmatrix} V_{11}^{-1}+V_{11}^{-1}V_{12}V_{22.1}^{-1}V_{21}V_{11}^{-1} & -V_{11}^{-1}V_{12}V_{22.1}^{-1}\\ -V_{22.1}^{-1}V_{21}V_{11}^{-1} & V_{22.1}^{-1} \end{pmatrix}
The equality holds block by block, so equating the top-left and bottom-right blocks gives two forms of the Woodbury matrix inverse formula, namely
V_{11.2}^{-1}=V_{11}^{-1}+V_{11}^{-1}V_{12}V_{22.1}^{-1}V_{21}V_{11}^{-1}
and
V_{22.1}^{-1}=V_{22}^{-1}+V_{22}^{-1}V_{21}V_{11.2}^{-1}V_{12}V_{22}^{-1}
where the dot notation denotes the Schur complement, i.e.
V_{11.2}=V_{11}-V_{12}V_{22}^{-1}V_{21}
V_{22.1}=V_{22}-V_{21}V_{11}^{-1}V_{12}
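A short numerical check of both identities (NumPy sketch; the block sizes and test matrix are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
p, q = 2, 3                                      # hypothetical block sizes
A = rng.standard_normal((p + q, p + q))
V = A @ A.T + (p + q) * np.eye(p + q)            # symmetric positive-definite test matrix

V11, V12, V21, V22 = V[:p, :p], V[:p, p:], V[p:, :p], V[p:, p:]
V11_inv, V22_inv = np.linalg.inv(V11), np.linalg.inv(V22)

V11_2 = V11 - V12 @ V22_inv @ V21                # Schur complement with respect to V22
V22_1 = V22 - V21 @ V11_inv @ V12                # Schur complement with respect to V11

# V_{11.2}^{-1} = V11^{-1} + V11^{-1} V12 V_{22.1}^{-1} V21 V11^{-1}
lhs1 = np.linalg.inv(V11_2)
rhs1 = V11_inv + V11_inv @ V12 @ np.linalg.inv(V22_1) @ V21 @ V11_inv
print(np.allclose(lhs1, rhs1))  # True

# V_{22.1}^{-1} = V22^{-1} + V22^{-1} V21 V_{11.2}^{-1} V12 V22^{-1}
lhs2 = np.linalg.inv(V22_1)
rhs2 = V22_inv + V22_inv @ V21 @ lhs1 @ V12 @ V22_inv
print(np.allclose(lhs2, rhs2))  # True
```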
One application of the Woodbury matrix inverse is in deriving conditional distributions for the multivariate normal.
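For instance, if (X_1, X_2) is jointly normal with covariance V partitioned as above, the conditional covariance of X_1 given X_2 is the Schur complement V_{11.2}, and the top-left block of the precision matrix V^{-1} is V_{11.2}^{-1}, which is exactly where the identity enters the derivation. A minimal NumPy sketch (the dimensions, mean and covariance are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
p, q = 2, 3                                      # hypothetical dimensions of X1 and X2
A = rng.standard_normal((p + q, p + q))
V = A @ A.T + (p + q) * np.eye(p + q)            # joint covariance (made up)
mu = rng.standard_normal(p + q)                  # joint mean (made up)

mu1, mu2 = mu[:p], mu[p:]
V11, V12, V21, V22 = V[:p, :p], V[:p, p:], V[p:, :p], V[p:, p:]

x2 = rng.standard_normal(q)                      # an observed value of X2

# Conditional distribution of X1 given X2 = x2
cond_mean = mu1 + V12 @ np.linalg.solve(V22, x2 - mu2)
cond_cov = V11 - V12 @ np.linalg.solve(V22, V21)  # the Schur complement V_{11.2}

# The top-left block of the precision matrix V^{-1} is V_{11.2}^{-1}
precision = np.linalg.inv(V)
print(np.allclose(precision[:p, :p], np.linalg.inv(cond_cov)))  # True
```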

Diagonalize a Positive-Definite Symmetric Matrix using the Schur Complement and LDU Decomposition

Block-diagonalizing a matrix comes up frequently for me when working with the variance matrix of a multivariate normal to derive conditional distributions. I prefer to proceed via an LDU decomposition, leaving the result in terms of the Schur complement, as I find it easier to remember. Consider some matrix V, partitioned as follows
V = \begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix}
Consider left- and right-multiplying this matrix by unit triangular matrices as follows:
\begin{pmatrix} 1 & -V_{12}V_{22}^{-1}\\ 0 & 1 \end{pmatrix} \begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix} \begin{pmatrix} 1 & 0\\ -V_{22}^{-1}V_{21} & 1 \end{pmatrix}
These multipliers are the inverses of the U and L factors that will appear in the UDL decomposition below, and since V is symmetric, the matrix on the right is just the transpose of the one on the left. Define the Schur complement of V with respect to the block V_{22} as
V_{11.2}=V_{11}-V_{12}V_{22}^{-1}V_{21}
then working through the above multiplication it follows that
\begin{pmatrix} 1 & -V_{12}V_{22}^{-1}\\ 0 & 1 \end{pmatrix} \begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix} \begin{pmatrix} 1 & 0\\ -V_{22}^{-1}V_{21} & 1 \end{pmatrix} = \begin{pmatrix} V_{11.2} & 0\\ 0 & V_{22} \end{pmatrix}
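A quick numerical confirmation of this block-diagonalization (NumPy sketch with an arbitrary symmetric positive-definite test matrix and hypothetical block sizes; because V is symmetric, the right factor is just the transpose of the left):

```python
import numpy as np

rng = np.random.default_rng(4)
p, q = 2, 3                                      # hypothetical block sizes
A = rng.standard_normal((p + q, p + q))
V = A @ A.T + (p + q) * np.eye(p + q)            # symmetric positive-definite V

V11, V12, V21, V22 = V[:p, :p], V[:p, p:], V[p:, :p], V[p:, p:]

# Left factor [[I, -V12 V22^{-1}], [0, I]]; its transpose is the right factor for symmetric V
U = np.block([[np.eye(p), -V12 @ np.linalg.inv(V22)],
              [np.zeros((q, p)), np.eye(q)]])
V11_2 = V11 - V12 @ np.linalg.solve(V22, V21)    # Schur complement V_{11.2}

D = U @ V @ U.T                                  # congruence transform block-diagonalizes V
print(np.allclose(D[:p, :p], V11_2))             # top-left block is V_{11.2}
print(np.allclose(D[p:, p:], V22))               # bottom-right block is V_{22}
print(np.allclose(D[:p, p:], 0))                 # off-diagonal blocks vanish
```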
This result is useful in deriving the Sherman–Morrison–Woodbury formula, otherwise known as the matrix inversion lemma.
One can now left- and right-multiply the above equation by the corresponding inverse matrices to obtain the UDL decomposition of V. The inverses are easy to write down, since inverting a unit triangular block matrix just flips the sign of its off-diagonal block:

\begin{pmatrix} 1 & V_{12}V_{22}^{-1}\\ 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & -V_{12}V_{22}^{-1}\\ 0 & 1 \end{pmatrix} \begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix} \begin{pmatrix} 1 & 0\\ -V_{22}^{-1}V_{21} & 1 \end{pmatrix} \begin{pmatrix} 1 & 0\\ V_{22}^{-1}V_{21} & 1 \end{pmatrix} = \begin{pmatrix} 1 & V_{12}V_{22}^{-1}\\ 0 & 1 \end{pmatrix} \begin{pmatrix} V_{11.2} & 0\\ 0 & V_{22} \end{pmatrix} \begin{pmatrix} 1 & 0\\ V_{22}^{-1}V_{21} & 1 \end{pmatrix}
So finally
\begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix} = \begin{pmatrix} 1 & V_{12}V_{22}^{-1}\\ 0 & 1 \end{pmatrix} \begin{pmatrix} V_{11.2} & 0\\ 0 & V_{22} \end{pmatrix} \begin{pmatrix} 1 & 0\\ V_{22}^{-1}V_{21} & 1 \end{pmatrix}
V=UDL
Notice that this is a UDL factorization. It is entirely possible to do things a little differently and end up with the LDU decomposition instead. Consider left- and right-multiplying V by the analogous lower and upper unit triangular matrices:
\begin{pmatrix} 1 & 0\\ -V_{21}V_{11}^{-1} & 1 \end{pmatrix} \begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix} \begin{pmatrix} 1 & -V_{11}^{-1}V_{12}\\ 0 & 1 \end{pmatrix} = \begin{pmatrix} V_{11} & 0\\ 0 & V_{22.1} \end{pmatrix}
where
V_{22.1}=V_{22}-V_{21}V_{11}^{-1}V_{12}
i.e. the Schur complement of V with respect to the block V_{11}. So finally
\begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix} = \begin{pmatrix} 1 & 0\\ V_{21}V_{11}^{-1} & 1 \end{pmatrix} \begin{pmatrix} V_{11} & 0\\ 0 & V_{22.1} \end{pmatrix} \begin{pmatrix} 1 & V_{11}^{-1}V_{12}\\ 0 & 1 \end{pmatrix}
V=LDU