Diagonalize a Positive-Definite Symmetric Matrix using the Schur Complement and LDU Decomposition

Diagonalizing a matrix comes up frequently for me, usually when I want to block-diagonalize the variance matrix of a multivariate normal in order to derive conditional distributions. I prefer to proceed via an LDU decomposition and leave the result in terms of the Schur complement, as I find it easier to remember. Consider a symmetric positive-definite matrix V and partition it as follows
V=\begin{pmatrix} V_{11} & V_{12}\\ V_{21} & V_{22} \end{pmatrix}
Consider left- and right-multiplying this matrix by unit triangular matrices as follows
\begin{pmatrix}          1 & -V_{12}V_{22}^{-1}\\          0 & 1\\  \end{pmatrix}    \begin{pmatrix}          V_{11} & V_{12}\\          V_{21} & V_{22}\\  \end{pmatrix}  \begin{pmatrix}            1 & 0 \\  -V_{22}^{-1}V_{21} & 1 \\  \end{pmatrix}
These matrices are the inverses of the U and L factors that will appear in the decomposition below; since V is symmetric, the matrix on the right is just the transpose of the one on the left. Define the Schur complement of the matrix V with respect to the block V_{22} as
V_{11.2}=V_{11}-V_{12}V_{22}^{-1}V_{21}
then working through the above multiplication it follows that
\begin{pmatrix}          1 & -V_{12}V_{22}^{-1}\\          0 & 1\\  \end{pmatrix}    \begin{pmatrix}          V_{11} & V_{12}\\          V_{21} & V_{22}\\  \end{pmatrix}  \begin{pmatrix}            1 & 0 \\  -V_{22}^{-1}V_{21} & 1 \\  \end{pmatrix}  =  \begin{pmatrix}           V_{11.2} & 0 \\           0 & V_{22} \\   \end{pmatrix}
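As a quick sanity check, here is a minimal NumPy sketch of this identity; the block sizes, the random positive-definite test matrix, and all variable names are illustrative choices of mine, not anything fixed by the derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed block sizes for illustration: V11 is p x p, V22 is q x q.
p, q = 2, 3
A = rng.standard_normal((p + q, p + q))
V = A @ A.T + (p + q) * np.eye(p + q)   # random symmetric positive-definite test matrix
V11, V12 = V[:p, :p], V[:p, p:]
V21, V22 = V[p:, :p], V[p:, p:]

# Schur complement of V with respect to the block V22.
V11_2 = V11 - V12 @ np.linalg.solve(V22, V21)

# The two unit triangular matrices used in the congruence transform above.
left = np.block([[np.eye(p), -V12 @ np.linalg.inv(V22)],
                 [np.zeros((q, p)), np.eye(q)]])
right = np.block([[np.eye(p), np.zeros((p, q))],
                  [-np.linalg.inv(V22) @ V21, np.eye(q)]])

block_diag = np.block([[V11_2, np.zeros((p, q))],
                       [np.zeros((q, p)), V22]])
print(np.allclose(left @ V @ right, block_diag))  # True
```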
This result is useful in deriving the Sherman–Morrison–Woodbury matrix inverse formula/identity, otherwise known as the matrix inversion lemma.
One can now left- and right-multiply the above equation by the corresponding inverse matrices to obtain a factorization of the matrix V. The inverses are easy, since a unit triangular block matrix is inverted simply by negating its off-diagonal block:

\begin{pmatrix} 1 & A\\ 0 & 1 \end{pmatrix}^{-1}=\begin{pmatrix} 1 & -A\\ 0 & 1 \end{pmatrix}

Doing so gives
\begin{pmatrix}          1 & V_{12}V_{22}^{-1}\\          0 & 1\\  \end{pmatrix}  \begin{pmatrix}          1 & -V_{12}V_{22}^{-1}\\          0 & 1\\  \end{pmatrix}    \begin{pmatrix}          V_{11} & V_{12}\\          V_{21} & V_{22}\\  \end{pmatrix}  \begin{pmatrix}   1 & 0 \\  -V_{22}^{-1}V_{21} & 1 \\  \end{pmatrix}  \begin{pmatrix}            1 & 0 \\  V_{22}^{-1}V_{21} & 1 \\  \end{pmatrix}  =  \begin{pmatrix}          1 &     V_{12}V_{22}^{-1} \\      0 & 1\\  \end{pmatrix}  \begin{pmatrix}           V_{11.2} & 0 \\           0 & V_{22} \\   \end{pmatrix}   \begin{pmatrix}            1 & 0 \\  V_{22}^{-1}V_{21} & 1 \\  \end{pmatrix}
So finally
\begin{pmatrix}          V_{11} & V_{12}\\          V_{21} & V_{22}\\  \end{pmatrix}=    \begin{pmatrix}          1 &    V_{12}V_{22}^{-1}\\  0      & 1\\  \end{pmatrix}  \begin{pmatrix}           V_{11.2} & 0 \\           0 & V_{22} \\   \end{pmatrix}   \begin{pmatrix}            1 & 0 \\  V_{22}^{-1}V_{21} & 1 \\  \end{pmatrix}
V=UDL
Notice that this is a UDL factorization.
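A corresponding sketch, under the same illustrative assumptions (made-up block sizes and a random positive-definite test matrix), assembles the three factors and checks that their product recovers V.

```python
import numpy as np

rng = np.random.default_rng(1)
p, q = 2, 3                                    # illustrative block sizes
A = rng.standard_normal((p + q, p + q))
V = A @ A.T + (p + q) * np.eye(p + q)          # random symmetric positive-definite test matrix
V11, V12 = V[:p, :p], V[:p, p:]
V21, V22 = V[p:, :p], V[p:, p:]

V11_2 = V11 - V12 @ np.linalg.solve(V22, V21)  # Schur complement w.r.t. V22

# UDL factors: unit upper triangular, block diagonal, unit lower triangular.
U = np.block([[np.eye(p), V12 @ np.linalg.inv(V22)],
              [np.zeros((q, p)), np.eye(q)]])
D = np.block([[V11_2, np.zeros((p, q))],
              [np.zeros((q, p)), V22]])
L = np.block([[np.eye(p), np.zeros((p, q))],
              [np.linalg.inv(V22) @ V21, np.eye(q)]])

print(np.allclose(U @ D @ L, V))               # True
```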
It is entirely possible to do things a little differently and end up with the LDU decomposition instead. This time, left- and right-multiply V by the analogous unit triangular matrices built from V_{11}:
\begin{pmatrix}          1 & 0\\          -V_{21}V_{11}^{-1} & 1\\  \end{pmatrix}    \begin{pmatrix}          V_{11} & V_{12}\\          V_{21} & V_{22}\\  \end{pmatrix}  \begin{pmatrix}            1 & -V_{11}^{-1}V_{12} \\  0 & 1 \\  \end{pmatrix}  =  \begin{pmatrix}            V_{11} & 0 \\  0 & V_{22.1} \\  \end{pmatrix}
where
V_{22.1}=V_{22}-V_{21}V_{11}^{-1}V_{12}
i.e. the Schur complement of the matrix V with respect to the block V_{11}. So finally
\begin{pmatrix}          V_{11} & V_{12}\\          V_{21} & V_{22}\\  \end{pmatrix}    =  \begin{pmatrix}          1 & 0\\          V_{21}V_{11}^{-1} & 1\\  \end{pmatrix}  \begin{pmatrix}            V_{11} & 0 \\  0 & V_{22.1} \\  \end{pmatrix}  \begin{pmatrix}            1 & V_{11}^{-1}V_{12} \\  0 & 1 \\  \end{pmatrix}
V=LDU
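The same kind of check works for this ordering (again with made-up block sizes and a random test matrix). For a multivariate normal with variance matrix V partitioned this way, the diagonal block V_{22.1} is the conditional covariance of the second block of variables given the first, which is the use case mentioned at the start.

```python
import numpy as np

rng = np.random.default_rng(2)
p, q = 2, 3                                    # illustrative block sizes
A = rng.standard_normal((p + q, p + q))
V = A @ A.T + (p + q) * np.eye(p + q)          # random symmetric positive-definite test matrix
V11, V12 = V[:p, :p], V[:p, p:]
V21, V22 = V[p:, :p], V[p:, p:]

# Schur complement w.r.t. V11; for a multivariate normal with covariance V,
# this is the conditional covariance of the second block given the first.
V22_1 = V22 - V21 @ np.linalg.solve(V11, V12)

# LDU factors: unit lower triangular, block diagonal, unit upper triangular.
L = np.block([[np.eye(p), np.zeros((p, q))],
              [V21 @ np.linalg.inv(V11), np.eye(q)]])
D = np.block([[V11, np.zeros((p, q))],
              [np.zeros((q, p)), V22_1]])
U = np.block([[np.eye(p), np.linalg.inv(V11) @ V12],
              [np.zeros((q, p)), np.eye(q)]])

print(np.allclose(L @ D @ U, V))               # True
```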