Monday, August 18, 2014

How to calculate the determinant of a matrix

How to Compute Determinants:


We're including the full definitions here for completeness.
We define the determinant of a $ 2 \times 2$ matrix this way:

$\displaystyle \det\left(\begin{array}{cc} a & b \\ c & d \end{array}\right) = \left\vert\begin{array}{cc} a & b \\ c & d \end{array}\right\vert = ad - bc.$
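
For example, here is a quick worked check of ours (not part of the original definition):

$\displaystyle \left\vert\begin{array}{cc} 1 & 2 \\ 3 & 4 \end{array}\right\vert = (1)(4) - (2)(3) = 4 - 6 = -2.$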

Then we use our definition of the determinant of a $ 2 \times 2$ matrix to define the determinant of a $ 3 \times 3$ matrix:

$\displaystyle \det(A) = \det\left(\begin{array}{ccc} a & b & c \\ d & e & f \\ g & h & k \end{array}\right) = a \left\vert\begin{array}{cc} e & f \\ h & k \end{array}\right\vert - b \left\vert\begin{array}{cc} d & f \\ g & k \end{array}\right\vert + c \left\vert\begin{array}{cc} d & e \\ g & h \end{array}\right\vert.$

In other words, we go across the first row $(a \; b \; c)$ of the matrix $A$. We multiply each entry by the determinant of the $2 \times 2$ matrix we get from $A$ by crossing out the row and column containing that entry. (Try this: if you take $A$ and cross out the row and column containing $b$ (the first row and the second column), you get the matrix $\left(\begin{array}{cc} d & f \\ g & k \end{array}\right)$; this is the matrix whose determinant we multiplied $b$ by in computing the determinant of $A$.) Then we add and subtract the resulting terms with alternating signs: add the $a$-term, subtract the $b$-term, add the $c$-term.
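
To make the procedure concrete, here is a minimal Python sketch of the two formulas above; the function names det2 and det3 are our own, not from any library:

```python
def det2(m):
    """Determinant of a 2x2 matrix [[a, b], [c, d]], computed as ad - bc."""
    (a, b), (c, d) = m
    return a * d - b * c

def det3(m):
    """Determinant of a 3x3 matrix by expanding along the first row."""
    (a, b, c), (d, e, f), (g, h, k) = m
    # Multiply each first-row entry by the determinant of the 2x2 matrix
    # left after crossing out its row and column, alternating the signs.
    return (a * det2([[e, f], [h, k]])
            - b * det2([[d, f], [g, k]])
            + c * det2([[d, e], [g, h]]))

# Example: expanding along the first row gives
# 1*(5*10 - 6*8) - 2*(4*10 - 6*7) + 3*(4*8 - 5*7) = 2 + 4 - 9 = -3.
print(det3([[1, 2, 3], [4, 5, 6], [7, 8, 10]]))  # prints -3
```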