Chapter 2: Matrix Algebra (Lay, Linear Algebra, 6th ed.)
2.1 Operations with Matrices
- Addition and Subtraction
- Multiplication by a scalar
- Multiplication by another matrix
Addition and Subtraction
Q. What does it mean for two matrices to be equal?
A. It means they are the same size and have the EXACT same entries.
We can only add and subtract matrices that are the same size.
Q. How do we add matrices?
A. We add corresponding entries.
\[\begin{bmatrix} a & b\\ c & d \end{bmatrix} + \begin{bmatrix} e & f\\ g & h \end{bmatrix} = \begin{bmatrix} (a+e) & (b+f)\\ (c+g) & (d+h) \end{bmatrix}\]
What about standard addition properties? Matrix addition is:
- Commutative: \(A+B=B+A\)
- Associative: \((A+B)+C = A+(B+C)\)
Multiplication by a scalar
Multiply every entry in the matrix by the number.
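The addition properties and scalar multiplication above can be checked numerically; a minimal sketch using numpy (the matrices chosen here are just illustrative):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
C = np.array([[0, 1], [1, 0]])

# Addition is entrywise, commutative, and associative.
assert np.array_equal(A + B, np.array([[6, 8], [10, 12]]))
assert np.array_equal(A + B, B + A)
assert np.array_equal((A + B) + C, A + (B + C))

# Scalar multiplication multiplies every entry by the number.
assert np.array_equal(3 * A, np.array([[3, 6], [9, 12]]))
```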
Multiplication of two matrices
A \(1 \times n\) matrix multiplied by an \(n \times 1\) matrix is the \(1 \times 1\) matrix given by:
\[\begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix} \begin{bmatrix} b_1 \\ b_2 \\ \vdots\\ b_n \end{bmatrix} = \begin{bmatrix} a_1 b_1 + a_2 b_2 + \cdots + a_n b_n \end{bmatrix}\]
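This row-times-column product can be sketched in numpy (specific vectors chosen for illustration):

```python
import numpy as np

# A 1 x n row vector times an n x 1 column vector gives a 1 x 1 matrix
# containing the sum a1*b1 + a2*b2 + ... + an*bn.
a = np.array([[1, 2, 3]])        # 1 x 3
b = np.array([[4], [5], [6]])    # 3 x 1

product = a @ b                  # 1*4 + 2*5 + 3*6 = 32
assert product.shape == (1, 1)
assert product[0, 0] == 32
```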
Larger Matrices
If \(A\) is an \(n \times p\) matrix and \(B\) is a \(p \times m\) matrix, then the matrix product of \(A\) and \(B\), \(AB\), is an \(n \times m\) matrix whose entry in the \(i^{\text{th}}\) row and \(j^{\text{th}}\) column is the real number obtained by multiplying the \(i^{\text{th}}\) row of \(A\) by the \(j^{\text{th}}\) column of \(B\).
THE NUMBER OF COLUMNS OF \(A\) MUST BE THE SAME AS THE NUMBER OF ROWS OF \(B\).
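A quick numerical sketch of the general product and the dimension requirement (matrices chosen for illustration):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])        # 2 x 3
B = np.array([[1, 0],
              [0, 1],
              [1, 1]])           # 3 x 2

# Entry (i, j) of AB is row i of A times column j of B.
AB = A @ B                       # 2 x 2
assert AB.shape == (2, 2)
assert np.array_equal(AB, np.array([[4, 5], [10, 11]]))

# Multiplying incompatible shapes (2x3 times 2x3) fails, because the
# number of columns of the left factor must equal the number of rows
# of the right factor.
try:
    A @ A
    compatible = True
except ValueError:
    compatible = False
assert not compatible
```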
The Transpose of a Matrix
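The notes leave this section blank; the standard definition is that the transpose \(A^T\) swaps rows and columns, so the \((i,j)\) entry of \(A^T\) is the \((j,i)\) entry of \(A\). A minimal numpy sketch:

```python
import numpy as np

# The transpose swaps rows and columns: an m x n matrix becomes n x m,
# and the (i, j) entry of A.T is the (j, i) entry of A.
A = np.array([[1, 2, 3],
              [4, 5, 6]])        # 2 x 3

At = A.T                         # 3 x 2
assert At.shape == (3, 2)
assert np.array_equal(At, np.array([[1, 4], [2, 5], [3, 6]]))

# A useful identity: (AB)^T = B^T A^T (note the reversed order).
B = np.array([[1, 0], [0, 1], [1, 1]])
assert np.array_equal((A @ B).T, B.T @ A.T)
```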
2.2 The Inverse of a Matrix
Recall: The Identity Matrix is a square matrix with 1’s along the main diagonal and zeros everywhere else.
We call it the identity matrix because it behaves like the number 1 in multiplication: \(AI = IA = A\) for any matrix \(A\) of compatible size.
Matrix Inverses
\(\begin{bmatrix} 2 & 3\\ 1 & 2 \end{bmatrix}\begin{bmatrix} 2 & -3\\ -1 & 2 \end{bmatrix} =\)
NOT ALL SQUARE MATRICES HAVE INVERSES. For example \(\begin{bmatrix} 2 & 1\\ 4 & 2 \end{bmatrix}\)
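Both claims above can be checked numerically; a sketch using numpy:

```python
import numpy as np

A = np.array([[2, 3], [1, 2]])
B = np.array([[2, -3], [-1, 2]])

# The product is the identity in both orders, so B is the inverse of A.
I = np.eye(2)
assert np.array_equal(A @ B, I)
assert np.array_equal(B @ A, I)

# The matrix below is singular: its second row is twice its first,
# so its determinant is 2*2 - 1*4 = 0 and no inverse exists.
S = np.array([[2, 1], [4, 2]])
assert abs(np.linalg.det(S)) < 1e-12
```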
Q. How do we know if an inverse exists for \(A\), and how do we find one if it does?
A. We perform Gauss-Jordan elimination on the augmented matrix \(\left[ ~ A~ | ~ I ~\right]\) until it looks like \(\left[ ~I~ | ~ A^{-1} ~\right]\). If the left half cannot be reduced to \(I\) (a row of zeros appears), no inverse exists.
If a matrix does NOT have an inverse we call it a singular matrix.
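The row reduction of \([A \,|\, I]\) described above can be sketched directly in code. This is a from-scratch illustration of the method, not a production implementation (real libraries use LU factorization):

```python
import numpy as np

def inverse_by_gauss_jordan(A):
    """Row reduce [A | I] to [I | A^{-1}]; return None if A is singular."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # augmented matrix [A | I]
    for col in range(n):
        # Pick the largest-magnitude pivot in this column (partial pivoting).
        pivot = max(range(col, n), key=lambda r: abs(M[r, col]))
        if abs(M[pivot, col]) < 1e-12:
            return None                           # singular: no inverse
        M[[col, pivot]] = M[[pivot, col]]         # swap rows
        M[col] /= M[col, col]                     # scale pivot row so pivot is 1
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]        # clear the rest of the column
    return M[:, n:]                               # right half is now A^{-1}

A = np.array([[2, 3], [1, 2]])
A_inv = inverse_by_gauss_jordan(A)
assert np.allclose(A_inv, [[2, -3], [-1, 2]])

# The singular example from above correctly has no inverse.
assert inverse_by_gauss_jordan(np.array([[2, 1], [4, 2]])) is None
```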
Row reduce this matrix:
\(\begin{bmatrix} 1 & 0 & | & 1 & 0\\ -3 & 1 & | & 0 & 1 \end{bmatrix}\)
Row reduce this matrix:
\(\begin{bmatrix} 1 & 5 & 10 & | & 1 & 0& 0\\ 0 & 1 & 4 & | & 0 & 1 & 0\\ 1 & 6 & 15& | & 0 & 0 & 1 \end{bmatrix}\)
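One way to check answers to row-reduction exercises like the two above is to compute the inverse directly with numpy and confirm \(A A^{-1} = I\):

```python
import numpy as np

# First exercise: A = [[1, 0], [-3, 1]].
A1 = np.array([[1, 0], [-3, 1]])
assert np.allclose(np.linalg.inv(A1), [[1, 0], [3, 1]])

# Second exercise: the 3 x 3 matrix on the left of the augmented matrix.
A2 = np.array([[1, 5, 10],
               [0, 1, 4],
               [1, 6, 15]])
A2_inv = np.linalg.inv(A2)
assert np.allclose(A2 @ A2_inv, np.eye(3))
```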
Solving a matrix equation
Suppose we have a system of equations:
\[\begin{array}{rcl} a_1 x_1 + a_2 x_2 + a_3 x_3&=&a_4 \\ b_1 x_1 + b_2 x_2 + b_3 x_3&=&b_4 \\ c_1 x_1 + c_2 x_2 + c_3 x_3&=&c_4 \\ \end{array}\]
where \(a_i, ~b_i\), and \(c_i\) are real numbers and \(x_1, x_2, x_3\) are variables. Then we can write the coefficient matrix:
\[A= \begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix}\]
and (IF it exists) we can find the inverse matrix \(A^{-1}\).
The original system can be written in matrix form:
\[\underbrace{\begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix}}_{A} ~ \underbrace{\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}}_{X} = \underbrace{\begin{bmatrix} a_4\\ b_4\\ c_4 \end{bmatrix}}_{b}\]
and we end up with an equation of the form \(A X = b\). If this were an algebraic equation where \(A\) and \(b\) were numbers we could easily solve this by dividing on both sides by \(A\). WE CAN’T divide matrices.
What we can do with matrices is to multiply by the inverse of \(A\). Then we get something that looks like this:
\[\begin{align*} A X &= b\\ A^{-1} [A X] &= A^{-1} b\\ I X &= A^{-1} b\\ X &= A^{-1} b \end{align*}\]
The nice thing about solving an equation this way is that now we can easily solve many problems that have the same \(A\) but different \(b\) with one simple matrix multiplication.
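A sketch of this reuse pattern in numpy (the system here is made up for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
A_inv = np.linalg.inv(A)          # compute the inverse once

# With A^{-1} in hand, each new right-hand side b costs only one
# matrix-vector multiplication: X = A^{-1} b.
b1 = np.array([3.0, 5.0, 3.0])
b2 = np.array([1.0, 0.0, 1.0])
x1 = A_inv @ b1
x2 = A_inv @ b2
assert np.allclose(A @ x1, b1)
assert np.allclose(A @ x2, b2)
```

In numerical practice one usually factors \(A\) once (e.g. an LU factorization via `scipy.linalg.lu_factor`) rather than forming \(A^{-1}\) explicitly, but the same "pay once, solve many" idea applies.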
Elementary Matrices
An elementary matrix is obtained by performing a single elementary row operation on an identity matrix. Multiplying \(A\) on the left by an elementary matrix performs that same row operation on \(A\).
2.3 Characterizations of Invertible Matrices
The Invertible Matrix Theorem is an extremely important theorem that connects many concepts in linear algebra! Let \(A\) be an \(n \times n\) matrix; each implication below follows from the theorem.
- If the equation \(Ax=0\) has only the trivial solution, then \(A\) is row equivalent to \(I_n\).
- If the columns of \(A\) span \(\mathbb{R}^n\), then the columns are linearly independent.
- If the equation \(Ax=0\) has a nontrivial solution, then \(A\) has fewer than \(n\) pivot positions.
- If \(A^T\) is not invertible, then \(A\) is not invertible.
- If there is an \(n \times n\) matrix \(D\) such that \(AD = I\), then there is also an \(n \times n\) matrix \(C\) such that \(CA=I\).
- If the columns of \(A\) are linearly independent, then the columns of \(A\) span \(\mathbb{R}^n\).
- If the equation \(Ax=b\) has at least one solution for each \(b \in \mathbb{R}^n\), then the solution is unique for each \(b\).
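The contrast the theorem draws can be illustrated numerically with the two \(2 \times 2\) matrices used earlier in these notes (invertibility checked via rank, which counts pivot positions):

```python
import numpy as np

A = np.array([[2.0, 3.0], [1.0, 2.0]])       # invertible (det = 1)
S = np.array([[2.0, 1.0], [4.0, 2.0]])       # singular (det = 0)

# Invertible: n pivot positions, so Ax = 0 has only the trivial solution.
assert np.linalg.matrix_rank(A) == 2

# Singular: fewer than n pivots, so Sx = 0 has nontrivial solutions,
# for example x = (1, -2).
assert np.linalg.matrix_rank(S) == 1
assert np.allclose(S @ np.array([1.0, -2.0]), 0)
```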