Chapter 2: Matrix Algebra (Lay, Linear Algebra, 6th ed.)

Author: Chalmeta

2.1 Operations with Matrices

  • Addition and Subtraction
  • Multiplication by a scalar
  • Multiplication by another matrix

Addition and Subtraction

Q. What does it mean for two matrices to be equal?

A. It means they are the same size and have the EXACT same entries.

We can only add and subtract matrices that are the same size.

Q. How do we add matrices?

A. We add corresponding entries.

\[\begin{bmatrix} a & b\\ c & d \end{bmatrix} + \begin{bmatrix} e & f\\ g & h \end{bmatrix} = \begin{bmatrix} (a+e) & (b+f)\\ (c+g) & (d+h) \end{bmatrix}\]

Example 1

\[\begin{bmatrix} 3 & 2\\ -1 & -1\\ 0 & 3 \end{bmatrix} + \begin{bmatrix} -2 & 3\\ 1 & -1\\ 2 & -2 \end{bmatrix} =\]
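For reference, adding the corresponding entries should give

\[\begin{bmatrix} 3 & 2\\ -1 & -1\\ 0 & 3 \end{bmatrix} + \begin{bmatrix} -2 & 3\\ 1 & -1\\ 2 & -2 \end{bmatrix} = \begin{bmatrix} 3+(-2) & 2+3\\ -1+1 & -1+(-1)\\ 0+2 & 3+(-2) \end{bmatrix} = \begin{bmatrix} 1 & 5\\ 0 & -2\\ 2 & 1 \end{bmatrix}\]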

What about standard addition properties? Matrix addition is:

  1. Commutative: \(A+B=B+A\)
  2. Associative: \((A+B)+C = A+(B+C)\)

Definition

The Zero Matrix is a matrix with all entries zero. We often use 0 to represent it.

Example 2

\[\begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix} + 0 = \begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix}\]

\[\begin{bmatrix} 1 & 2 & 3 & 4 \end{bmatrix} + 0 = \begin{bmatrix} 1 & 2 & 3 & 4 \end{bmatrix}\]

Multiplication by a scalar

To multiply a matrix by a scalar, multiply every entry in the matrix by that number.

Example 3

\[5 \begin{bmatrix} -7 & 3 & 0 & 9\\ 4 & -5 & 6 & 2 \end{bmatrix} = \begin{bmatrix} 5(-7) & 5(3) & 5(0) & 5(9)\\ 5(4) & 5(-5) & 5(6) & 5(2) \end{bmatrix}\]

Example 4

\(A= \begin{bmatrix} 3 & 2 & 0\\ -1 & 4 & -6 \end{bmatrix}\) and \(B= \begin{bmatrix} 5 & -2 & 7\\ 1 & 3 & -9 \end{bmatrix}\)

Find \(-2A -B\) and \(4B - A\)
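For reference, one way to organize the work is to scale first and then add or subtract entry by entry; you should end up with

\[-2A - B = \begin{bmatrix} -6-5 & -4+2 & 0-7\\ 2-1 & -8-3 & 12+9 \end{bmatrix} = \begin{bmatrix} -11 & -2 & -7\\ 1 & -11 & 21 \end{bmatrix}, \qquad 4B - A = \begin{bmatrix} 17 & -10 & 28\\ 5 & 8 & -30 \end{bmatrix}\]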

Multiplication of two matrices

A \(1 \times n\) matrix multiplied by an \(n \times 1\) matrix is the \(1 \times 1\) matrix given by:

\[\begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix} \begin{bmatrix} b_1 \\ b_2 \\ \vdots\\ b_n \end{bmatrix} = \begin{bmatrix} a_1 b_1 + a_2 b_2 + \cdots + a_n b_n \end{bmatrix}\]

Example 5

\[\begin{bmatrix} -1 & 0 & 3 & 2 \end{bmatrix} \begin{bmatrix} 2\\ 3\\ 4\\ -1 \end{bmatrix} = \begin{bmatrix} (-1)(2) + (0)(3) + (3)(4) +(2)(-1) \end{bmatrix}\]
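Carrying out the arithmetic, this works out to

\[= \begin{bmatrix} -2 + 0 + 12 - 2 \end{bmatrix} = \begin{bmatrix} 8 \end{bmatrix}\]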

Larger Matrices

If \(A\) is an \(n \times p\) matrix and \(B\) is a \(p \times m\) matrix, then the matrix product of \(A\) and \(B\), \(AB\), is an \(n \times m\) matrix whose entry in the \(i^{\text{th}}\) row and \(j^{\text{th}}\) column is the real number obtained by multiplying the \(i^{\text{th}}\) row of \(A\) by the \(j^{\text{th}}\) column of \(B\).

THE NUMBER OF COLUMNS OF \(A\) MUST BE THE SAME AS THE NUMBER OF ROWS OF \(B\).

Example 6

\[\begin{bmatrix} -1 & 1\\ 2 & 3\\ 1 & 0 \end{bmatrix} \begin{bmatrix} -1 & 0 & 3 & -2 \\ 1 & 2 & 2 & 0 \end{bmatrix}\]

\[= \begin{bmatrix} \begin{bmatrix} -1 & 1 \end{bmatrix}\begin{bmatrix} -1 \\ 1 \end{bmatrix} & \begin{bmatrix} -1 & 1 \end{bmatrix}\begin{bmatrix} 0 \\ 2 \end{bmatrix} & \begin{bmatrix} -1 & 1 \end{bmatrix}\begin{bmatrix} 3 \\ 2 \end{bmatrix} & \begin{bmatrix} -1 & 1 \end{bmatrix}\begin{bmatrix} -2 \\ 0 \end{bmatrix} \\ \\ \begin{bmatrix} 2 & 3 \end{bmatrix}\begin{bmatrix} -1 \\ 1 \end{bmatrix} & \begin{bmatrix} 2 & 3 \end{bmatrix}\begin{bmatrix} 0 \\ 2 \end{bmatrix} & \begin{bmatrix} 2 & 3 \end{bmatrix}\begin{bmatrix} 3 \\ 2 \end{bmatrix} & \begin{bmatrix} 2 & 3 \end{bmatrix}\begin{bmatrix} -2 \\ 0 \end{bmatrix}\\ \\ \begin{bmatrix} 1 & 0 \end{bmatrix}\begin{bmatrix} -1 \\ 1 \end{bmatrix} & \begin{bmatrix} 1 & 0 \end{bmatrix}\begin{bmatrix} 0 \\ 2 \end{bmatrix} & \begin{bmatrix} 1 & 0 \end{bmatrix}\begin{bmatrix} 3 \\ 2 \end{bmatrix} & \begin{bmatrix} 1 & 0 \end{bmatrix}\begin{bmatrix} -2 \\ 0 \end{bmatrix} \end{bmatrix}\]
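Evaluating each row–column product should give the \(3 \times 4\) matrix

\[= \begin{bmatrix} 2 & 2 & -1 & 2\\ 1 & 6 & 12 & -4\\ -1 & 0 & 3 & -2 \end{bmatrix}\]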

Example 7

\(A = \begin{bmatrix} 3 & 2 & 0\\ -1 & 4 & -6 \end{bmatrix}\), \(B= \begin{bmatrix} 5 & -2\\ 1 & 3 \end{bmatrix}\)

Find \(BA\) and \(AB\)
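Check the sizes first: \(B\) is \(2 \times 2\) and \(A\) is \(2 \times 3\), so \(BA\) is defined and is \(2 \times 3\). You should get

\[BA = \begin{bmatrix} 5 & -2\\ 1 & 3 \end{bmatrix}\begin{bmatrix} 3 & 2 & 0\\ -1 & 4 & -6 \end{bmatrix} = \begin{bmatrix} 17 & 2 & 12\\ 0 & 14 & -18 \end{bmatrix}\]

On the other hand, \(AB\) is not defined: \(A\) has 3 columns but \(B\) has only 2 rows.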

Theorem: Properties of Matrix Multiplication

Let \(A\) be an \(m \times n\) matrix, and let \(B\) and \(C\) have sizes for which the indicated sums and products are defined.

  1. \(A(BC) = (AB)C\) (associative law of multiplication)

  2. \(A(B+C) = AB+AC\) (left distributive law)

  3. \((B+C)A = BA + CA\) (right distributive law)

  4. \(r(AB) = (rA)B = A(rB)\) for any scalar \(r\)

  5. \(I_m A = A = A I_n\) (Identity for matrix multiplication)

Example 8

\(A = \begin{bmatrix} 2 & -3\\ -4 & 6 \end{bmatrix}\), \(B= \begin{bmatrix} 8 & 4\\ 5 & 5 \end{bmatrix}\), \(C= \begin{bmatrix} 5 & -2\\ 3 & 1 \end{bmatrix}\), \(D= \begin{bmatrix} 15 & 9\\ 10 & 6 \end{bmatrix}\)

Find \(AC\), \(AB\), \(BA\) and \(AD\)
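For reference, the products should come out to

\[AC = \begin{bmatrix} 1 & -7\\ -2 & 14 \end{bmatrix}, \quad AB = \begin{bmatrix} 1 & -7\\ -2 & 14 \end{bmatrix}, \quad BA = \begin{bmatrix} 0 & 0\\ -10 & 15 \end{bmatrix}, \quad AD = \begin{bmatrix} 0 & 0\\ 0 & 0 \end{bmatrix}\]

Notice what these results show: \(AB = AC\) even though \(B \neq C\) (there is no cancellation law), \(AB \neq BA\) (multiplication is not commutative), and \(AD = 0\) even though neither \(A\) nor \(D\) is the zero matrix.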

Example 9

Let \(A = \begin{bmatrix} 4 & -2 \\ 1 & 6 \end{bmatrix}\), \(B = \begin{bmatrix} 2 & 0 \\ 5 & -3 \end{bmatrix}\), \(C = \begin{bmatrix} -2 & 3 \\ 0 & 1 \end{bmatrix}\)

Verify that \(AB \neq BA\) and \(A(BC) = (AB)C\)
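If you carry out the multiplications, you should find

\[AB = \begin{bmatrix} -2 & 6\\ 32 & -18 \end{bmatrix}, \qquad BA = \begin{bmatrix} 8 & -4\\ 17 & -28 \end{bmatrix},\]

so \(AB \neq BA\), while both \(A(BC)\) and \((AB)C\) work out to \(\begin{bmatrix} 4 & 0\\ -64 & 78 \end{bmatrix}\).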

The Transpose of a Matrix

Definition

Given an \(m \times n\) matrix \(A\), the transpose of \(A\) is the \(n \times m\) matrix, denoted by \(A^T\), whose columns are formed from the corresponding rows of \(A\).

Theorem: Properties of Matrix Transpose

Let \(A\) and \(B\) have sizes for which the indicated sums and products are defined.

  1. \((A^T)^T = A\)

  2. \((A+B)^T=A^T +B^T\)

  3. For any scalar \(r\), \((rA)^T = rA^T\)

  4. \((AB)^T = B^TA^T\)

Example 10

\(A = \begin{bmatrix} 1 & -3\\ -2 & 4 \end{bmatrix}\), \(x= \begin{bmatrix} 5 \\ 3 \end{bmatrix}\), \(B = \begin{bmatrix} 1 & -3 & 7\\ -2 & 4 & -2 \end{bmatrix}\)

Find \((Ax)^T\), \(x^TA^T\), \(xx^T\), and \(x^Tx\).

Which of \(A^Tx^T\), \((AB)^T\), \(A^TB^T\), and \(B^TA^T\) can you calculate, and what are the results?
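For reference, \(Ax = \begin{bmatrix} -4\\ 2 \end{bmatrix}\), so you should get

\[(Ax)^T = \begin{bmatrix} -4 & 2 \end{bmatrix}, \quad x^TA^T = \begin{bmatrix} -4 & 2 \end{bmatrix}, \quad xx^T = \begin{bmatrix} 25 & 15\\ 15 & 9 \end{bmatrix}, \quad x^Tx = \begin{bmatrix} 34 \end{bmatrix}\]

For the last question, compare sizes: \(A^Tx^T\) and \(A^TB^T\) are not defined (the inner dimensions do not match), while \((AB)^T\) and \(B^TA^T\) are both defined, are \(3 \times 2\), and are equal by property 4 above.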

2.2 The Inverse of a Matrix

Recall: The Identity Matrix is a square matrix with 1’s along the main diagonal and zeros everywhere else.

Example 13

\[I = \begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix}, \qquad I = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0\\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1 \end{bmatrix}\]

We call it the identity matrix because it behaves like 1 in multiplication.

Example 14

\[\begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix} \begin{bmatrix} a & b\\ c & d \end{bmatrix} = \begin{bmatrix} a & b\\ c & d \end{bmatrix} \qquad \begin{bmatrix} a & b\\ c & d \end{bmatrix}\begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix} = \begin{bmatrix} a & b\\ c & d \end{bmatrix}\]

Example 15

\[\begin{bmatrix} a & b & c\\ d & e & f \end{bmatrix}\begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} a & b & c\\ d & e & f \end{bmatrix}\]

Matrix Inverses

Definition

If \(M\) is a square matrix and if there exists \(M^{-1}\) such that \[MM^{-1} = I \text{ and } M^{-1}M = I\] then \(M^{-1}\) is the Multiplicative Inverse of \(M\). We often simply call it “The Inverse” of \(M\).

Example 16

\(A =\begin{bmatrix} 2 & 3\\ 1 & 2 \end{bmatrix}\) and \(A^{-1} =\begin{bmatrix} 2 & -3\\ -1 & 2 \end{bmatrix}\). Show that these are inverses of each other.

\(\begin{bmatrix} 2 & 3\\ 1 & 2 \end{bmatrix}\begin{bmatrix} 2 & -3\\ -1 & 2 \end{bmatrix} =\)
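Multiplying out (and then repeating with the factors in the other order), both products should reduce to the identity:

\[\begin{bmatrix} 2 & 3\\ 1 & 2 \end{bmatrix}\begin{bmatrix} 2 & -3\\ -1 & 2 \end{bmatrix} = \begin{bmatrix} 4-3 & -6+6\\ 2-2 & -3+4 \end{bmatrix} = \begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 2 & -3\\ -1 & 2 \end{bmatrix}\begin{bmatrix} 2 & 3\\ 1 & 2 \end{bmatrix}\]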

NOT ALL SQUARE MATRICES HAVE INVERSES. For example, \(\begin{bmatrix} 2 & 1\\ 4 & 2 \end{bmatrix}\) has no inverse: its second row is twice its first row, so the same is true of \(\begin{bmatrix} 2 & 1\\ 4 & 2 \end{bmatrix}M\) for any matrix \(M\), and such a product can never equal \(I\).

Q. How do we know whether an inverse exists for \(A\), and how do we find it if it does?

A. We perform Gauss-Jordan elimination on the augmented matrix \(\left[ ~ A~ | ~ I ~\right]\) until it looks like \(\left[ ~I~ | ~ A^{-1} ~\right]\).

If a matrix does NOT have an inverse we call it a singular matrix.

Example 17

\(A =\begin{bmatrix} 1 & 0\\ -3 & 1 \end{bmatrix}\) Find \(A^{-1}\)

Row reduce this matrix:

\(\begin{bmatrix} 1 & 0 & | & 1 & 0\\ -3 & 1 & | & 0 & 1 \end{bmatrix}\)
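A single row operation finishes this one: replacing \(R_2\) with \(R_2 + 3R_1\) should give

\[\begin{bmatrix} 1 & 0 & | & 1 & 0\\ 0 & 1 & | & 3 & 1 \end{bmatrix}, \qquad \text{so} \qquad A^{-1} = \begin{bmatrix} 1 & 0\\ 3 & 1 \end{bmatrix}\]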

Theorem: 2 × 2 Matrix Inverse

Let \(A = \begin{bmatrix} a & b\\ c & d \end{bmatrix}\) and define the determinant of \(A\) as \(\det A = ad-bc\). If \(\det A \neq 0\) then \(A\) is invertible and \[A^{-1} = \frac{1}{\det A} \begin{bmatrix} d & -b\\ -c & a \end{bmatrix}\]

If \(\det A=0\) the matrix is not invertible.

Example 18

Find the inverse of \(A = \begin{bmatrix} 3 & 9\\ 2 & 6 \end{bmatrix}\)
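Before row reducing, it is worth checking the determinant here:

\[\det A = (3)(6) - (9)(2) = 18 - 18 = 0,\]

so by the theorem above, \(A\) is not invertible (it is singular); there is no inverse to find.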

Example 19

\(A =\begin{bmatrix} 1 & 5 & 10\\ 0 & 1 & 4\\ 1 & 6 & 15 \end{bmatrix}\) Find \(A^{-1}\)

Row reduce this matrix:

\(\begin{bmatrix} 1 & 5 & 10 & | & 1 & 0& 0\\ 0 & 1 & 4 & | & 0 & 1 & 0\\ 1 & 6 & 15& | & 0 & 0 & 1 \end{bmatrix}\)
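One possible route is \(R_3 \to R_3 - R_1\), then \(R_3 \to R_3 - R_2\), then use the new third row to clear the third column, and finally clear the second column. However you order the row operations, the right half should finish as

\[A^{-1} = \begin{bmatrix} -9 & -15 & 10\\ 4 & 5 & -4\\ -1 & -1 & 1 \end{bmatrix}\]

You can confirm the answer by checking that \(AA^{-1} = I\).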

Example 20

\(M =\begin{bmatrix} 3 & -1 & 1\\ -1 & 1 & 0\\ 1 & 0 & 1 \end{bmatrix}\). Find \(M^{-1}\). You should end up with \(M^{-1} =\begin{bmatrix} 1 & 1 & -1\\ 1 & 2 & -1\\ -1 & -1 & 2 \end{bmatrix}\).

Solving a matrix equation

Suppose we have a system of equations:

\[\begin{array}{rcl} a_1 x_1 + a_2 x_2 + a_3 x_3&=&a_4 \\ b_1 x_1 + b_2 x_2 + b_3 x_3&=&b_4 \\ c_1 x_1 + c_2 x_2 + c_3 x_3&=&c_4 \\ \end{array}\]

where \(a_i, ~b_i\), and \(c_i\) are real numbers and \(x_1, x_2, x_3\) are variables. Then we can write the coefficient matrix:

\[A= \begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix}\]

and (IF it exists) we can find the inverse matrix \(A^{-1}\).

The original system can be written in matrix form:

\[\underbrace{\begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix}}_{A} ~ \underbrace{\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}}_{X} = \underbrace{\begin{bmatrix} a_4\\ b_4\\ c_4 \end{bmatrix}}_{b}\]

and we end up with an equation of the form \(A X = b\). If this were an algebraic equation in which \(A\) and \(b\) were numbers, we could solve it by dividing both sides by \(A\). WE CAN’T divide matrices.

What we can do with matrices is multiply both sides, on the left, by the inverse of \(A\). Then we get:

\[\begin{align*} A X &= b\\ A^{-1} [A X] &= A^{-1} b\\ I X &= A^{-1} b\\ X &= A^{-1} b \end{align*}\]

The nice thing about solving an equation this way is that once \(A^{-1}\) is known, we can solve many systems that have the same \(A\) but different \(b\) with a single matrix multiplication.

Example 21

Solve \[\begin{array}{rcl} 3x_1 - x_2 + x_3&=&1 \\ -x_1 + x_2&=&1 \\ x_1 + x_3 &=&1 \end{array}\]

by writing the equation in matrix form as \(A X = b\) and multiplying by \(A^{-1}\).
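Note that the coefficient matrix here is exactly \(M\) from Example 20, so \(A^{-1} = M^{-1}\) is already available. You should get

\[X = A^{-1}b = \begin{bmatrix} 1 & 1 & -1\\ 1 & 2 & -1\\ -1 & -1 & 2 \end{bmatrix}\begin{bmatrix} 1\\ 1\\ 1 \end{bmatrix} = \begin{bmatrix} 1\\ 2\\ 0 \end{bmatrix},\]

that is, \(x_1 = 1\), \(x_2 = 2\), \(x_3 = 0\). Substituting back into the original equations checks out.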

Example 22

Solve \[\begin{array}{rcl} 3x_1 - x_2 + x_3&=&1 \\ -x_1 + x_2&=&2 \\ x_1 + x_3 &=&-3 \end{array}\]
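This system has the same coefficient matrix as Example 21; only \(b\) has changed, so the same inverse can be reused:

\[X = A^{-1}b = \begin{bmatrix} 1 & 1 & -1\\ 1 & 2 & -1\\ -1 & -1 & 2 \end{bmatrix}\begin{bmatrix} 1\\ 2\\ -3 \end{bmatrix} = \begin{bmatrix} 6\\ 8\\ -9 \end{bmatrix}\]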

Theorem: Properties of Invertible Matrices

  1. If \(A\) is an invertible matrix, then \(A^{-1}\) is invertible and \[(A^{-1})^{-1} = A\]

  2. If \(A\) and \(B\) are \(n \times n\) invertible matrices, then so is \(AB\), and the inverse of \(AB\) is the product of the inverses of \(A\) and \(B\) in reverse order: \[(AB)^{-1} = B^{-1}A^{-1}\]

  3. If \(A\) is an invertible matrix, then so is \(A^T\), and the inverse of \(A^T\) is the transpose of \(A^{-1}\): \[(A^T)^{-1} = (A^{-1})^T\]

Elementary Matrices

Definition

An elementary matrix is one that is obtained by performing a single elementary row operation on an identity matrix.

Example 23

Find the products \(E_1A\) and \(E_2A\) and identify the corresponding row operations, where:

\[E_1 = \begin{bmatrix} 0 & 0 & 1\\ 0 & 1 & 0\\ 1 & 0 & 0 \end{bmatrix}, \quad E_2 = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 4\\ 0 & 0 & 1 \end{bmatrix}, \quad \text{and} \quad A = \begin{bmatrix} a & b & c \\ d & e & f \\ g & h & i \end{bmatrix}\]
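Carrying out the multiplications, you should see each \(E_i\) acting on the rows of \(A\):

\[E_1A = \begin{bmatrix} g & h & i\\ d & e & f\\ a & b & c \end{bmatrix} \quad (\text{interchange } R_1 \text{ and } R_3), \qquad E_2A = \begin{bmatrix} a & b & c\\ d+4g & e+4h & f+4i\\ g & h & i \end{bmatrix} \quad (R_2 \to R_2 + 4R_3)\]

In each case the product performs on \(A\) the same row operation that was used to build \(E_i\) from the identity matrix.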

Theorem: Invertible matrices are row equivalent to \(I_n\)

An \(n \times n\) matrix \(A\) is invertible if and only if \(A\) is row equivalent to \(I_n\), and in this case, any sequence of elementary row operations that reduces \(A\) to \(I_n\) also transforms \(I_n\) into \(A^{-1}\).

2.3 Characterizations of Invertible Matrices

The Invertible Matrix Theorem

Let \(A\) be a square \(n \times n\) matrix. Then the following statements are equivalent. That is, for a given \(A\), the statements are either all true or all false.

  1. \(A\) is an invertible matrix.

  2. \(A\) is row equivalent to the \(n \times n\) identity matrix.

  3. \(A\) has \(n\) pivot positions.

  4. The equation \(Ax = 0\) has only the trivial solution.

  5. The columns of \(A\) form a linearly independent set.

  6. The linear transformation \(x \mapsto Ax\) is one-to-one.

  7. The equation \(Ax = b\) has at least one solution for each \(b\) in \(\mathbb{R}^n\).

  8. The columns of \(A\) span \(\mathbb{R}^n\).

  9. The linear transformation \(x \mapsto Ax\) maps \(\mathbb{R}^n\) onto \(\mathbb{R}^n\).

  10. There is an \(n \times n\) matrix \(C\) such that \(CA = I\).

  11. There is an \(n \times n\) matrix \(D\) such that \(AD = I\).

  12. \(A^T\) is an invertible matrix.

This is an extremely important theorem that connects many concepts in linear algebra!

Example 24

All matrices in this example are \(n \times n\). Each part is an implication of the form “if statement 1, then statement 2.” The implication is TRUE if statement 2 is always true whenever statement 1 holds; if there is even one case where statement 1 holds but statement 2 fails, the implication is FALSE.

  1. If the equation \(Ax=0\) has only the trivial solution, then \(A\) is row equivalent to \(I_n\).
  2. If the columns of \(A\) span \(\mathbb{R}^n\), then the columns are linearly independent.
  3. If the equation \(Ax=0\) has a nontrivial solution, then \(A\) has fewer than \(n\) pivot positions.
  4. If \(A^T\) is not invertible, then \(A\) is not invertible.
  5. If there is an \(n \times n\) matrix \(D\) such that \(AD = I\), then there is also an \(n \times n\) matrix \(C\) such that \(CA=I\).
  6. If the columns of \(A\) are linearly independent, then the columns of \(A\) span \(\mathbb{R}^n\).
  7. If the equation \(Ax=b\) has at least one solution for each \(b \in \mathbb{R}^n\), then the solution is unique for each \(b\).

Theorem: Invertible Transformations

Let \(T: \mathbb{R}^n \rightarrow \mathbb{R}^n\) be a linear transformation and let \(A\) be the standard matrix for \(T\). Then \(T\) is invertible if and only if \(A\) is an invertible matrix. In that case, the linear transformation \(S\) given by \(S(x) = A^{-1} x\) is the unique function satisfying \(T(S(x)) = x\) and \(S(T(x)) = x\).

Example 25

\(T\) is a linear transformation from \(\mathbb{R}^2\) into \(\mathbb{R}^2\). Show that \(T\) is invertible and find a formula for \(T^{-1}\). \[T(x_1,x_2) = (-5x_1 +9x_2, 4x_1 -7x_2)\]
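As a sketch of the solution: the standard matrix of \(T\) is \(A = \begin{bmatrix} -5 & 9\\ 4 & -7 \end{bmatrix}\), and \(\det A = (-5)(-7) - (9)(4) = -1 \neq 0\), so \(A\) (and therefore \(T\)) is invertible. Using the \(2 \times 2\) inverse formula,

\[A^{-1} = \frac{1}{-1}\begin{bmatrix} -7 & -9\\ -4 & -5 \end{bmatrix} = \begin{bmatrix} 7 & 9\\ 4 & 5 \end{bmatrix}, \qquad \text{so} \qquad T^{-1}(x_1,x_2) = (7x_1 + 9x_2,\; 4x_1 + 5x_2).\]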