Chapter 1: Linear Equations in Linear Algebra (Lay, Linear Algebra, 6e)
1.1 Systems of Linear Equations
There are three possibilities for the solution set of a system of linear equations (examples follow the list):
- Exactly one solution (consistent)
- No solution (inconsistent)
- Infinitely many solutions (consistent)
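As a quick illustration (these small two-variable systems are not from the text), one of each kind:
\[\begin{array}{lcl}
x + y = 2,\ \ x - y = 0 & : & \text{exactly one solution, } (x,y) = (1,1)\\
x + y = 2,\ \ x + y = 3 & : & \text{no solution}\\
x + y = 2,\ \ 2x + 2y = 4 & : & \text{infinitely many solutions}
\end{array}\]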
When we solve a system of equations, there are three operations we can perform:
- Interchange two equations.
- Multiply an equation by a nonzero constant.
- Replace one equation by a linear combination of itself and another equation (keeping a nonzero multiple of the equation being replaced).
The last operation is sometimes stated as, “Add a multiple of one equation to another equation and replace that equation with the result.”
Question: What is a “linear combination”?
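One way to answer (stated here for reference): a linear combination of \(v_1, \dots, v_p\) with weights \(c_1, \dots, c_p\) is the sum \(c_1 v_1 + \cdots + c_p v_p\). Replacing one equation by such a combination is exactly the third operation above; for example (an illustrative system, not one from the text), replacing \(E_2\) by \(E_2 - 3E_1\):
\[\begin{array}{rl} E_1: & x_1 + 2x_2 = 4\\ E_2: & 3x_1 + 7x_2 = 13 \end{array} \qquad\longrightarrow\qquad \begin{array}{rl} E_1: & x_1 + 2x_2 = 4\\ E_2 - 3E_1: & x_2 = 1 \end{array}\]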
We don’t want to carry all of the variables through these calculations, so we use matrices.
An \(m \times n\) matrix is a rectangular array with \(m\) rows and \(n\) columns that looks like:
\[A = \left[ \begin{array}{cccc} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \\ \end{array}\right]\]
Each entry \(a_{ij}\) is a real (or complex) number when we are working with equations, but in principle the entries can be more general objects. The advantage of writing the system of equations as a matrix is that we do not have to write all the variables every time: the first column contains only the coefficients of \(x_1\), the second column only the coefficients of \(x_2\), and so on, up to the \(n^{\text{th}}\) column, which contains only the coefficients of \(x_n\).
A system of linear equations, represented as an augmented matrix, looks like:
\[\left[ \begin{array}{cccc|c} a_{11} & a_{12} & \cdots & a_{1n} & b_1 \\ a_{21} & a_{22} & \cdots & a_{2n} & b_2 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} & b_m \\ \end{array}\right]\]
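For instance (an illustrative system, not one from the text), a system of two equations in three unknowns and its augmented matrix:
\[\begin{array}{rcl} x_1 + 2x_2 - x_3 &=& 4\\ 2x_1 - x_2 + 3x_3 &=& 1 \end{array} \qquad\longleftrightarrow\qquad \left[\begin{array}{ccc|c} 1 & 2 & -1 & 4\\ 2 & -1 & 3 & 1 \end{array}\right]\]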
1.2 Row Reduction and Echelon Forms
Writing Solutions: for each augmented matrix below, describe the solution set of the corresponding linear system (a sketch for the second matrix follows the list).
- \[\left[ \begin{array}{cccc} 1 & 3 & 4 & 7 \\ 3 & 9 & 7 & 6 \\ \end{array}\right]\]
- \[\left[ \begin{array}{cccc} 1 & 0 & 2 & 5 \\ 0 & 1 & 5 & 2 \\ 0 & 0 & 0 & 0\\ \end{array}\right]\]
- \[\left[ \begin{array}{cccccc} 1 & -3 & 0 & -1 & 0 & -2\\ 0 & 1 & 0 & 0 & -4 & 1 \\ 0 & 0 & 0 & 1 & 9 & 4 \\ 0 & 0 & 0 & 0 & 0 & 0\\ \end{array}\right]\]
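As a sketch of how one might write out a solution, read the second matrix above as the augmented matrix of a system in \(x_1, x_2, x_3\). Then \(x_3\) is free and the solution set can be described parametrically:
\[\begin{array}{rcl} x_1 + 2x_3 &=& 5\\ x_2 + 5x_3 &=& 2 \end{array} \qquad\Longrightarrow\qquad \begin{array}{l} x_1 = 5 - 2x_3\\ x_2 = 2 - 5x_3\\ x_3 \text{ is free} \end{array}\]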
True or false? Decide whether each statement is true or false, and justify your answer.
- In some cases, a matrix may be row reduced to more than one matrix in reduced echelon form, using different sequences of row operations.
- The row reduction algorithm applies only to augmented matrices for a linear system.
- A basic variable in a linear system is a variable that corresponds to a pivot column in the coefficient matrix.
- Finding a parametric description of the solution set of a linear system is the same as solving the system.
- If one row in an echelon form of an augmented matrix is \([0 ~ 0 ~ 0 ~ 5 ~ 0]\), then the associated linear system is inconsistent.
- The echelon form of a matrix is unique.
- The pivot positions in a matrix depend on whether row interchanges are used in the row reduction process.
- Reducing a matrix to echelon form is called the forward phase of the row reduction process.
- Whenever a system has free variables, the solution set contains many solutions.
- A general solution of a system is an explicit description of all solutions of the system.
1.3 Vector Equations
Notation:
- \(\mathbb{R}\) is the set of real numbers.
- \(\mathbb{R}^2 = \mathbb{R} \times \mathbb{R}\) is the \(xy\)-plane.
- \(\mathbb{R}^3\) is three-dimensional space.
A scalar multiple of a vector \(\vec{v}\) is the vector \(c\vec{v}\) obtained by multiplying every entry of \(\vec{v}\) by the scalar \(c\). For example:
\[2 \cdot \vec{w}_1 =2 \cdot \left[ \begin{array}{c} 1 \\ -5 \end{array} \right] = \left[ \begin{array}{c} 2 \\ -10 \end{array} \right]\]
Adding vectors is done by adding the corresponding coordinates. For example:
\[\left[ \begin{array}{c} v_1 \\ v_2 \\ v_3 \\ \vdots \\ v_n\end{array}\right] + \left[ \begin{array}{c} w_1 \\ w_2 \\ w_3 \\ \vdots \\ w_n\end{array}\right] = \left[ \begin{array}{c} v_1 +w_1 \\ v_2 +w_2 \\ v_3+w_3 \\ \vdots \\ v_n+w_n\end{array}\right]\]
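Combining scalar multiplication with addition gives a linear combination; for example (an illustrative computation):
\[3\begin{bmatrix}1\\-5\end{bmatrix} + 2\begin{bmatrix}2\\4\end{bmatrix} = \begin{bmatrix}3\\-15\end{bmatrix} + \begin{bmatrix}4\\8\end{bmatrix} = \begin{bmatrix}7\\-7\end{bmatrix}\]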
1.4 The Matrix Equation \(Ax=b\)
Three ways to write the system of equations \(Ax=b\) (a small example in all three forms follows the list):
- Write \(Ax = b\) explicitly in the form
\[\begin{array}{ccl} a_{11} x_1 + a_{12} x_2 +a_{13} x_3 + \cdots + a_{1n} x_n & = & b_1\\ a_{21} x_1 + a_{22} x_2 +a_{23} x_3 + \cdots + a_{2n} x_n & = & b_2\\ \vdots & & \vdots\\ a_{m1} x_1 + a_{m2} x_2 +a_{m3} x_3 + \cdots + a_{mn} x_n & = & b_m\\ \end{array}\]
- Write as a vector equation (Linear combination of column vectors)
\[a_1 x_1 +a_2 x_2 + \cdots + a_n x_n = b\]
where \(a_i = \left[ \begin{array}{c} a_{1i} \\a_{2i} \\ \vdots \\a_{mi}\end{array}\right]\)
- Write as an augmented matrix
\[\left[\begin{array}{cccc|c} a_1 & a_2& \cdots & a_n & b \end{array}\right]\]
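As a small illustration (not an example from the text), here is the same two-equation system written in all three forms:
\[\begin{array}{rcl} x_1 + 2x_2 &=& 5\\ 3x_1 - x_2 &=& 1 \end{array} \qquad x_1\begin{bmatrix}1\\3\end{bmatrix} + x_2\begin{bmatrix}2\\-1\end{bmatrix} = \begin{bmatrix}5\\1\end{bmatrix} \qquad \left[\begin{array}{cc|c} 1 & 2 & 5\\ 3 & -1 & 1 \end{array}\right]\]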
Given a coefficient matrix \(A\) with three rows, we can ask the following questions (a small example follows the list):
- How many rows of \(A\) contain a pivot position?
- Does the equation \(Ax=b\) have a solution for each \(b \in \mathbb{R}^3\)?
- Can each vector in \(\mathbb{R}^3\) be written as a linear combination of the columns of matrix \(A\)?
- Do the columns of \(A\) span \(\mathbb{R}^3\)?
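A sketch with a made-up \(3 \times 3\) matrix shows how these questions hang together; row reducing gives
\[A = \begin{bmatrix} 1 & 2 & 0\\ 0 & 1 & 1\\ 1 & 3 & 1 \end{bmatrix} \sim \begin{bmatrix} 1 & 2 & 0\\ 0 & 1 & 1\\ 0 & 1 & 1 \end{bmatrix} \sim \begin{bmatrix} 1 & 2 & 0\\ 0 & 1 & 1\\ 0 & 0 & 0 \end{bmatrix}\]
Only two rows contain a pivot position, so \(Ax = b\) does not have a solution for every \(b \in \mathbb{R}^3\), not every vector in \(\mathbb{R}^3\) is a linear combination of the columns of \(A\), and the columns of \(A\) do not span \(\mathbb{R}^3\).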
1.5 Solution Sets of Linear Systems
Nonhomogeneous solutions: if \(Ax = b\) is consistent for some nonzero \(b\), then its solution set consists of one particular solution \(p\) plus all solutions of the homogeneous equation \(Ax = 0\); geometrically, it is the solution set of \(Ax = 0\) translated by \(p\).
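A one-equation sketch of this structure (an illustrative example, not from the text):
\[x_1 - 2x_2 = 3: \quad \begin{bmatrix}x_1\\x_2\end{bmatrix} = \begin{bmatrix}3\\0\end{bmatrix} + x_2\begin{bmatrix}2\\1\end{bmatrix}, \qquad x_1 - 2x_2 = 0: \quad \begin{bmatrix}x_1\\x_2\end{bmatrix} = x_2\begin{bmatrix}2\\1\end{bmatrix}\]
The homogeneous solution set is a line through the origin; the nonhomogeneous solution set is the same line translated by \(\begin{bmatrix}3\\0\end{bmatrix}\).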
1.7 Linear Independence
Sometimes we talk about the linear independence of the columns of a matrix rather than of a set of vectors. True or false (decide and justify each statement):
- The columns of a matrix \(A\) are linearly independent if the equation \(Ax=0\) has the trivial solution.
- If \(S\) is a linearly dependent set, then each vector is a linear combination of the other vectors in \(S\).
- The columns of a \(4 \times 5\) matrix are linearly dependent.
- If \(x\) and \(y\) are linearly independent and if \(\{x, y, z\}\) is linearly dependent, then \(z\) is in Span\(\{x, y\}\).
- If \(v_1\) and \(v_2\) are in \(\mathbb{R}^4\) and \(v_2\) is not a scalar multiple of \(v_1\), then \(\{v_1, v_2\}\) is linearly independent.
- If \(v_1, \ldots, v_4\) are in \(\mathbb{R}^4\) and \(v_3\) is not a linear combination of \(v_1, ~v_2,~ v_4\), then \(\{v_1, v_2, v_3, v_4\}\) is linearly independent.
- If \(v_1, \ldots, v_4\) are linearly independent vectors in \(\mathbb{R}^4\), then \(\{v_1, v_2, v_3\}\) is also linearly independent.
Determine whether each of the following sets is linearly independent (a sketch of the checks follows the three sets):
\(\left\{ \begin{bmatrix} 5\\1\end{bmatrix}, \begin{bmatrix} 2\\8\end{bmatrix}, \begin{bmatrix}1\\3\end{bmatrix} \right\}\)
\(\left\{ \begin{bmatrix} 4\\-2\\6\end{bmatrix}, \begin{bmatrix}6\\-3\\9\end{bmatrix} \right\}\)
\(\left\{ \begin{bmatrix} 3\\5\\1\end{bmatrix}, \begin{bmatrix} 0\\0\\0\end{bmatrix}, \begin{bmatrix} 6\\5\\4\end{bmatrix} \right\}\)
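A sketch of how one might check these: the first set has three vectors in \(\mathbb{R}^2\), so it is automatically linearly dependent; the third set contains the zero vector, so it is linearly dependent as well; for the second set, look for a scalar relation between the two vectors:
\[\begin{bmatrix}6\\-3\\9\end{bmatrix} = \tfrac{3}{2}\begin{bmatrix}4\\-2\\6\end{bmatrix}\]
so the second set is also linearly dependent.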
1.8 Introduction to Linear Transformations
\(Ax=b\) is a matrix equation. We can also think of the matrix \(A\) as doing something to the vector \(x\). We say that \(A\) “acts” on \(x\) by multiplication. This produces a new vector \(Ax\).
If \(T\) is a linear transformation, then:
\(T(\mathbf{0}) = \mathbf{0}\)
\(T(c\mathbf{u}+d\mathbf{v}) = cT(\mathbf{u}) + dT(\mathbf{v})\)
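As an illustrative example (not from the notes), let \(A\) be the \(2 \times 2\) matrix that reflects vectors across the \(x_1\)-axis; multiplying by \(A\) “acts” on a vector, and \(A\mathbf{0} = \mathbf{0}\) just as the first property predicts:
\[A = \begin{bmatrix}1 & 0\\ 0 & -1\end{bmatrix}, \qquad A\begin{bmatrix}2\\3\end{bmatrix} = \begin{bmatrix}2\\-3\end{bmatrix}, \qquad A\mathbf{0} = \mathbf{0}\]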