### Section CRS Column and Row Spaces

A matrix-vector product (Definition MVP) is a linear combination of the columns of the matrix and this allows us to connect matrix multiplication with systems of equations via Theorem SLSLC. Row operations are linear combinations of the rows of a matrix, and of course, reduced row-echelon form (Definition RREF) is also intimately related to solving systems of equations. In this section we will formalize these ideas with two key definitions of sets of vectors derived from a matrix.

#### Subsection CSSE Column Spaces and Systems of Equations

Theorem SLSLC showed us that there is a natural correspondence between solutions to linear systems and linear combinations of the columns of the coefficient matrix. This idea motivates the following important definition.

##### Definition CSM Column Space of a Matrix

Suppose that $A$ is an $m\times n$ matrix with columns $\vectorlist{A}{n}$. Then the column space of $A$, written $\csp{A}$, is the subset of $\complex{m}$ containing all linear combinations of the columns of $A$, \begin{equation*} \csp{A}=\spn{\set{\vectorlist{A}{n}}} \end{equation*}

Some authors refer to the column space of a matrix as the range, but we will reserve this term for use with linear transformations (Definition RLT).

Upon encountering any new set, the first question we ask is what objects are in the set, and which objects are not? Here is an example of one way to answer this question, and it will motivate a theorem that will then answer the question precisely.

##### Example CSMCS Column space of a matrix and consistent systems

So if we fix the coefficient matrix, and vary the vector of constants, we can sometimes find consistent systems, and sometimes inconsistent systems. The vectors of constants that lead to consistent systems are exactly the elements of the column space. This is the content of the next theorem, and since it is an equivalence, it provides an alternate view of the column space.

##### Theorem CSCS Column Spaces and Consistent Systems

Suppose $A$ is an $m\times n$ matrix and $\vect{b}$ is a vector of size $m$. Then $\vect{b}\in\csp{A}$ if and only if $\linearsystem{A}{\vect{b}}$ is consistent.

This theorem tells us that asking if the system $\linearsystem{A}{\vect{b}}$ is consistent is exactly the same question as asking if $\vect{b}$ is in the column space of $A$. Or equivalently, it tells us that the column space of the matrix $A$ is precisely those vectors of constants, $\vect{b}$, that can be paired with $A$ to create a system of linear equations $\linearsystem{A}{\vect{b}}$ that is consistent.

Employing Theorem SLEMM we can form the chain of equivalences \begin{align*} \vect{b}\in\csp{A} \iff \linearsystem{A}{\vect{b}}\text{ is consistent} \iff A\vect{x}=\vect{b}\text{ for some }\vect{x} \end{align*}

Thus, an alternative (and popular) definition of the column space of an $m\times n$ matrix $A$ is \begin{align*} \csp{A}&= \setparts{ \vect{y}\in\complex{m} }{ \vect{y}=A\vect{x}\text{ for some }\vect{x}\in\complex{n} } = \setparts{A\vect{x}}{\vect{x}\in\complex{n}} \subseteq\complex{m} \end{align*}

We recognize this as saying: create all the matrix-vector products possible with the matrix $A$ by letting $\vect{x}$ range over all of the possibilities. By Definition MVP we see that this means take all possible linear combinations of the columns of $A$ — precisely the definition of the column space (Definition CSM) we have chosen.
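As a quick illustrative check of this equivalence (the matrix and vector below are invented, and we use Python with SymPy as a stand-in for the book's Sage cells), the product $A\vect{x}$ agrees with the linear combination of the columns of $A$ that uses the entries of $\vect{x}$ as scalars, exactly as Definition MVP says:

```python
# Invented example: A*x equals the linear combination of the columns of A
# with the entries of x as the scalars (Definition MVP).
from sympy import Matrix

A = Matrix([[1, 2, 0],
            [3, -1, 4]])
x = Matrix([2, 1, -1])

product = A * x
combo = 2 * A.col(0) + 1 * A.col(1) + (-1) * A.col(2)

assert product == combo   # the two descriptions of csp(A) coincide vector by vector
```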

Notice how this formulation of the column space looks very much like the definition of the null space of a matrix (Definition NSM), but for a rectangular matrix the column vectors of $\csp{A}$ and $\nsp{A}$ have different sizes, so the sets are very different.

Given a vector $\vect{b}$ and a matrix $A$ it is now very mechanical to test if $\vect{b}\in\csp{A}$. Form the linear system $\linearsystem{A}{\vect{b}}$, row-reduce the augmented matrix, $\augmented{A}{\vect{b}}$, and test for consistency with Theorem RCLS. Here is an example of this procedure.
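The mechanical test just described can be sketched in a few lines. This is a hedged illustration with an invented matrix, written in Python with SymPy rather than the book's Sage cells: form the augmented matrix, row-reduce, and apply the consistency test of Theorem RCLS (the system is inconsistent exactly when the final column is a pivot column).

```python
# Membership test for the column space, per Theorem CSCS and Theorem RCLS.
# The matrix and vectors are invented for illustration.
from sympy import Matrix

def in_column_space(A, b):
    """Return True if b is in csp(A), via consistency of LS(A, b)."""
    augmented = A.row_join(b)          # the augmented matrix [A | b]
    _, pivots = augmented.rref()
    # consistent iff the last column (index A.cols) is NOT a pivot column
    return A.cols not in pivots

A = Matrix([[1, 2],
            [2, 4],
            [0, 1]])

print(in_column_space(A, Matrix([1, 2, 3])))   # True: b = -5*col 1 + 3*col 2
print(in_column_space(A, Matrix([1, 1, 0])))   # False: system is inconsistent
```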

##### Sage CSCS Column Space and Consistent Systems

Theorem CSCS completes a collection of three theorems, and one definition, that deserve comment. Many questions about spans, linear independence, null space, column spaces and similar objects can be converted to questions about systems of equations (homogeneous or not), which we understand well from our previous results, especially those in Chapter SLE. These previous results include theorems like Theorem RCLS which allows us to quickly decide consistency of a system, and Theorem BNS which allows us to describe solution sets for homogeneous systems compactly as the span of a linearly independent set of column vectors.

The table below lists these four definitions and theorems along with a brief reminder of the statement and an example of how the statement is used.

| | Synopsis | Example |
|---|---|---|
| Definition NSM | Null space is solution set of homogeneous system | General solution sets described by Theorem PSPHS |
| Theorem SLSLC | Solutions for linear combinations with unknown scalars | Deciding membership in spans |
| Theorem SLEMM | System of equations represented by matrix-vector product | Solution to $\linearsystem{A}{\vect{b}}$ is $\inverse{A}\vect{b}$ when $A$ is nonsingular |
| Theorem CSCS | Column space vectors create consistent systems | Deciding membership in column spaces |

#### Subsection CSSOC Column Space Spanned by Original Columns

So we have a foolproof, automated procedure for determining membership in $\csp{A}$. While this works just fine a vector at a time, we would like to have a more useful description of the set $\csp{A}$ as a whole. The next example will preview the first of two fundamental results about the column space of a matrix.

##### Example CSTW Column space, two ways

We will now formalize the previous example, which will make it trivial to determine a linearly independent set of vectors that spans the column space of a matrix, and that consists of just columns of $A$.

##### Theorem BCS Basis of the Column Space

Suppose that $A$ is an $m\times n$ matrix with columns $\vectorlist{A}{n}$, and $B$ is a row-equivalent matrix in reduced row-echelon form with $r$ pivot columns. Let $D=\{d_1,\,d_2,\,d_3,\,\ldots,\,d_r\}$ be the set of indices for the pivot columns of $B$. Let $T=\set{\vect{A}_{d_1},\,\vect{A}_{d_2},\,\vect{A}_{d_3},\,\ldots,\,\vect{A}_{d_r}}$. Then

1. $T$ is a linearly independent set.
2. $\csp{A}=\spn{T}$.

This is a nice result since it gives us a handful of vectors that describe the entire column space (through the span), and we believe this set is as small as possible because we cannot create any more relations of linear dependence to trim it down further. Furthermore, we defined the column space (Definition CSM) as all linear combinations of the columns of the matrix, and the elements of the set $T$ are still columns of the matrix (we will not be so lucky in the next two constructions of the column space).

Procedurally this theorem is extremely easy to apply. Row-reduce the original matrix, identify the $r$ pivot columns of the reduced matrix, and grab the columns of the original matrix with the same indices as the pivot columns. But it is still important to study the proof of Theorem BS and its motivation in Example COV which lie at the root of this theorem. We will trot through an example all the same.
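This recipe can be sketched directly. The matrix below is invented, and we use Python with SymPy rather than the book's Sage cells; note that SymPy's column indices are 0-based, while the text's pivot indices $d_i$ are 1-based.

```python
# Theorem BCS recipe: row-reduce A, read off the pivot column indices,
# then take those columns of the *original* matrix A.
from sympy import Matrix

A = Matrix([[1, 2, 1, 3],
            [2, 4, 0, 2],
            [1, 2, 2, 4]])

_, pivots = A.rref()              # pivot column indices of the reduced matrix B
T = [A.col(d) for d in pivots]    # same indices, but columns of the original A

print("pivot column indices:", pivots)   # (0, 2, 3)
for v in T:
    print(v.T)
```

The set `T` is linearly independent and spans $\csp{A}$, and its elements are honest columns of $A$.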

#### Subsection CSNM Column Space of a Nonsingular Matrix

Let us specialize to square matrices and contrast the column spaces of the coefficient matrices in Archetype A and Archetype B.

##### Example CSAB Column space of Archetype B

Example CSAA and Example CSAB together motivate the following equivalence, which says that nonsingular matrices have column spaces that are as big as possible.

##### Theorem CSNM Column Space of a Nonsingular Matrix

Suppose $A$ is a square matrix of size $n$. Then $A$ is nonsingular if and only if $\csp{A}=\complex{n}$.

With this equivalence for nonsingular matrices we can update our list, Theorem NME3.

##### Theorem NME4 Nonsingular Matrix Equivalences, Round 4

Suppose that $A$ is a square matrix of size $n$. The following are equivalent.

1. $A$ is nonsingular.
2. $A$ row-reduces to the identity matrix.
3. The null space of $A$ contains only the zero vector, $\nsp{A}=\set{\zerovector}$.
4. The linear system $\linearsystem{A}{\vect{b}}$ has a unique solution for every possible choice of $\vect{b}$.
5. The columns of $A$ are a linearly independent set.
6. $A$ is invertible.
7. The column space of $A$ is $\complex{n}$, $\csp{A}=\complex{n}$.
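Several of these equivalences can be checked mechanically on one matrix. The matrix below is invented, and the code is a Python/SymPy sketch (not the book's Sage cells); each property holds or fails together with all the others.

```python
# Checking a few NME4 equivalences on a single invented nonsingular matrix.
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [1, 1]])

rref_A, _ = A.rref()
assert A.det() != 0                  # nonsingular / invertible
assert rref_A == eye(2)              # row-reduces to the identity matrix
assert A.nullspace() == []           # null space contains only the zero vector
assert len(A.columnspace()) == 2     # csp(A) has dimension 2, so csp(A) = C^2
```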

#### Subsection RSM Row Space of a Matrix

The rows of a matrix can be viewed as vectors, since they are just lists of numbers, arranged horizontally. So we will transpose a matrix, turning rows into columns, so we can then manipulate rows as column vectors. As a result we will be able to make some new connections between row operations and solutions to systems of equations. OK, here is the second primary definition of this section.

##### Definition RSM Row Space of a Matrix

Suppose $A$ is an $m\times n$ matrix. Then the row space of $A$, $\rsp{A}$, is the column space of $\transpose{A}$, i.e. $\rsp{A}=\csp{\transpose{A}}$.

Informally, the row space is the set of all linear combinations of the rows of $A$. However, we write the rows as column vectors, thus the necessity of using the transpose to make the rows into columns. Additionally, with the row space defined in terms of the column space, all of the previous results of this section can be applied to row spaces.

Notice that if $A$ is a rectangular $m\times n$ matrix, then $\csp{A}\subseteq\complex{m}$, while $\rsp{A}\subseteq\complex{n}$ and the two sets are not comparable since they do not even hold objects of the same type. However, when $A$ is square of size $n$, both $\csp{A}$ and $\rsp{A}$ are subsets of $\complex{n}$, though usually the sets will not be equal (but see Exercise CRS.M20).

##### Example RSAI Row space of Archetype I

The row space would not be too interesting if it was simply the column space of the transpose. However, when we do row operations on a matrix we have no effect on the many linear combinations that can be formed with the rows of the matrix. This is stated more carefully in the following theorem.

##### Theorem REMRS Row-Equivalent Matrices have equal Row Spaces

Suppose $A$ and $B$ are row-equivalent matrices. Then $\rsp{A}=\rsp{B}$.
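A quick sanity check of this theorem, with invented matrices and Python/SymPy standing in for the book's Sage cells: `B` is obtained from `A` by the row operations $R_2 \leftarrow R_2 + 5R_1$ followed by a row swap, so the two matrices are row-equivalent and must have equal row spaces.

```python
# Row-equivalent matrices share a reduced row-echelon form, and the nonzero
# rows of that rref span the common row space.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [0, 1, 1]])
# B comes from A via R2 <- R2 + 5 R1, then swapping the two rows
B = Matrix([[5, 11, 16],
            [1, 2, 3]])

assert A.rref() == B.rref()   # identical rref (and pivots): equal row spaces
```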

##### Example RSREM Row spaces of two row-equivalent matrices

Theorem REMRS is at its best when one of the row-equivalent matrices is in reduced row-echelon form. The vectors that are zero rows can be ignored. (Who needs the zero vector when building a span? See Exercise LI.T10.) The echelon pattern ensures that the nonzero rows yield vectors that are linearly independent. Here is the theorem.

##### Theorem BRS Basis for the Row Space

Suppose that $A$ is a matrix and $B$ is a row-equivalent matrix in reduced row-echelon form. Let $S$ be the set of nonzero columns of $\transpose{B}$. Then

1. $\rsp{A}=\spn{S}$.
2. $S$ is a linearly independent set.

##### Example IAS Improving a span

Notice that in Example IAS all we had to do was row-reduce the right matrix and toss out a zero row. Next to row operations themselves, Theorem BRS is probably the most powerful computational technique at your disposal as it quickly provides a much improved description of a span, any span (row space, column space, …).
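The span-improving technique can be sketched for any set of vectors, not just rows that arrive as part of a matrix. The vectors below are invented, and the code is a Python/SymPy sketch: write the vectors as the rows of a matrix, row-reduce, and keep the nonzero rows. Theorem BRS guarantees the surviving rows are linearly independent and span the same subspace.

```python
# "Improving" a span per Theorem BRS: same span, fewer and simpler vectors.
from sympy import Matrix

vectors = [Matrix([1, 1, 2]), Matrix([2, 2, 4]), Matrix([1, 0, 1])]

M = Matrix([v.T for v in vectors])      # one vector per row
R, _ = M.rref()
# keep only the nonzero rows of the rref, back in column-vector form
improved = [R.row(i).T for i in range(R.rows) if not R.row(i).is_zero_matrix]

for v in improved:
    print(v.T)
```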

Theorem BRS and the techniques of Example IAS will provide yet another description of the column space of a matrix. First we state a triviality as a theorem, so we can reference it later.

##### Theorem CSRST Column Space, Row Space, Transpose

Suppose $A$ is a matrix. Then $\csp{A}=\rsp{\transpose{A}}$.

So to find another expression for the column space of a matrix, build its transpose, row-reduce it, toss out the zero rows, and convert the nonzero rows to column vectors to yield an improved set for the span construction. We will do Archetype I, then you do Archetype J.
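A minimal sketch of this procedure on a small invented matrix (not Archetype I), again using Python/SymPy in place of the book's Sage cells: transpose $A$, row-reduce, and keep the nonzero rows as column vectors, yielding a linearly independent set that spans $\csp{A}=\rsp{\transpose{A}}$.

```python
# Column space via the transpose (Theorem CSRST + Theorem BRS).
from sympy import Matrix

A = Matrix([[1, 0, 2],
            [3, 1, 7],
            [1, 1, 3]])

R, _ = A.T.rref()                       # row-reduce the transpose
basis = [R.row(i).T for i in range(R.rows) if not R.row(i).is_zero_matrix]

for v in basis:
    print(v.T)
```

Here the third column of $A$ is twice the first plus the second, so the column space has just two basis vectors.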