CAPP 30271 — Lecture 13

Dimension

The dimension of a vector space $V$ is the number of vectors in a basis for $V$.
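As a quick numerical sketch (the specific vectors here are my own example, not from the lecture): the standard basis of $\mathbb R^3$ has three vectors, so $\dim \mathbb R^3 = 3$. Numerically, the rank of a matrix whose columns are a basis recovers that count.

```python
import numpy as np

# Columns are the standard basis vectors of R^3 (an illustrative choice;
# any basis of R^3 would give the same rank).
basis = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])

# The rank of a matrix whose columns form a basis equals the dimension
# of the space they span: here, dim(R^3) = 3.
print(np.linalg.matrix_rank(basis))  # 3
```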

It may seem a bit strange to base this definition on a count of vectors, since a vector space has infinitely many bases. But the count is exactly what's special: every basis for a given space has the same number of vectors. This definition is justified by the following result.

If $\mathbf u_1, \dots, \mathbf u_m$ and $\mathbf v_1, \dots, \mathbf v_n$ are bases for the same vector space, then $m = n$.

Suppose without loss of generality that there are more $\mathbf v_i$'s than $\mathbf u_j$'s, so $n \gt m$ (if it's the other way around, the same argument works with the roles of the two bases swapped).

Since the $\mathbf u_j$'s form a basis for the space, each of the vectors $\mathbf v_1, \dots, \mathbf v_n$ can be written as a linear combination of them: \[\mathbf v_i = a_{1i} \mathbf u_1 + \cdots + a_{mi} \mathbf u_m\] for each $i = 1, \dots, n$. We can take all of these vectors, make them columns of a matrix, and write the result as the matrix product \[V = \begin{bmatrix} \mathbf v_1 & \dots & \mathbf v_n \end{bmatrix} = \begin{bmatrix} \mathbf u_1 & \dots & \mathbf u_m \end{bmatrix} \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} = UA.\] What are the dimensions of our matrices? The matrix $A$ that contains our coefficients is $m \times n$. But since $m \lt n$, this matrix is not square: it has fewer rows than columns, so the system $A \mathbf x = \mathbf 0$ has at least one free variable and therefore a solution other than $\mathbf x = \mathbf 0$.
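The key step is that a matrix with fewer rows than columns always has a nonzero null-space vector. A small numerical sketch (the matrix here is made up for illustration):

```python
import numpy as np

# An illustrative 2x3 coefficient matrix A (m = 2 rows < n = 3 columns).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# With full_matrices=True (the default), np.linalg.svd returns a 3x3 Vt;
# since A has at most 2 nonzero singular values, the last row of Vt lies
# in the null space of A, giving a nonzero solution of A t = 0.
_, _, Vt = np.linalg.svd(A)
t = Vt[-1]

print(np.allclose(A @ t, 0))  # True: t is a nonzero solution of A t = 0
```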

Take one of these solutions $A \mathbf t = \mathbf 0$ with $\mathbf t \neq \mathbf 0$. Then \[V \mathbf t = UA \mathbf t = U(A \mathbf t) = U \mathbf 0 = \mathbf 0.\] This gives us a nonzero linear combination of the columns of $V$ that results in zero, so the columns of $V$ (the vectors $\mathbf v_1, \dots, \mathbf v_n$) are not linearly independent. But this is a contradiction: we assumed that they form a basis, so they must be linearly independent.

Therefore our assumption $n \gt m$ was impossible, and by the symmetric argument $m \gt n$ is impossible too. It must be the case that $m = n$: our two bases have the same size.

This is really quite special: though we have infinitely many bases for a vector space, all the bases for a given vector space have the same size. So the definition says that the dimension of a vector space is not a "physical" quantity like we're used to imagining, but is really a descriptive quantity—how many vectors do I need to describe the vectors in a space?

Why does the dimension matter? It's one of the things that ties together the subspaces of a matrix. We've seen a number of subspaces associated with any given $m \times n$ matrix $A$ of rank $r$. There are the obvious ones: the column space $C(A)$, a subspace of $\mathbb R^m$, and the null space $N(A)$, a subspace of $\mathbb R^n$.

We saw a not-as-obvious one: the row space of $A$, which is the column space $C(A^T)$ of the transpose, a subspace of $\mathbb R^n$.

But this leads to another possible subspace: what if we play the same trick as with the row space and think of the null space associated with $A^T$? This is called the left nullspace of $A$.

The left nullspace of an $m \times n$ matrix $A$ is the set of vectors that satisfy $A^T \mathbf y = \mathbf 0$. It is a subspace of $\mathbb R^m$.
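A short numerical sketch of this definition (with a made-up matrix): the name "left" comes from the fact that $A^T \mathbf y = \mathbf 0$ is the same as $\mathbf y^T A = \mathbf 0^T$, so $\mathbf y$ multiplies $A$ from the left.

```python
import numpy as np

# An illustrative 3x2 matrix A (m = 3, n = 2), so left-nullspace vectors
# live in R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# The left nullspace of A is the null space of A^T; the last row of Vt
# from the SVD of A^T spans it here (A^T is 2x3, so its null space is
# nontrivial).
_, _, Vt = np.linalg.svd(A.T)
y = Vt[-1]

print(np.allclose(A.T @ y, 0))  # True: y satisfies A^T y = 0
print(np.allclose(y @ A, 0))    # True: equivalently, y multiplies A
                                # from the left to give zero
```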

We haven't yet discussed the dimensions of the null space and the left nullspace, but if you think back to our discussion of the null space, you may begin to see where we're headed.