We've examined a lot of complexity classes so far, but we haven't actually proved one very important assumption we've been making all along: that we can solve more problems if we're given more time or space. It's clear that, say, any problem solvable in $n^2$ time or space is also solvable in $n^3$ time or space. However, we haven't shown that there is a problem solvable in $n^3$ time or space that can't be solved in $n^2$. Luckily, we will take care of that now.
A function $f : \mathbb N \to \mathbb N$, where $f(n) \geq O(\log n)$, is space constructible if the function that maps the string $1^n$ to the binary representation of $f(n)$ is computable in space $O(f(n))$.
Most of the functions that you'd expect to run into are space constructible. For instance, $n^2$ is space constructible: a machine can count the $1$s of $1^n$ in binary and then multiply to obtain the binary representation of $n \times n$; the counter and the product together use only $O(\log n)$ space, which is well within the $O(n^2)$ allowed. For another example, $\log_2 n$ is space constructible: a machine counts the number of $1$s in binary (which clearly uses $O(\log n)$ space) and then counts the number of bits in the binary representation it just computed.
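As a loose illustration (a Python sketch, not a Turing machine; Python's integers stand in for the binary counter on the work tape), we can compute $n^2$ in binary from the unary input $1^n$ while tracking how many counter bits the computation ever needs. The count stays around $2\log_2 n$, far below the $O(n^2)$ allowance:

```python
def construct_square(ones: str) -> tuple[str, int]:
    """From unary input 1^n, produce the binary representation of n^2,
    tracking the largest number of counter bits used (a stand-in for
    work-tape space)."""
    n = 0
    for symbol in ones:          # count the input symbols in binary
        assert symbol == "1"
        n += 1
    square = n * n               # the product has at most 2*log2(n) + 1 bits
    bits_used = max(n.bit_length(), square.bit_length())
    return bin(square)[2:], bits_used

out, bits = construct_square("1" * 5)
print(out, bits)   # 11001 5  (25 in binary, using 5 counter bits)
```

Even for $n = 100$, where the space allowance $n^2$ is $10000$ cells, the computation needs only $14$ bits.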
Why do we bother with the notion of constructibility? We want to ensure that our intuition holds: that allowing more space means we can solve more problems. If we don't require the constructibility property, then we can get results like the following.
Theorem. (Borodin) Suppose $h$ is a recursive function such that $h(n) \geq n$. Then there exists an increasing recursive function $g$ such that $$\mathbf{SPACE}(g(n)) = \mathbf{SPACE}(h(g(n))).$$
This result is called the Gap theorem, and it applies to time bounds as well. Essentially, it says there is some function $g(n)$ for which any $h(g(n))$-space-bounded computation can already be performed in $g(n)$ space; allowing $h(g(n))$ space buys nothing, even for, say, $h(n) = 2^n$. Naturally, this goes against our intuition. Constructibility imposes a reasonable restriction on the functions we consider.
It's probably a good idea now to review what $o(f(n))$ means. Let $f$ and $g$ be functions $f,g : \mathbb N \to \mathbb R$. We say that $f(n) = o(g(n))$ if $$ \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0.$$ That is, $f(n) = o(g(n))$ means that for any real number $c > 0$, there exists $n_0$ such that $f(n) < c \cdot g(n)$ for all $n \geq n_0$.
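To make the definition concrete, here is a quick numerical sanity check (an illustration only — finitely many samples can't prove a limit): the ratio $n^2/n^3 = 1/n$ shrinks toward $0$, while $n^2 / (n^2/2)$ is stuck at the constant $2$, so $n^2 = o(n^3)$ but $n^2 \neq o(n^2/2)$.

```python
def ratios(f, g, ns):
    """Sample f(n)/g(n) at increasing n to see whether it tends to 0."""
    return [f(n) / g(n) for n in ns]

ns = [10, 100, 1000, 10000]

# n^2 = o(n^3): the ratio is 1/n, which shrinks toward 0.
print(ratios(lambda n: n**2, lambda n: n**3, ns))
# -> [0.1, 0.01, 0.001, 0.0001]

# n^2 is not o(n^2 / 2): the ratio never falls below c = 2.
print(ratios(lambda n: n**2, lambda n: n**2 / 2, ns))
# -> [2.0, 2.0, 2.0, 2.0]
```

Note the second example also shows why little-$o$ is stronger than big-$O$: $n^2$ is certainly $O(n^2/2)$, but the ratio never gets below every constant $c > 0$.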
Theorem. For any space constructible function $f:\mathbb N \to \mathbb N$, a language $A$ exists that is decidable in $O(f(n))$ space but not in $o(f(n))$ space.
Proof. We describe an algorithm $D$ that uses $O(f(n))$ space and decides a language $A$ that is not decidable in $o(f(n))$ space. On input $w$, $D$ does the following:

1. Let $n$ be the length of $w$.
2. Compute $f(n)$ using the space constructibility of $f$, and mark off this much tape. If any later stage attempts to use more than $f(n)$ cells, reject.
3. If $w$ is not of the form $\langle M \rangle 10^*$ for some Turing machine $M$, reject.
4. Simulate $M$ on input $w$ while counting the number of steps used in the simulation. If the count ever exceeds $2^{f(n)}$, reject.
5. If $M$ accepts, reject. If $M$ rejects, accept.
Now, we need to verify how much space this algorithm uses. Since $D$ has a fixed alphabet and $M$ has some arbitrary alphabet, we incur a constant-factor overhead to simulate the possibly larger alphabet. Thus, if $M$ runs in $g(n)$ space, $D$'s simulation of $M$ uses $d \cdot g(n)$ space for some constant $d$ which depends on $M$.
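Where this constant factor comes from can be seen in a small sketch (a toy illustration, not $D$ itself): to simulate a tape over an arbitrary alphabet $\Gamma$ on a machine with the fixed alphabet $\{0,1\}$, each cell is encoded with $d = \lceil \log_2 |\Gamma| \rceil$ bits, so $g(n)$ cells become $d \cdot g(n)$ bits.

```python
import math

def encode_tape(tape: str, alphabet: list[str]) -> tuple[str, int]:
    """Re-encode a tape over `alphabet` using the fixed alphabet {0, 1}:
    each cell costs d = ceil(log2 |alphabet|) bits, a constant factor
    depending only on the simulated machine's alphabet."""
    d = math.ceil(math.log2(len(alphabet)))
    index = {sym: i for i, sym in enumerate(alphabet)}
    bits = "".join(format(index[cell], f"0{d}b") for cell in tape)
    return bits, d

bits, d = encode_tape("abca", ["a", "b", "c", "_"])
print(bits, d)   # 00011000 2  (4 cells become 4 * 2 = 8 bits)
```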
We can guarantee that $D$ halts on every input: the marked-off space bound is enforced throughout, and the step counter in Step 4 cuts off any simulation that runs too long. Since $D$ decides $A$ within the marked-off space, $A$ is decidable in $O(f(n))$ space.
Now we need to show that $A$ is not decidable in $o(f(n))$ space. Assume for contradiction that there is a Turing machine $M$ that decides $A$ in space $g(n)$, where $g(n) = o(f(n))$. Then $D$ can simulate $M$ using $d \cdot g(n)$ space. Furthermore, by the definition of little-$o$ (taking $c = 1/d$), there is some constant $n_0$ such that $d \cdot g(n) < f(n)$ for all $n \geq n_0$.
We examine a run of $D$ on input $\langle M \rangle 10^{n_0}$. Since this input has length at least $n_0$, the simulation of $M$ in Step 4 fits within $D$'s space bound and runs to completion, and $D$ then outputs the opposite of $M$'s answer. So $D$ and $M$ disagree on this input, meaning $M$ doesn't decide $A$, which contradicts our assumption. Thus, $A$ is not decidable in $o(f(n))$ space. $$\tag*{$\Box$}$$
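The structure of this argument is plain diagonalization. As a finite toy (a hypothetical accept/reject table, not real Turing machines), flipping the diagonal entries produces answers that disagree with every row of the table:

```python
# Hypothetical table: table[i][j] = 1 if "machine i" accepts "input j".
table = [
    [1, 0, 1],
    [1, 0, 0],
    [0, 0, 1],
]

# D's strategy: on "input i", answer the opposite of machine i on input i.
diagonal = [1 - table[i][i] for i in range(len(table))]
print(diagonal)   # [0, 1, 0]

# The diagonal answers disagree with every machine somewhere,
# so no machine in the table decides the resulting language.
for i, row in enumerate(table):
    assert diagonal[i] != row[i]
```

In the theorem, the role of "input $i$ for machine $i$" is played by the string $\langle M \rangle 10^{n_0}$, padded long enough that the simulation fits in $D$'s space bound.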
Corollary. For any two functions $f_1,f_2:\mathbb N \to \mathbb N$, where $f_1(n)$ is $o(f_2(n))$ and $f_2$ is space constructible, $\mathbf{SPACE}(f_1(n)) \subsetneq \mathbf{SPACE}(f_2(n))$.
Corollary. For any two real numbers $0 \leq \varepsilon_1 < \varepsilon_2$, $$\mathbf{SPACE}(n^{\varepsilon_1}) \subsetneq \mathbf{SPACE}(n^{\varepsilon_2}).$$
Corollary. $\mathbf{NL} \subsetneq \mathbf{PSPACE}$.
Proof. Savitch's theorem shows that $\mathbf{NL} \subseteq \mathbf{SPACE}(\log^2 n)$. The space hierarchy theorem shows that $\mathbf{SPACE}(\log^2 n) \subsetneq \mathbf{SPACE}(n)$, and $\mathbf{SPACE}(n) \subseteq \mathbf{PSPACE}$. $$\tag*{$\Box$}$$
Corollary. $\mathbf{PSPACE} \subsetneq \mathbf{EXPSPACE}$.
Proof. For every $k$, $n^k = o(n^{\log n})$, and $n^{\log n} = o(2^n)$, so by the space hierarchy theorem $$\mathbf{PSPACE} = \bigcup_{k \geq 0} \mathbf{SPACE}(n^k) \subseteq \mathbf{SPACE}(n^{\log n}) \subsetneq \mathbf{SPACE}(2^n) \subseteq \mathbf{EXPSPACE} = \bigcup_{k \geq 0} \mathbf{SPACE}(2^{n^k}).$$ $$\tag*{$\Box$}$$