Big Θ
Now we'll start to introduce some notation that lets us succinctly talk about the asymptotic relationships between functions.
Soon we'll get back to your old friend Big O.
But first, let's meet… Big \( \Theta \)!
(\( \Theta \) is the capital form of the Greek letter \( \theta \), so we pronounce it big theta.)
For a function \( g(n) \), \( \Theta(g(n)) \) is a set of functions—all the functions \( f \) that grow at a similar rate to \( g \) as \( n \to \infty \).
More specifically,
$$ f(n) \in \Theta(g(n)) \quad \textrm{if and only if} \quad 0 < \lim_{n \to \infty} \frac{f(n)}{g(n)} < \infty. $$
Writing things like “\( {} < \infty \)” is why mathematicians won't sit with computer scientists at lunch.
You can be excused from writing it like this, but we'll need a note from your calculus professor.
(No, but seriously, it is an abuse of notation, but we think it makes the meaning clear, so we'll stick with it.)
Example of Using Big Theta
\( \Theta \) is a way to say that two functions grow similarly in the limit.
Let's see an example of using it to compare two functions.
Remember our "lower half" function?
bool lowerHalf(float x, const float* arr, size_t arrSize) {
    size_t numLessThan = 0;
    for (size_t i = 0; i < arrSize; ++i) {
        if (arr[i] < x) {
            ++numLessThan;
        }
    }
    return numLessThan < arrSize/2.0;
}
We found that
- The number of less-than comparisons, \( \mathrm{C}_{\mathrm{lessthan}}(n) = 2n + 2 \), where \( n \) is the size of the array.
- The number of index operations, \( \mathrm{C}_{\mathrm{index}}(n) = n \), where \( n \) is the size of the array.
- The number of increment operations is
- At least \( \mathrm{C}_{\mathrm{best-inc}}(n) = n \), and
- At most \( \mathrm{C}_{\mathrm{worst-inc}}(n) = 2n \).
Do \( \mathrm{C}_{\mathrm{lessthan}} \) and \( \mathrm{C}_{\mathrm{index}} \) grow at a similar rate? We can determine the answer quickly with our ratio limit.
Because \( \lim_{n \to \infty} \frac{ \mathrm{C}_{\mathrm{lessthan}}(n) }{ \mathrm{C}_{\mathrm{index}}(n) } = \lim_{n \to \infty} \frac{2n + 2}{n} = 2 \), we can conclude that \( 2n + 2 \in \Theta(n) \), and, correspondingly, that \( \mathrm{C}_{\mathrm{lessthan}} \in \Theta( \mathrm{C}_{\mathrm{index}} ) \).
Another Example
Since
\[ \begin{align} \lim_{n \to \infty} \frac{ \mathrm{C}_{\mathrm{index}}(n) }{ \mathrm{C}_{\mathrm{lessthan}}(n) } & = \lim_{n \to \infty} \frac{n}{ 2n + 2 } \\ & = \frac{1}{2} \end{align} \],
we can conclude that \( n \in \Theta( 2n + 2 ) \), and, correspondingly, that \( \mathrm{C}_{\mathrm{index}} \in \Theta( \mathrm{C}_{\mathrm{lessthan}} ) \).
In fact, it's not too hard to prove that, in general, if \( f(n) \in \Theta(g(n)) \), then \( g(n) \in \Theta( f(n) ) \).
That makes sense, because \( \Theta \) expresses similarity.
(Notice, too, that the two limit constants, \( 2 \) and \( \frac{1}{2} \), are reciprocals of each other!)
It would be strange if \( f \) were similar to \( g \) but \( g \) were not similar to \( f \)!
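In fact, the proof is one line, assuming the ratio limit exists: if \( \lim_{n \to \infty} \frac{f(n)}{g(n)} = c \) with \( 0 < c < \infty \), then

\[ \lim_{n \to \infty} \frac{g(n)}{f(n)} = \frac{1}{\displaystyle \lim_{n \to \infty} \frac{f(n)}{g(n)}} = \frac{1}{c}, \]

and since \( 0 < \frac{1}{c} < \infty \), we conclude \( g(n) \in \Theta(f(n)) \).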
Order of Complexity
We can use asymptotic analysis to compare the scaling behavior of any two cost functions in the limit.
The most typical use, however, is to compare the cost of a specific algorithm to simple reference functions.
Remember that \( \Theta(g(n)) \) is a set of functions.
There are particular sets, given by particular choices for \( g(n) \), that are useful for categorizing functions with qualitatively different complexity properties.
Sometimes people call these sets orders of complexity.
Here are some common orders of complexity:
| Order of Complexity | Name | Example |
|---|---|---|
| \( \Theta(1) \) | "constant time", "order 1" | hello world |
| \( \Theta(\log{n}) \) | "logarithmic time", "order \( \log{n} \)" | binary search |
| \( \Theta(n) \) | "linear time", "order \( n \)" | linear search |
| \( \Theta(n\log{n}) \) | "linearithmic time", "order \( n \log{n} \)" | merge sort |
| \( \Theta(n^2) \) | "quadratic time", "order \( n^2 \)" | bubble sort |
| \( \Theta(n^3) \) | "cubic time", "order \( n^3 \)" | matrix multiplication |
| \( \Theta(n^k) \), for some \( k \) | "\( n^k \)-time", "order \( n^k \)", "polynomial time" | Voronoi diagram |
| \( \Theta(k^n) \), for some \( k \) | "exponential time", "order \( k^n \)" | subset sum |
| \( \Theta(n!) \) | "factorial time", "order \( n! \)" | traveling salesrep (brute force) |
| \( \Theta(n^n) \) | "super-exponential time", "order \( n^n \)" | function enumeration |
(You can find a bigger table with more entries on Wikipedia.)
Oh, so we've already shown that \( \mathrm{C}_{\mathrm{lessthan}}(n) \) and \( \mathrm{C}_{\mathrm{index}}(n) \) are "order \( n \)"!
That's right!
\( \mathrm{C}_{\mathrm{index}}(n) = n \), which is obviously order \( n \).
We also showed that \( \mathrm{C}_{\mathrm{lessthan}}(n) \in \Theta(n) \), so it is order \( n \), too!
But could \( \mathrm{C}_{\mathrm{lessthan}} \) also be, say, order 1? We can see that it isn't by taking the limit.
Since \[ \begin{align} \lim_{n \to \infty} \frac{ \mathrm{C}_{\mathrm{lessthan}}(n) }{ 1 } & = \lim_{n \to \infty}{ (2n + 2) } \\ & \to \infty, \end{align} \]
we can conclude that \( 2n + 2 \notin \Theta(1) \),
and, correspondingly, that \( \mathrm{C}_{\mathrm{lessthan}} \notin \Theta(1) \).
Is it perhaps order \( n^2 \)? Again, we can see that it isn't by taking the limit. Since
\[ \begin{align} \lim_{n \to \infty} \frac{ \mathrm{C}_{\mathrm{lessthan}}(n) }{ n^2 } & = \lim_{n \to \infty} \frac{ 2n + 2 }{ n^2 } \\ & = 0, \end{align} \]
we can conclude that \( 2n + 2 \notin \Theta(n^2) \), and, correspondingly, that \( \mathrm{C}_{\mathrm{lessthan}} \notin \Theta(n^2) \).
One of the nice things about these particular reference sets is that they don't overlap!
No cost function can be both \( \Theta(n) \) and \( \Theta(n^2) \).
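The same ratio trick shows why, under our limit definition: suppose \( f(n) \in \Theta(n) \), so \( \lim_{n \to \infty} \frac{f(n)}{n} = c \) for some \( 0 < c < \infty \). Then

\[ \begin{align} \lim_{n \to \infty} \frac{f(n)}{n^2} & = \lim_{n \to \infty} \left( \frac{f(n)}{n} \cdot \frac{1}{n} \right) \\ & = c \cdot 0 = 0, \end{align} \]

so \( f(n) \notin \Theta(n^2) \).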