Asymptotic Complexity Analysis
Now we'll consider an even more abstract way of thinking about an algorithm's efficiency, called asymptotic complexity analysis.
This type of analysis gives us a shorthand for sorting algorithms into broad categories.
Hey, that sounds familiar. I think we talked about this in CS 60?
Yes! You did start learning about this in CS 60 or CS 42! You may remember your dear old friend Big-O…?
Big-Oh no! I never really understood that stuff…
Well, today we are going to make asymptotic analysis more precise so maybe it will make sense this time around!
Asymptotic Analysis
Asymptotic analysis asks the question "How does the cost of the algorithm scale as input sizes grow toward infinity?"
It will not help with figuring out how much time the algorithm will take on a specific input, or even how many times it will perform a particular operation.
It's about how costs grow, not their specific values. Therefore we don't care about constant factors/coefficients:
- \( n \), \( 0.00001n \), and \( 99999n \) all scale linearly
- \( n^2 \), \( 0.00001n^2 \), and \( 99999n^2 \) all scale quadratically
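To see this concretely, here's a quick numerical sketch (in Python; the notes don't prescribe a language, so take this purely as illustration). No matter the coefficient, doubling the input doubles the output, so all three functions scale the same way:

```python
# Every linear function doubles its output when its input doubles,
# regardless of the coefficient out front.
for coeff in (1, 0.00001, 99999):
    def f(n):
        return coeff * n
    for n in (1_000, 1_000_000):
        print(f"coeff={coeff}: f({2 * n}) / f({n}) = {f(2 * n) / f(n)}")
```

Every ratio printed is 2.0: the coefficient changes the values, but not how they scale.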
It's about arbitrarily huge inputs, not specific input sizes. Therefore we don't care about "small" terms:
- \( n^2 + 100n + 1000000 \approx n^2 \) for very large \( n \).
- \( 2^n + 500n^{10} + 6000n^{5} + 1234567 \approx 2^n \) for very large \( n \).
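Similarly, we can watch the lower-order terms wash out. Here's a small sketch (again Python, again just an illustration, using the first example polynomial above):

```python
# For f(n) = n^2 + 100n + 1000000, the ratio f(n) / n^2 heads toward 1
# as n grows: the lower-order terms become negligible.
def f(n):
    return n**2 + 100 * n + 1_000_000

for n in (10, 1_000, 100_000, 10_000_000):
    print(f"n={n}: f(n) / n^2 = {f(n) / n**2:.5f}")
```

For small \( n \) the constant term dominates the ratio, but by \( n = 10{,}000{,}000 \) the ratio is already 1.00001.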
Why is this even a useful question to answer?
Well, the value may not be obvious at first, but this kind of analysis is very common in Computer Science.
It will let us make broad-stroke comparisons between algorithms without getting bogged down in details.
It can also help us understand which algorithms can scale to large inputs, and which ones are only really practical in small examples.
Comparing Cost Functions
The core of asymptotic analysis is comparison of cost functions (i.e., operation counts!).
Given two cost functions \( f(n) \) and \( g(n) \), do they grow similarly as the input size goes to infinity, or is one growing faster than the other?
To make this comparison, we'll focus on the following quantity:
$$ \lim_{ n \to \infty } \frac{f(n)}{g(n)} $$
Gah! Limits? Why would you do this to me?!?
Well… calculus is a prerequisite of this course…
Don't worry. We won't lean too heavily on calculus skills.
But if you can dredge up a little bit about how limits work, that will probably help you build intuition here.
When the limit is infinity…
If the limit of that ratio goes to infinity, then \( f(n) \) dominates \( g(n) \) as \( n \to \infty \).
Thus \( f \) grows much faster than \( g \).
For example, if \( f(n) = n^2 \) and \( g(n) = n \), then \( \lim_{n \to \infty} \frac{n^2}{n} = \lim_{n \to \infty} n = \infty \).
When the limit is 0…
If the limit of our ratio goes to 0, then \( f(n) \) is dominated by \( g(n) \) as \( n \to \infty \).
Thus, \( f \) grows much more slowly than \( g \).
For example, if \( f(n) = n \) and \( g(n) = n^2 \), then \( \lim_{n \to \infty} \frac{n}{n^2} = \lim_{n \to \infty} \frac{1}{n} = 0 \).
When the limit is neither 0 nor infinity…
But what if we have an \( f \) and \( g \) where the limit of the ratio is neither 0 nor infinity?
If the limit of that ratio goes to some non-zero, finite number, then neither \( f(n) \) nor \( g(n) \) dominates as \( n \to \infty \).
Thus \( f \) and \( g \) grow at qualitatively similar rates.
For example, if \( f(n) = n \) and \( g(n) = 2n \), then \( \lim_{n \to \infty} \frac{n}{2n} = \lim_{n \to \infty} \frac{1}{2} = \frac{1}{2} \).
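We can sanity-check all three cases numerically by plugging ever-larger values of \( n \) into the ratio (a Python sketch for illustration; the \( f \) and \( g \) pairs are the examples above):

```python
# Approximate lim f(n)/g(n) by evaluating the ratio at growing n.
cases = [
    ("f(n) = n^2, g(n) = n  ", lambda n: n**2, lambda n: n),     # -> infinity
    ("f(n) = n,   g(n) = n^2", lambda n: n,    lambda n: n**2),  # -> 0
    ("f(n) = n,   g(n) = 2n ", lambda n: n,    lambda n: 2 * n), # -> 1/2
]
for label, f, g in cases:
    ratios = [f(n) / g(n) for n in (10, 10_000, 10_000_000)]
    print(label, "->", ratios)
```

The first row of ratios blows up, the second shrinks toward 0, and the third sits at 0.5 no matter how large \( n \) gets.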
So we can use this limit expression to compare two cost functions.
It lets us say at a high level whether one function scales much worse than the other for large inputs.