Binary Heap INSERT
Who wants to try inserting into a binary heap?
I will!
A binary sheep for a binary heap! I love it!
Uh… It hasn't come up before, but actually I'm non-binary. My name is Mead (from meadow), and most sheep use they/them pronouns, because we prefer to see ourselves as a representative of our flock.
All good with me. A non-binary sheep for a binary heap is just as awesome!
Nothing wrong with being binary, though. Beep boop.
Great! Take it away, Mead!
Here's our heap so far:
INSERT — Step by Step
Okay, now say we need to insert a new item with key 10. Where can we put it in order to maintain the structure property? (i.e., that it's a complete tree)
I guess right at the end of the bottom layer?
Sure, let's try that!
Now, of course, we need to fix the order property! What should we do?
Maybe swap with its parent?
Makes sense!
…and then keep swapping until it's high enough.
Makes sense to me!
Is the tree complete?
Yup!
Does it satisfy the order property?
Yup!
Then it worked! Good job!
INSERT Algorithm
So here's the INSERT algorithm we just developed:
- Insert at the end of the last layer of the tree (which is the next available spot in the array).
- Swap with the parent until the parent has a smaller priority.
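Here's a minimal sketch of that algorithm in Python, assuming a 0-indexed, list-backed min-heap where smaller keys sit nearer the root (so "the parent has a smaller priority" means the parent's key is less than or equal to the new key). The function name and list representation are just for illustration, not a particular library's API.

```python
def heap_insert(heap, key):
    """Insert `key` into `heap`, a Python list encoding a min-heap.

    Step 1: place the new key in the next open spot (index n).
    Step 2: swap it with its parent until the parent's key is smaller.
    """
    heap.append(key)                  # new item lands at index n
    i = len(heap) - 1
    while i > 0:
        parent = (i - 1) // 2         # 0-indexed encoding: parent of i is (i-1)//2
        if heap[parent] <= heap[i]:
            break                     # parent already has the smaller key, so stop
        heap[i], heap[parent] = heap[parent], heap[i]   # bubble up one level
        i = parent
```

For instance, starting from the small example heap [4, 11, 9], inserting 5 appends it at index 3 and swaps it past 11, giving [4, 5, 9, 11].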
Exercise
Start with the heap pictured above (after we inserted 10) and, on paper, insert 20. (You can just draw the tree alone, without the array representation; the array is easiest for the computer to store in memory, but the tree is easier for us to think about.)
Then write down the level-order traversal of the resulting tree after inserting 20 (this corresponds to the contents of the array).
INSERT Complexity
Step 1: Insert the Item at the End of the Last Layer
Remember that we encode our heap as an array, which makes it easy to know where the new node will go.
If there are currently \( n \) items in the heap, then the index for the new item is just \( n \).
We can add something to an array in \( \Theta(1) \) time!
If we need a dynamically resizing array that can grow and shrink, it would be amortized \( \Theta(1) \) but worst-case \( \Theta(n) \).
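As a quick check of Step 1 (using the same list-backed encoding as the sketch above, with Python's list playing the role of the dynamically resizing array), the appended item really does land at index \( n \):

```python
heap = [4, 11, 9]        # n = 3 items, occupying indices 0, 1, 2
heap.append(10)          # dynamic-array append: amortized Theta(1)
assert heap[3] == 10     # the new item sits at index n = 3
```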
Step 2: Swap with Parent Until the Parent Has Smaller Priority
We know we can do a single swap in \( \Theta(1) \) time. So the real question is: how many swaps do we need to do?
In the best case, we don't need to do any swaps (just compare to the parent once): \( \Theta(1) \) time.
In the worst case, we need to bubble all the way to the root! So how tall is the tree?
We figured out a long time ago that a balanced tree has \( \Theta( \log{n} ) \) height.
So in the worst case, we swap all the way to the top: \( \Theta( \log{n} ) \) time.
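To make that concrete: a complete binary tree with \( n \) nodes has height \( \lfloor \log_2 n \rfloor \), since each full layer doubles the number of nodes. For example, inserting into a 15-node heap yields a 16-node tree of height \( \lfloor \log_2 16 \rfloor = 4 \), so the new item performs at most 4 swaps on its way to the root.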
Overall Complexity of INSERT
In the best case, insert takes \( \Theta(1) + \Theta(1) = \Theta(1) \) time.
In the worst case, insert takes \( \Theta(1) + \Theta( \log{n} ) = \Theta( \log{n} ) \) time.
So, the best we can say is that insert takes \( \mathrm{O}( \log{n} ) \) time—no worse than logarithmic time, but it could be faster (depending on the value we're inserting and the heap we're inserting into).
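Both extremes are easy to trigger with the sketch above (still assuming a min-heap, with the hypothetical heap_insert defined earlier): inserting a key larger than everything already in the heap does zero swaps, while inserting a key smaller than everything bubbles all the way to the root.

```python
heap = [1, 3, 2, 7, 4, 6, 5]   # a 7-node min-heap
heap_insert(heap, 99)          # best case: 99's parent (7) is smaller, so no swaps
heap_insert(heap, 0)           # worst case: 0 swaps past every ancestor up to the root
```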