Heaps Visually Explained (Priority Queues)
Channel: ByteQuest
Data Structure • hard
A priority queue (usually implemented with a heap) always exposes the min or max element quickly.
It is ideal when you repeatedly need the next-best candidate rather than a full sort at every step.
Many scheduling and shortest-path algorithms depend on priority queues for practical speed.
Typical Complexity Baseline
| Metric | Value |
|---|---|
| Push/pop | O(log n) |
| Peek | O(1) |
See It
A small animated SVG that illustrates the core idea: the moving parts you need to keep in your head as you read the rest of the guide.
Insert at the bottom, then swap with the parent while you are smaller.
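That one-line rule is the whole insert algorithm. A minimal standalone sketch (the `heapPush` name is my own) over a plain `number[]` min-heap, using the index arithmetic where the parent of index `i` lives at `Math.floor((i - 1) / 2)`:

```typescript
// Insert `value` into a min-heap stored in `heap`, then bubble it up
// while it is smaller than its parent.
function heapPush(heap: number[], value: number): void {
  heap.push(value)                            // insert at the bottom (next leaf)
  let i = heap.length - 1
  while (i > 0) {
    const parent = Math.floor((i - 1) / 2)
    if (heap[parent] <= heap[i]) break        // parent already smaller: done
    ;[heap[parent], heap[i]] = [heap[i], heap[parent]]
    i = parent
  }
}

const heap: number[] = []
for (const v of [8, 5, 9, 3]) heapPush(heap, v)
// heap[0] is now the minimum (3)
```

Each loop iteration climbs one level, so the swap chain is bounded by the tree height, O(log n).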
Watch First
A quick visual walkthrough before the detailed sections. Pick the format that helps you learn fastest.
Mechanics
The building blocks and terminology you need before everything else clicks. Skim or deep-read.
**Heap array.** What it is: a compact tree representation in a contiguous array; for index i, the parent sits at ⌊(i−1)/2⌋ and the children at 2i+1 and 2i+2. Why it matters: index arithmetic replaces pointers, keeping the structure cache-friendly and allocation-free.
**Comparator.** What it is: the ordering rule that defines min-heap or max-heap behavior. Why it matters: a single implementation covers both orderings; only the comparison changes.
**Sift-up.** What it is: restores the heap property after an insert by swapping the new node up past any larger parent. Why it matters: it bounds push at O(log n), the height of the tree.
**Sift-down.** What it is: restores the heap property after a pop or replace by swapping a node down with its smaller child. Why it matters: it bounds pop at O(log n) and is the workhorse of heapify.
**Heap invariant.** What it is: every parent's priority dominates its children's. Why it matters: it guarantees the best element is always at the root, which is what makes peek O(1).
**Min-heap.** What it is: the smallest key is stored at the root/top. Why it matters: suits "expand the nearest/cheapest next" workloads such as shortest-path search.
**Max-heap.** What it is: the largest key is stored at the root/top. Why it matters: suits "take the biggest next" workloads such as greedy selection.
**Heapify.** What it is: builds a heap from an unsorted array in O(n). Why it matters: cheaper than n individual inserts (O(n log n)) when you already hold all the data.
**Top-k.** What it is: returns the k best elements under a ranking rule. Why it matters: a size-k heap does this in O(n log k) without sorting everything.
**Decrease-key.** What it is: a priority update operation available in some heap variants. Why it matters: algorithms like Dijkstra's can reprioritize entries in place instead of re-inserting them.
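Of these terms, heapify carries the least obvious bound, so here is a minimal sketch of it (the `heapify` name and the inline `siftDown` helper are my own): sift down every internal node, starting from the last parent and working back to the root.

```typescript
// Build a min-heap in place in O(n). Most nodes are near the bottom and
// sift down only a short distance, which is why the total work is linear.
function heapify(a: number[]): void {
  const siftDown = (i: number): void => {
    while (true) {
      const l = 2 * i + 1
      const r = 2 * i + 2
      let smallest = i
      if (l < a.length && a[l] < a[smallest]) smallest = l
      if (r < a.length && a[r] < a[smallest]) smallest = r
      if (smallest === i) break               // heap property holds here
      ;[a[i], a[smallest]] = [a[smallest], a[i]]
      i = smallest
    }
  }
  // Leaves are trivially valid heaps; start at the last internal node.
  for (let i = Math.floor(a.length / 2) - 1; i >= 0; i--) siftDown(i)
}

const data = [9, 4, 7, 1, 3]
heapify(data)
// data[0] is now the minimum (1)
```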
Walkthrough
Every frame of the concept diagram, laid out top-to-bottom so you can scroll through the algorithm at your own pace.
Empty min-heap. We'll insert 7 values and watch each bubble up.
Insert 8 at index 0 (next leaf).
8 reached the root: it's the new minimum.
Insert 5 at index 1 (next leaf).
Compare with parent arr[0]=8. 5 < 8 → swap.
After swap: arr = [5, 8]. Now at index 0.
5 reached the root: it's the new minimum.
Insert 9 at index 2 (next leaf).
Compare with parent arr[0]=5. 5 ≤ 9 → stop.
Insert 3 at index 3 (next leaf).
Compare with parent arr[1]=8. 3 < 8 → swap.
After swap: arr = [5, 3, 9, 8]. Now at index 1.
Compare with parent arr[0]=5. 3 < 5 → swap.
After swap: arr = [3, 5, 9, 8]. Now at index 0.
3 reached the root: it's the new minimum.
Insert 7 at index 4 (next leaf).
Compare with parent arr[1]=5. 5 ≤ 7 → stop.
Insert 1 at index 5 (next leaf).
Compare with parent arr[2]=9. 1 < 9 → swap.
After swap: arr = [3, 5, 1, 8, 7, 9]. Now at index 2.
Compare with parent arr[0]=3. 1 < 3 → swap.
After swap: arr = [1, 5, 3, 8, 7, 9]. Now at index 0.
1 reached the root: it's the new minimum.
Insert 6 at index 6 (next leaf).
Compare with parent arr[2]=3. 3 ≤ 6 → stop.
Heap built. Each insert is O(log n); building from n inserts is O(n log n) (the heapify-from-array trick is O(n)).
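The whole frame sequence can be replayed mechanically. This sketch pushes the same seven values through a bare-bones sift-up (the `push` helper name is my own) and prints the final layout:

```typescript
// Replay the walkthrough: insert 8, 5, 9, 3, 7, 1, 6 with sift-up only.
function push(a: number[], v: number): void {
  a.push(v)                                   // insert at the next leaf
  let i = a.length - 1
  while (i > 0) {
    const p = Math.floor((i - 1) / 2)
    if (a[p] <= a[i]) break                   // parent dominates: stop
    ;[a[p], a[i]] = [a[i], a[p]]
    i = p
  }
}

const arr: number[] = []
for (const v of [8, 5, 9, 3, 7, 1, 6]) push(arr, v)
console.log(arr.join(','))  // matches the final frame: 1,5,3,8,7,9,6
```

Note the result is a valid heap but not a sorted array: only the root is guaranteed to be the minimum.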
Tradeoffs
When this technique wins, when it loses, and what to reach for instead.
Choose this when: you interleave inserts with extract-top operations. Tradeoff: a sorted array is simpler if the data is static and query-only.
Choose this when: you need ordered-by-priority retrieval. Tradeoff: a hash map cannot return min/max efficiently.
Choose this when: you want minimal-overhead top extraction. Tradeoff: a balanced tree supports ordered iteration and arbitrary deletions better.
In Production
Where this shows up at real companies and what the on-call engineer learned the hard way.
Scheduler-style systems rank candidate nodes/tasks and repeatedly choose the highest-priority next action.
Takeaway Priority queues formalize best-next selection under changing load.
Shortest path variants use min-priority queue to expand nearest frontier first.
Takeaway Heap-driven frontier selection keeps path search efficient.
Some stream processing jobs use priority queues for watermark/event-time ordering.
Takeaway Priority structures help enforce processing order constraints.
Code
A clean reference implementation in TypeScript.
Implement the heap array and sift-up/sift-down operations yourself so enqueue/dequeue behavior is explicit.
Complexity: Push O(log n), Pop O(log n), Peek O(1)
Min-heap priority queue from scratch
class MinHeap {
  private values: number[] = []

  // Append at the next leaf, then bubble up. O(log n).
  push(value: number): void {
    this.values.push(value)
    this.siftUp(this.values.length - 1)
  }

  // Remove and return the minimum, or null if empty. O(log n).
  pop(): number | null {
    if (this.values.length === 0) {
      return null
    }
    const rootValue = this.values[0]
    const lastValue = this.values.pop()!
    if (this.values.length > 0) {
      // Move the last leaf to the root and restore the invariant.
      this.values[0] = lastValue
      this.siftDown(0)
    }
    return rootValue
  }

  // Read the minimum without removing it. O(1).
  peek(): number | null {
    return this.values.length > 0 ? this.values[0] : null
  }

  // Swap a node up while it is smaller than its parent.
  private siftUp(startIndex: number): void {
    let childIndex = startIndex
    while (childIndex > 0) {
      const parentIndex = Math.floor((childIndex - 1) / 2)
      if (this.values[parentIndex] <= this.values[childIndex]) {
        break
      }
      ;[this.values[parentIndex], this.values[childIndex]] = [this.values[childIndex], this.values[parentIndex]]
      childIndex = parentIndex
    }
  }

  // Swap a node down with its smaller child until neither child is smaller.
  private siftDown(startIndex: number): void {
    let parentIndex = startIndex
    while (true) {
      const leftChildIndex = parentIndex * 2 + 1
      const rightChildIndex = parentIndex * 2 + 2
      let smallestIndex = parentIndex
      if (leftChildIndex < this.values.length && this.values[leftChildIndex] < this.values[smallestIndex]) {
        smallestIndex = leftChildIndex
      }
      if (rightChildIndex < this.values.length && this.values[rightChildIndex] < this.values[smallestIndex]) {
        smallestIndex = rightChildIndex
      }
      if (smallestIndex === parentIndex) {
        break
      }
      ;[this.values[parentIndex], this.values[smallestIndex]] = [this.values[smallestIndex], this.values[parentIndex]]
      parentIndex = smallestIndex
    }
  }
}
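The reference class is hard-wired to numbers and min-ordering. The comparator idea from the Mechanics section generalizes it; here is a hedged sketch (the `BinaryHeap` name and `less` parameter are my own, not from the original):

```typescript
// Generic binary heap ordered by a user-supplied comparator.
// less(a, b) === true means a should sit above b.
class BinaryHeap<T> {
  private items: T[] = []
  constructor(private less: (a: T, b: T) => boolean) {}

  get size(): number { return this.items.length }

  push(item: T): void {
    this.items.push(item)
    let i = this.items.length - 1
    while (i > 0) {                           // sift up
      const p = Math.floor((i - 1) / 2)
      if (!this.less(this.items[i], this.items[p])) break
      ;[this.items[p], this.items[i]] = [this.items[i], this.items[p]]
      i = p
    }
  }

  pop(): T | undefined {
    if (this.items.length === 0) return undefined
    const top = this.items[0]
    const last = this.items.pop()!
    if (this.items.length > 0) {
      this.items[0] = last
      let i = 0
      while (true) {                          // sift down
        const l = 2 * i + 1
        const r = 2 * i + 2
        let best = i
        if (l < this.items.length && this.less(this.items[l], this.items[best])) best = l
        if (r < this.items.length && this.less(this.items[r], this.items[best])) best = r
        if (best === i) break
        ;[this.items[i], this.items[best]] = [this.items[best], this.items[i]]
        i = best
      }
    }
    return top
  }
}

// Max-heap by flipping the comparator: no second class needed.
const maxHeap = new BinaryHeap<number>((a, b) => a > b)
for (const v of [2, 9, 4]) maxHeap.push(v)
// maxHeap.pop() yields 9, then 4, then 2
```

Passing `(a, b) => a.priority < b.priority` gives a min-heap over task objects with the same code, which is the payoff the Comparator entry describes.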
Watch Out
The bugs and edge cases that bite engineers most often. Skim before you ship.
Pro Tips
Small habits that pay off every time you reach for this pattern.
Pattern Recognition
Concrete signals that tell you this is the right pattern both at work and on LeetCode.
Peek is O(1) with insert/remove O(log n).
Practice
A curated easy-to-hard problem ladder. Each one reinforces a specific aspect of the pattern.
| # | Problem | Difficulty | Typical Complexity |
|---|---|---|---|
| 1 | Kth Largest Element in a Stream | Easy | O(log k) |
| 2 | Last Stone Weight | Easy | O(n log n) |
| 3 | Top K Frequent Elements | Medium | O(n log k) |
| 4 | K Closest Points to Origin | Medium | O(n log k) |
| 5 | Task Scheduler | Medium | O(n log n) |
| 6 | Find K Pairs with Smallest Sums | Medium | O(k log k) |
| 7 | Merge k Sorted Lists | Hard | O(n log k) |
| 8 | Find Median from Data Stream | Hard | O(log n) |
| 9 | IPO | Hard | O(n log n) |
| 10 | Minimum Number of Refueling Stops | Hard | O(n log n) |