Learn Merge Sort in 13 minutes
Channel: Bro Code
Algorithm • hard
Merge sort uses divide-and-conquer: split list, sort halves recursively, merge halves in order.
Its runtime is predictably O(n log n) regardless of input distribution.
The tradeoff is extra memory for merge buffers in typical implementations.
Typical Complexity Baseline
| Metric | Value |
|---|---|
| Time | O(n log n) |
| Space | O(n) |
See It
A small animated SVG that illustrates the core idea: the moving parts you need to keep in your head as you read the rest of the guide.
The two sorted halves are merged in linear time using two read pointers.
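The two-pointer merge can be sketched as follows. This is a minimal illustration, and `mergeSortedHalves` is a hypothetical helper name, not something defined elsewhere in this guide:

```typescript
// Hypothetical helper illustrating the linear two-pointer merge.
function mergeSortedHalves(left: number[], right: number[]): number[] {
  const out: number[] = []
  let i = 0
  let j = 0
  // Each iteration copies exactly one element, so the merge is O(left.length + right.length).
  while (i < left.length && j < right.length) {
    // <= takes equal keys from `left` first, which is what makes merge sort stable.
    if (left[i] <= right[j]) out.push(left[i++])
    else out.push(right[j++])
  }
  // At most one of these slices is non-empty.
  return [...out, ...left.slice(i), ...right.slice(j)]
}

console.log(mergeSortedHalves([2, 4, 5, 8], [1, 3, 6, 7])) // [1, 2, 3, 4, 5, 6, 7, 8]
```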
Watch First
A quick visual walkthrough before the detailed sections. Pick the format that helps you learn fastest.
Mechanics
The building blocks and terminology you need before everything else clicks. Skim or deep-read.
Divide step
What it is Recursively split the list into two halves.
Why it matters Halving the input is what bounds the recursion depth at log n levels.
Conquer step
What it is Sort each half independently.
Why it matters The halves can be sorted without looking at each other, which keeps the recursion simple and makes parallelization straightforward.
Merge step
What it is Linear merge of two sorted halves.
Why it matters All the comparison work happens here: O(n) per level across log n levels gives the O(n log n) total.
Temporary buffer
What it is Auxiliary space used during merge.
Why it matters This buffer is the source of merge sort's O(n) extra space and its main cost compared with in-place sorts.
Base case
What it is A single-element list is already sorted.
Why it matters The recursion must bottom out somewhere; one-element lists end it without doing any work.
Divide and conquer
What it is Break a problem into subproblems, solve them, then combine the results.
Why it matters Merge sort is the textbook instance of this strategy, and recognizing the pattern transfers to many other algorithms.
Stability
What it is Equal keys keep their original relative ordering.
Why it matters Stability lets you sort by one key, then another, without scrambling earlier passes. Merge sort provides it; heap sort and typical quick sort do not.
External sort
What it is Sort chunks that fit in memory, then merge them, for data that exceeds memory.
Why it matters The merge step works on streams, so merge sort extends naturally to disk-resident data.
Recurrence
What it is T(n) = 2T(n/2) + O(n), which solves to O(n log n).
Why it matters The recurrence is how you prove the runtime rather than just asserting it.
Auxiliary space
What it is Extra memory used beyond the original array.
Why it matters Typical implementations need O(n) auxiliary space for merge buffers; truly in-place merging is possible but complicated and rarely worthwhile.
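The recurrence can be checked numerically. Here is a small sketch (the name `mergeSortWork` is an assumption for illustration) that sums the work T(n) = 2T(n/2) + n predicts and compares it against n · log₂ n:

```typescript
// Count the work predicted by the recurrence T(n) = 2T(n/2) + O(n).
function mergeSortWork(n: number): number {
  if (n <= 1) return 0 // base case: no merging needed
  const half = Math.floor(n / 2)
  // Two subproblems plus a linear merge over all n elements.
  return mergeSortWork(half) + mergeSortWork(n - half) + n
}

// For powers of two the total is exactly n * log2(n); in general it stays
// within a constant factor of it.
for (const n of [8, 64, 1024]) {
  console.log(n, mergeSortWork(n), n * Math.log2(n))
}
```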
Walkthrough
Every frame of the concept diagram, laid out top-to-bottom so you can scroll through the algorithm at your own pace.
Start: [5, 2, 8, 4, 1, 7, 3, 6]. Merge sort halves the array, recurses, then merges sorted halves.
Divide level 0 → [5, 2, 8, 4] and [1, 7, 3, 6].
Divide each half again → 4 pairs.
Divide once more → 8 singletons. Each is trivially sorted.
Merge pairs of singletons. [5]+[2]→[2,5], [8]+[4]→[4,8], [1]+[7]→[1,7] and [3]+[6]→[3,6] (already in order).
Merge pairs of pairs. [2,5]+[4,8]→[2,4,5,8] (via two pointers). [1,7]+[3,6]→[1,3,6,7].
Final merge of two halves of 4. Walk pointers 1<2 take 1, 2<3 take 2, 3<4 take 3 ... → [1, 2, 3, 4, 5, 6, 7, 8]. O(n log n).
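The walkthrough above can be reproduced as bottom-up passes: merge runs of width 1, then 2, then 4. This is a sketch under assumed names (`mergeRuns`, `runs`), not the guide's reference implementation:

```typescript
// Two-pointer merge of two sorted runs.
function mergeRuns(a: number[], b: number[]): number[] {
  const out: number[] = []
  let i = 0, j = 0
  while (i < a.length && j < b.length) out.push(a[i] <= b[j] ? a[i++] : b[j++])
  return [...out, ...a.slice(i), ...b.slice(j)]
}

// Start from the walkthrough's 8 singletons, each trivially sorted.
let runs: number[][] = [5, 2, 8, 4, 1, 7, 3, 6].map((x) => [x])
while (runs.length > 1) {
  const next: number[][] = []
  for (let k = 0; k < runs.length; k += 2) {
    next.push(k + 1 < runs.length ? mergeRuns(runs[k], runs[k + 1]) : runs[k])
  }
  console.log(next) // each pass prints one walkthrough level
  runs = next
}
// runs[0] is now [1, 2, 3, 4, 5, 6, 7, 8]
```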
Tradeoffs
When this technique wins, when it loses, and what to reach for instead.
Choose this when Worst-case guarantees and stability matter.
Tradeoff Usually needs more memory than in-place quick sort.
Choose this when You want stable behavior and simpler parallel merging.
Tradeoff Heap sort uses less memory but is not stable by default.
Choose this when Sorting any medium or large dataset with a general-purpose algorithm.
Tradeoff Bubble sort is simpler but far too slow at scale.
In Production
Where this shows up at real companies and what the on-call engineer learned the hard way.
Database operators use merge-style sorting and merging in query execution plans for large ordered outputs.
Takeaway Predictable ordered processing is critical for database consistency and performance.
Large-scale jobs split data, sort partitions, and merge results across distributed nodes.
Takeaway Merge-centric workflows are foundational in big-data pipelines.
Indexing pipelines rely on sorted segment merges to build and compact search indexes efficiently.
Takeaway Merge patterns scale well when data arrives in batches.
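The segment-merge pattern generalizes the two-way merge to k sorted inputs. A minimal sketch (the helper name `kWayMerge` is an assumption; production systems use a min-heap rather than this linear scan):

```typescript
// k-way merge of already-sorted segments, the pattern index compaction
// and external sorting build on.
function kWayMerge(segments: number[][]): number[] {
  const positions = segments.map(() => 0) // one read pointer per segment
  const out: number[] = []
  while (true) {
    let best = -1
    // Linear scan for the smallest head; a min-heap would make this O(log k) per element.
    for (let s = 0; s < segments.length; s++) {
      if (
        positions[s] < segments[s].length &&
        (best === -1 || segments[s][positions[s]] < segments[best][positions[best]])
      ) {
        best = s
      }
    }
    if (best === -1) break // every segment is exhausted
    out.push(segments[best][positions[best]++])
  }
  return out
}

console.log(kWayMerge([[1, 4, 9], [2, 3, 8], [5, 6, 7]])) // [1, 2, 3, 4, 5, 6, 7, 8, 9]
```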
Code
A clean reference implementation in TypeScript.
Recursively split to single elements, then merge sorted halves in linear time.
Complexity: Time O(n log n), Space O(n)
Recursive merge sort
```typescript
function mergeSort(values: number[]): number[] {
  // Base case: zero or one element is already sorted.
  if (values.length <= 1) return values
  // Divide: split at the midpoint and sort each half recursively.
  const midIndex = Math.floor(values.length / 2)
  const leftHalf = mergeSort(values.slice(0, midIndex))
  const rightHalf = mergeSort(values.slice(midIndex))
  // Merge: walk both sorted halves with two read pointers.
  const merged: number[] = []
  let leftIndex = 0
  let rightIndex = 0
  while (leftIndex < leftHalf.length && rightIndex < rightHalf.length) {
    // <= takes equal keys from the left half first, preserving stability.
    if (leftHalf[leftIndex] <= rightHalf[rightIndex]) merged.push(leftHalf[leftIndex++])
    else merged.push(rightHalf[rightIndex++])
  }
  // Append whichever half still has elements (at most one does).
  return [...merged, ...leftHalf.slice(leftIndex), ...rightHalf.slice(rightIndex)]
}
```
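To see the stability guarantee in action you need records with equal keys, which plain numbers cannot show. Here is a sketch of a generic variant with a comparator (the name `mergeSortBy` is an assumption, not part of the reference implementation above):

```typescript
// Same algorithm with a comparator, to demonstrate stability on records.
function mergeSortBy<T>(items: T[], cmp: (a: T, b: T) => number): T[] {
  if (items.length <= 1) return items
  const mid = Math.floor(items.length / 2)
  const left = mergeSortBy(items.slice(0, mid), cmp)
  const right = mergeSortBy(items.slice(mid), cmp)
  const out: T[] = []
  let i = 0, j = 0
  // <= 0: take from the left half on ties, preserving original relative order.
  while (i < left.length && j < right.length) {
    out.push(cmp(left[i], right[j]) <= 0 ? left[i++] : right[j++])
  }
  return [...out, ...left.slice(i), ...right.slice(j)]
}

// Equal keys keep their input order: "b" stays before "d", "a" before "c".
const sorted = mergeSortBy(
  [{ k: 2, v: "a" }, { k: 1, v: "b" }, { k: 2, v: "c" }, { k: 1, v: "d" }],
  (a, b) => a.k - b.k,
)
console.log(sorted.map((x) => x.v).join("")) // "bdac"
```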
Watch Out
The bugs and edge cases that bite engineers most often. Skim before you ship.
Pro Tips
Small habits that pay off every time you reach for this pattern.
O(n log n) baseline when predictability matters.
Pattern Recognition
Concrete signals that tell you this is the right pattern both at work and on LeetCode.
O(n log n) behavior is needed and extra memory is acceptable.
Practice
A curated easy-to-hard problem ladder. Each one reinforces a specific aspect of the pattern.
| # | Problem | Difficulty | Typical Complexity |
|---|---|---|---|
| 1 | Merge Sorted Array | Easy | O(n+m) |
| 2 | Sort an Array | Medium | O(n log n) |
| 3 | Sort List | Medium | O(n log n) |
| 4 | Merge Intervals | Medium | O(n log n) |
| 5 | Kth Largest Element in an Array | Medium | O(n) avg with quickselect |
| 6 | Largest Number | Medium | O(n log n) |
| 7 | Meeting Rooms II | Medium | O(n log n) |
| 8 | Count of Smaller Numbers After Self | Hard | O(n log n) |
| 9 | Reverse Pairs | Hard | O(n log n) |
| 10 | Maximum Gap | Hard | O(n) buckets optimal |