An Overview of Arrays and Memory (Data Structures & Algorithms #2)
Channel: CS Dojo
Data Structure • Easy
Arrays are contiguous memory blocks where every element has an index. That direct index access is why arrays are the baseline building block for most other structures.
They are used when you need predictable iteration speed, compact memory layout, and simple serialization across APIs, files, and databases.
Most array interview questions are really about how you traverse: one pass, two passes, two pointers, or sorting first.
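The simplest traversal style, one pass, is worth seeing concretely. A sketch using "Best Time to Buy and Sell Stock" from the practice ladder below: carry a little state forward and visit each element exactly once.

```typescript
// One pass: best profit from a single buy/sell.
// Track the lowest price seen so far and the best profit achievable at each step.
function maxProfit(prices: number[]): number {
  let lowestSoFar = Infinity
  let bestProfit = 0
  for (const price of prices) {
    if (price < lowestSoFar) lowestSoFar = price
    else if (price - lowestSoFar > bestProfit) bestProfit = price - lowestSoFar
  }
  return bestProfit
}

maxProfit([7, 1, 5, 3, 6, 4]) // 5: buy at 1, sell at 6
```

One linear scan, O(1) extra space: the shape most of the easy problems in the ladder share.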
Typical Complexity Baseline
| Metric | Value |
|---|---|
| Read by index | O(1) |
| Insert/delete in middle | O(n) |
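The table's two costs can be seen directly in code. A small sketch of my own (not from the guide): an indexed read is a single lookup, while a middle insert must shift every later element right by one.

```typescript
// O(1): read by index is a single lookup, independent of array size.
function readAt(values: number[], index: number): number {
  return values[index]
}

// O(n): inserting in the middle shifts every element after the insertion point.
function insertAt(values: number[], index: number, item: number): void {
  values.push(0) // make room for one more slot
  for (let i = values.length - 1; i > index; i--) {
    values[i] = values[i - 1] // shift right, one element at a time
  }
  values[index] = item
}

const nums = [3, 1, 4, 5, 9]
insertAt(nums, 2, 7) // nums is now [3, 1, 7, 4, 5, 9]
```

The shifting loop is exactly why middle inserts degrade as the array grows; the read does the same work for any index.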
See It
A small animated SVG that illustrates the core idea: the moving parts you need to keep in your head as you read the rest of the guide.
Boxes are contiguous memory cells. The highlight slides over them in O(1) time per access.
Watch First
A quick visual walkthrough before the detailed sections; pick the format that helps you learn fastest.
Mechanics
The building blocks and terminology you need before everything else clicks. Skim or deep-read.
- **Index**: Zero-based position of an element; the basis for O(1) access.
- **Capacity**: Number of allocated slots; dynamic arrays grow capacity when full.
- **Length**: Current number of valid items (always at most the capacity).
- **Contiguous memory**: Items are stored in adjacent memory, enabling fast scans and good cache behavior.
- **Resize strategy**: Most dynamic arrays grow geometrically (e.g. doubling) so that append cost amortizes to O(1).
- **Random access**: Reading element i directly, without scanning the previous elements.
- **Amortized O(1) append**: Average append cost stays constant over many pushes despite occasional resizes.
- **In-place**: The algorithm updates the same array rather than allocating a second structure.
- **Stable sort**: Equal values keep their original order after sorting.
- **Bounds check**: Runtime verification that an index lies within 0..length-1.

Each of these is a building block for reasoning about how an array-based solution stays correct and performant as inputs scale.
Walkthrough
Every frame of the concept diagram, laid out top-to-bottom so you can scroll through the algorithm at your own pace.
Array of 8 ints in contiguous memory. Each cell is the same size.
arr[0] = base + 0 × size → 3. One memory load.
arr[5] = base + 5 × size → 9. Same single load; random access is O(1).
arr[7] = base + 7 × size → 6. The index doesn't matter; every access is one hop.
arr[2] = 4. Compare with sequential search of a linked list, which would walk 3 nodes.
arr[4] = 5.
arr[1] = 1. Cache locality is excellent: neighbors are pre-fetched.
arr[6] = 2. Insert/delete in the middle is O(n) (shift everything), but lookup stays O(1).
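The address arithmetic in the frames above can be spelled out directly. A sketch assuming 4-byte ints and an arbitrary base address (both values are hypothetical, chosen only for the example):

```typescript
// Address of arr[i] in a contiguous block: base + i * elementSize.
function elementAddress(base: number, index: number, elementSize: number): number {
  return base + index * elementSize
}

const base = 0x1000 // hypothetical base address
const INT_SIZE = 4  // assuming 4-byte ints

elementAddress(base, 0, INT_SIZE) // 0x1000: first element
elementAddress(base, 5, INT_SIZE) // 0x1014: one multiply-add, no scanning
elementAddress(base, 7, INT_SIZE) // 0x101c: same cost for any index
```

One multiply and one add per access, regardless of the index: that constant-work formula is the whole reason random access is O(1).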
Tradeoffs
When this technique wins, when it loses, and what to reach for instead.
Choose arrays when reads by position dominate. Tradeoff: middle inserts/deletes are expensive because values shift.
Choose arrays when order and numeric indexing matter. Tradeoff: lookup by an arbitrary key is slower than a hash map lookup.
Choose arrays for full scans and vectorized operations. Tradeoff: arrays do not maintain priority ordering automatically.
In Production
Where this shows up at real companies and what the on-call engineer learned the hard way.
Large feed ranking pipelines process candidate posts as arrays/lists before downstream scoring stages.
Takeaway Array-friendly sequential processing keeps throughput high for ranking workloads.
Columnar analytics engines organize values in array-like vectors for scan-heavy SQL execution.
Takeaway Contiguous layouts reduce CPU cache misses during huge table scans.
Event batching systems often collect records in arrays before validation and persistence.
Takeaway Simple contiguous batches are easier to serialize and replay safely.
Code
A clean reference implementation of the core pattern in TypeScript.
Classic pattern for pair-sum style problems after sorting or when input is already sorted.
Complexity: Time O(n), Space O(1)
Two-pointer pair search in sorted array
```typescript
function hasPairWithSum(values: number[], target: number): boolean {
  let leftIndex = 0
  let rightIndex = values.length - 1
  while (leftIndex < rightIndex) {
    const currentSum = values[leftIndex] + values[rightIndex]
    if (currentSum === target) return true
    // Sorted input: a smaller sum needs a larger left value, and vice versa.
    if (currentSum < target) leftIndex += 1
    else rightIndex -= 1
  }
  return false
}
```
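When the input isn't sorted and you'd rather not pay for a sort, the same problem yields to one extra hash set. A sketch of my own (the function above restated here so the example is self-contained), trading O(n) extra space for a single O(n) pass:

```typescript
// O(n) time, O(n) space: for each value, check whether its complement
// (target - value) was already seen, then record the value itself.
function hasPairWithSumUnsorted(values: number[], target: number): boolean {
  const seen = new Set<number>()
  for (const value of values) {
    if (seen.has(target - value)) return true
    seen.add(value)
  }
  return false
}

hasPairWithSumUnsorted([8, 2, 11, 7], 9) // true  (2 + 7)
hasPairWithSumUnsorted([8, 2, 11, 7], 5) // false
```

This is the classic Two Sum shape from the practice ladder: the set answers "have I seen the complement?" in O(1) expected time, replacing the sorted-order invariant the two pointers relied on.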
Watch Out
The bugs and edge cases that bite engineers most often. Skim before you ship.
Pro Tips
Small habits that pay off every time you reach for this pattern.
If your solution is O(n^2), check whether one extra hash map can reduce it to O(n). Sorting first at O(n log n) cost often simplifies everything.

Pattern Recognition
Concrete signals that tell you this is the right pattern both at work and on LeetCode.
You need an O(n) or O(n log n) traversal over a linear collection, not graph-style modeling.

Practice
A curated easy-to-hard problem ladder. Each one reinforces a specific aspect of the pattern.
| # | Problem | Difficulty | Typical Complexity |
|---|---|---|---|
| 1 | Two Sum | Easy | O(n) |
| 2 | Best Time to Buy and Sell Stock | Easy | O(n) |
| 3 | Move Zeroes | Easy | O(n) |
| 4 | Product of Array Except Self | Medium | O(n) |
| 5 | Maximum Subarray | Medium | O(n) |
| 6 | Merge Intervals | Medium | O(n log n) |
| 7 | Set Matrix Zeroes | Medium | O(m*n) |
| 8 | Trapping Rain Water | Hard | O(n) |
| 9 | First Missing Positive | Hard | O(n) |
| 10 | Median of Two Sorted Arrays | Hard | O(log(min(m,n))) |