
Queue (FIFO) Complete Guide

Data Structure • easy

Queues are FIFO (first-in, first-out): oldest item is processed first.

They are central to scheduling, buffering, stream processing, and breadth-first traversal.

If your system requires fairness and order preservation, a queue is usually the right abstraction.

Typical Complexity Baseline

Metric           Value
Enqueue/dequeue  O(1)

See It

Concept Diagram

A small animated SVG that illustrates the core idea: the moving parts you need to keep in your head as you read the rest of the guide.

Queue FIFO enqueue and dequeue

Items enter at the tail and leave from the head. A deque-backed queue gives O(1) at both ends.


Watch First

Video Explainer

A quick visual walkthrough before the detailed sections; pick the format that helps you learn fastest.

Mechanics

Core Concepts

The building blocks and terminology you need before everything else clicks. Skim or deep-read.

Front

What it is: The next item to dequeue.

Why it matters: Every read and removal happens at the front; tracking it correctly is what preserves FIFO order.

Rear

What it is: The newest enqueued item.

Why it matters: New items always attach at the rear, so arrival order is preserved without any explicit sorting.

Enqueue

What it is: Add an item at the rear.

Why it matters: Enqueue must stay O(1); if adding work is expensive, the queue itself becomes the bottleneck under load.

Dequeue

What it is: Remove the item at the front.

Why it matters: A naive array implementation makes dequeue O(n); keeping it O(1) is the main reason to use a proper queue structure.

Circular buffer

What it is: An array-based queue whose head and tail indices wrap around, giving O(1) operations without shifting elements.

Why it matters: It is the standard fixed-capacity implementation: cache-friendly, allocation-free after setup, and a natural fit for bounded queues.
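The wrapped-index idea can be sketched in a few lines of TypeScript (a minimal illustration; the `RingQueue` name and reject-on-full behavior are choices made for this sketch, not a library API):

```typescript
// Minimal fixed-capacity circular buffer: indices wrap instead of shifting.
class RingQueue<T> {
  private readonly buf: (T | undefined)[]
  private head = 0   // index of the next item to dequeue
  private count = 0  // number of stored items

  constructor(private readonly capacity: number) {
    this.buf = new Array<T | undefined>(capacity)
  }

  enqueue(value: T): boolean {
    if (this.count === this.capacity) return false // full: caller decides policy
    this.buf[(this.head + this.count) % this.capacity] = value
    this.count++
    return true
  }

  dequeue(): T | undefined {
    if (this.count === 0) return undefined
    const value = this.buf[this.head]
    this.buf[this.head] = undefined // release the reference
    this.head = (this.head + 1) % this.capacity // wrap around, no element shifting
    this.count--
    return value
  }

  size(): number {
    return this.count
  }
}
```

Because only two indices move, every operation is O(1) and the backing array is allocated once.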

FIFO

What it is: The first item in is the first item out.

Why it matters: FIFO is the queue's correctness contract; fairness and ordering guarantees all reduce to this property.

Backpressure

What it is: A signal to slow producers when consumers lag.

Why it matters: Without backpressure, a fast producer fills the queue faster than it drains and memory grows without bound.

Bounded queue

What it is: A queue with a maximum capacity to control memory growth.

Why it matters: Bounding the queue forces an explicit overflow policy (block, drop, or reject) instead of silent unbounded growth.
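One way to make bounded queues and backpressure concrete is an enqueue that reports rejection, so the producer knows to back off. This is a sketch, and reject-on-full is only one of several reasonable overflow policies:

```typescript
// Bounded queue sketch: offer() returns false when full, giving producers
// a backpressure signal they can react to (back off, drop, or retry).
class BoundedQueue<T> {
  private items: T[] = []

  constructor(private readonly maxSize: number) {}

  offer(value: T): boolean {
    if (this.items.length >= this.maxSize) return false // backpressure signal
    this.items.push(value)
    return true
  }

  poll(): T | undefined {
    // Array.shift is O(n); fine for a sketch, use a deque/ring in production.
    return this.items.shift()
  }

  size(): number {
    return this.items.length
  }
}
```

The `offer`/`poll` naming follows the common non-blocking queue convention; a blocking variant would instead make `offer` wait until space frees up.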

Deque

What it is: A double-ended queue supporting push and pop at both ends.

Why it matters: A deque can back a plain FIFO queue with O(1) at both ends, and it also powers sliding-window algorithms.

Throughput

What it is: Items processed per unit of time.

Why it matters: Throughput versus arrival rate tells you whether consumers are keeping up; watch it alongside queue depth to catch lag early.

Walkthrough

Putting It All Together

Every frame of the concept diagram, laid out top-to-bottom so you can scroll through the algorithm at your own pace.

Step 1/11
FIFO · O(1) at both ends with a deque

Empty print queue. Jobs enqueue at the tail; the printer dequeues from the head.

Step 2/11
Queue (head → tail): [job-1] · enqueue(job-1) → added

Alice submits job-1.

Step 3/11
Queue (head → tail): [job-1, job-2] · enqueue(job-2) → added

Bob submits job-2.

Step 4/11
Queue (head → tail): [job-1, job-2, job-3] · enqueue(job-3) → added

Carol submits job-3.

Step 5/11
Queue (head → tail): [job-2, job-3] · dequeue() → job-1

Printer pulls the next job from the head → job-1 prints first (FIFO).

Step 6/11
Queue (head → tail): [job-2, job-3, job-4] · enqueue(job-4) → added

Dave submits job-4 while job-1 is printing.

Step 7/11
Queue (head → tail): [job-3, job-4] · dequeue() → job-2

Printer pulls job-2.

Step 8/11
Queue (head → tail): [job-3, job-4, job-5] · enqueue(job-5) → added

Eve submits job-5.

Step 9/11
Queue (head → tail): [job-4, job-5] · dequeue() → job-3

Printer pulls job-3.

Step 10/11
Queue (head → tail): [job-5] · dequeue() → job-4

Printer pulls job-4.

Step 11/11
Queue (head → tail): [] · dequeue() → job-5

Printer pulls job-5. Queue drained. Both ends are O(1) with a deque.
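The eleven steps above can be replayed in a few lines (a sketch using a plain array as the queue; note that `Array.shift` is O(n) and is used here only for brevity):

```typescript
// Replay the print-queue walkthrough: enqueues and dequeues interleave,
// yet jobs still print strictly in arrival order (FIFO).
const queue: string[] = []
const printed: string[] = []

queue.push("job-1")          // Alice submits job-1
queue.push("job-2")          // Bob submits job-2
queue.push("job-3")          // Carol submits job-3
printed.push(queue.shift()!) // printer pulls job-1 (FIFO)
queue.push("job-4")          // Dave submits job-4 while job-1 prints
printed.push(queue.shift()!) // printer pulls job-2
queue.push("job-5")          // Eve submits job-5
printed.push(queue.shift()!) // printer pulls job-3
printed.push(queue.shift()!) // printer pulls job-4
printed.push(queue.shift()!) // printer pulls job-5; queue drained

// printed now lists job-1 through job-5 in exact arrival order.
```

Even though submissions and prints interleave, no later job ever jumps ahead of an earlier one.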

Tradeoffs

How It Compares

When this technique wins, when it loses, and what to reach for instead.

vs. Stack

Choose this when: fairness and arrival order matter.

Tradeoff: A stack is better for nested, latest-first behavior.

vs. Priority Queue

Choose this when: strict arrival order must be respected.

Tradeoff: A priority queue reorders by priority rather than arrival time.

vs. Direct synchronous call

Choose this when: you need to decouple producer and consumer speeds.

Tradeoff: A queue adds operational complexity and eventual-consistency concerns.
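The queue-versus-stack contrast is easiest to see side by side. A tiny sketch with plain arrays, same insertion order in both cases:

```typescript
// Same insertion order, different removal order.
const items = ["a", "b", "c"]

// Queue (FIFO): remove from the front.
const queue = [...items]
const fifo: string[] = []
while (queue.length > 0) fifo.push(queue.shift()!)

// Stack (LIFO): remove from the top.
const stack = [...items]
const lifo: string[] = []
while (stack.length > 0) lifo.push(stack.pop()!)

// fifo preserves arrival order ("a", "b", "c");
// lifo reverses it ("c", "b", "a").
```

If the problem is "serve requests in the order they arrived", that reversal is exactly the bug you avoid by choosing a queue.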

In Production

Real-World Stories

Where this shows up at real companies and what the on-call engineer learned the hard way.

Amazon

Order and payment workflows rely on queue-backed async processing to smooth traffic spikes.

Takeaway Queue decoupling protects core systems during burst load.

Uber

Event pipelines use queue semantics for telemetry and asynchronous processing jobs.

Takeaway Ordered buffering keeps distributed processing reliable.

Slack

Message/event delivery pipelines use queue-like buffering to handle variable consumer lag.

Takeaway Queues improve resilience under uneven traffic and temporary outages.

Code

Implementation

A clean reference implementation, shown here in TypeScript.

Use a deque-like structure (or an advancing head index) for O(1) enqueue/dequeue operations.

Complexity: Time O(1) amortized per operation

Queue with deque for task scheduling

class TaskQueue<T> {
  private values: T[] = []
  private head = 0 // index of the current front; avoids O(n) Array.shift

  enqueue(value: T): void {
    this.values.push(value)
  }

  dequeue(): T | undefined {
    if (this.head >= this.values.length) return undefined
    const value = this.values[this.head++]
    // Compact occasionally so consumed slots do not accumulate forever.
    if (this.head > 32 && this.head * 2 >= this.values.length) {
      this.values = this.values.slice(this.head)
      this.head = 0
    }
    return value
  }

  size(): number {
    return this.values.length - this.head
  }
}

Watch Out

Common Problems and Failure Modes

The bugs and edge cases that bite engineers most often. Skim before you ship.

  • Unbounded queues causing memory blowups during traffic bursts.
  • Ignoring retry/idempotency semantics in queue consumers.
  • Dropping order guarantees accidentally in multi-worker setups.
  • Using list pop(0)-style operations that degrade performance.
  • No visibility into queue depth and consumer lag.

Pro Tips

Tips and Tricks

Small habits that pay off every time you reach for this pattern.

  • Queue is the default for fair first-in-first-out processing.
  • BFS problems usually become cleaner when queue items carry both node and depth.
  • Use bounded queues in real systems to avoid unbounded memory growth.
  • For fixed-size windows, a deque often models the problem more directly than a plain queue.
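The (node, depth) tip above can be sketched as a BFS over an adjacency-list graph. This is an illustrative sketch; the `bfsDepths` name and the example graph are made up for the demo:

```typescript
// BFS where each queue entry carries both the node and its depth,
// so distance/level information never has to be reconstructed later.
function bfsDepths(graph: Map<string, string[]>, start: string): Map<string, number> {
  const depth = new Map<string, number>([[start, 0]])
  const queue: Array<[string, number]> = [[start, 0]]
  let head = 0 // advancing head index keeps dequeue O(1) (no Array.shift)

  while (head < queue.length) {
    const [node, d] = queue[head++]
    for (const next of graph.get(node) ?? []) {
      if (!depth.has(next)) {       // visited check before enqueueing
        depth.set(next, d + 1)
        queue.push([next, d + 1])
      }
    }
  }
  return depth
}
```

Because entries carry their depth, shortest-path-in-unweighted-graph and level-order problems fall out of the same loop with no extra bookkeeping.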

Pattern Recognition

When to Use

Concrete signals that tell you this is the right pattern both at work and on LeetCode.

Real-system usage signals

  • Predictable processing order across asynchronous workloads.
  • Natural model for producer-consumer pipelines.
  • Supports BFS and level-order traversal in graphs/trees.

LeetCode-specific tips (pattern-identification signals)

  • Identification signal: tasks/events must be handled in arrival order (FIFO), or traversal is explicitly level by level.
  • Identification signal: shortest path in an unweighted graph suggests BFS queue mechanics.
  • If the prompt says "process one layer at a time" or "time step per wave", queue is often correct.
  • For Queue (FIFO) questions, start by naming the core invariant before writing code.
  • Use the constraint section to set time/space target first, then pick the data structure/algorithm.
  • Solve one tiny example by hand and map each transition to your variables before implementing.
  • Run adversarial cases: empty input, duplicates, max-size input, and sorted/reverse patterns when relevant.
  • During interviews, explain why your approach is the right pattern for this prompt, not just why the code works.

Practice

LeetCode Progression

A curated easy-to-hard problem ladder. Each one reinforces a specific aspect of the pattern.

#   Problem                            Difficulty  Typical Complexity
1   Implement Queue using Stacks       Easy        Amortized O(1)
2   Number of Recent Calls             Easy        Amortized O(1)
3   Moving Average from Data Stream    Easy        O(1)
4   Binary Tree Level Order Traversal  Medium      O(n)
5   Rotting Oranges                    Medium      O(m*n)
6   Open the Lock                      Medium      O(V+E)
7   Perfect Squares                    Medium      O(n*sqrt(n))
8   Shortest Path in Binary Matrix     Medium      O(n^2)
9   Sliding Window Maximum             Hard        O(n)
10  Bus Routes                         Hard        O(V+E)