Async Iterators and for-await-of

Async iterators extend the iterator protocol to work with Promises. for-await-of lets you consume streams, paginated APIs, and async sequences with the same clean loop syntax as for-of, no manual .next() calls needed.

The Protocol

The async iterable protocol mirrors the sync version: replace Symbol.iterator with Symbol.asyncIterator and make next() return a Promise. async function* handles all of this automatically.

The async iterable protocol

An async iterable implements [Symbol.asyncIterator](), which returns an object whose next() method returns a Promise that resolves to { value, done }. for await...of calls this protocol automatically.

// Manual async iterable
const asyncRange = {
  from: 1,
  to: 3,

  [Symbol.asyncIterator]() {
    let current = this.from
    const last = this.to
    return {
      async next() {
        await new Promise(r => setTimeout(r, 100))  // simulate async
        return current <= last
          ? { value: current++, done: false }
          : { value: undefined, done: true }
      }
    }
  }
}

for await (const n of asyncRange) {
  console.log(n)  // 1, 2, 3  (100ms apart)
}
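
Under the hood, for await...of simply calls next() and awaits each returned Promise until done is true. A manual sketch of the same consumption, using the asyncRange object above:

// What for await...of does for you: call next(), await each result, stop on done
const iterator = asyncRange[Symbol.asyncIterator]()
let result = await iterator.next()
while (!result.done) {
  console.log(result.value)   // 1, 2, 3  (100ms apart)
  result = await iterator.next()
}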

Async generators the easy way

async function* combines generators and async/await. You can await inside the generator and yield values; JavaScript handles the Promise wrapping automatically.

async function* fetchUsers(ids) {
  for (const id of ids) {
    const res  = await fetch(`/api/users/${id}`)
    const user = await res.json()
    yield user
  }
}

for await (const user of fetchUsers([1, 2, 3])) {
  console.log(user.name)  // fetched one at a time, in order
}
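
One detail worth calling out: if anything inside the generator rejects (a network error from fetch, invalid JSON from res.json()), the rejection propagates out of the loop, so a plain try/catch around for await...of handles it. A minimal sketch:

// A rejected await inside fetchUsers surfaces in the consuming loop
try {
  for await (const user of fetchUsers([1, 2, 3])) {
    console.log(user.name)
  }
} catch (err) {
  console.error("iteration stopped at the first failure:", err)
}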

Streaming Patterns

Async iterators are the natural fit for data that arrives over time: file reads, paginated APIs, and streaming HTTP responses.

Streaming a large file line by line

Node.js Readable streams implement Symbol.asyncIterator. You can consume them with for await...of to process data chunk by chunk without loading everything into memory.

import { createReadStream } from "fs"
import { createInterface } from "readline"

async function* readLines(filePath) {
  const stream = createReadStream(filePath, { encoding: "utf8" })
  const rl = createInterface({ input: stream })

  for await (const line of rl) {
    yield line
  }
}

let lineCount = 0
for await (const line of readLines("./large.log")) {
  if (line.includes("ERROR")) {
    console.log(line)
  }
  lineCount++
}
console.log(`Processed ${lineCount} lines`)
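
Breaking out early is safe here: leaving a for await...of loop calls the iterator's return(), and in modern Node the stream and readline iterators then close their underlying resources. A quick sketch, stopping at the first matching line:

// Stop at the first FATAL entry; break triggers the iterator's return(),
// which closes the readline interface and file stream behind readLines()
for await (const line of readLines("./large.log")) {
  if (line.includes("FATAL")) {
    console.log("first fatal entry:", line)
    break
  }
}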

Paginated API with async generator

An async generator hides pagination completely. The consumer just iterates; the generator fetches the next page whenever needed. Clean separation of concerns.

async function* paginate(url) {
  while (url) {
    const res  = await fetch(url)
    const data = await res.json()

    for (const item of data.items) {
      yield item                     // one item at a time
    }

    url = data.links?.next ?? null   // null → stop
  }
}

for await (const product of paginate("/api/products")) {
  await indexProduct(product)
}

Consuming a ReadableStream (Web Streams API)

A ReadableStream (e.g. from fetch().body) can be read with for await...of in Node 18+ and in browsers that have shipped async iteration for streams; older environments need the manual reader loop shown below. Great for streaming AI responses or large downloads.

async function streamText(url) {
  const res     = await fetch(url)
  const decoder = new TextDecoder()

  // Modern syntax: ReadableStream is async iterable
  for await (const chunk of res.body) {
    process.stdout.write(decoder.decode(chunk, { stream: true }))
  }
}

// Or manually, with a reader (for environments without async iteration).
// Note: getReader() locks the stream, so don't mix this with for await...of
// on the same body.
async function streamTextManually(url) {
  const res     = await fetch(url)
  const reader  = res.body.getReader()
  const decoder = new TextDecoder()

  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    process.stdout.write(decoder.decode(value, { stream: true }))
  }
}

Composition Utilities

Build reusable helpers for collecting, mapping, and filtering async iterables: the same mental model as array methods, but lazy and non-blocking.

Collecting async iterable to array

Array.from does not work with async iterables. Use a simple helper or for await...of to collect values.

async function collect(asyncIterable) {
  const results = []
  for await (const item of asyncIterable) {
    results.push(item)
  }
  return results
}

const users = await collect(fetchUsers([1, 2, 3]))
console.log(users)  // [{ id: 1, ... }, { id: 2, ... }, { id: 3, ... }]
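
Newer runtimes also ship Array.fromAsync, which does the same job as collect; treat it as an option where your target environments support it (recent Node and browsers).

// Built-in equivalent of collect(), where Array.fromAsync is available
const sameUsers = await Array.fromAsync(fetchUsers([1, 2, 3]))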

Mapping and filtering async iterables

You can compose async generators to build pipeline-style transforms, like Array.prototype.map and Array.prototype.filter but for streams, without buffering everything into memory.

async function* map(iterable, fn) {
  for await (const item of iterable) {
    yield fn(item)
  }
}

async function* filter(iterable, predicate) {
  for await (const item of iterable) {
    if (await predicate(item)) yield item
  }
}

// Pipeline: fetch → filter active users → map to names
const names = await collect(
  map(
    filter(fetchUsers([1, 2, 3, 4]), u => u.active),
    u => u.name
  )
)
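
Because everything here is lazy, a limiting helper composes naturally: once it stops asking for items, the for await...of loop inside it exits, return() propagates upstream, and the source generators stop fetching. A sketch of a take() helper (not part of the utilities above):

// Yield at most `limit` items, then finish; returning early closes the
// upstream iterator automatically
async function* take(iterable, limit) {
  if (limit <= 0) return
  let count = 0
  for await (const item of iterable) {
    yield item
    if (++count >= limit) return
  }
}

// Only as many users are fetched as needed to produce two names
const firstTwoNames = await collect(
  take(map(fetchUsers([1, 2, 3, 4, 5]), u => u.name), 2)
)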