Iterator and Generator Patterns
The Hidden Protocol Behind Everything
When you write for...of, spread syntax, destructuring, or Array.from(), JavaScript doesn't care what the data source is — an array, a string, a Map, a Set, or something you invented. It only cares about one thing: does the object implement the iterator protocol?
for (const item of myCollection) { /* ... */ }
const copy = [...myCollection];
const [first, second] = myCollection;
All three statements call the same protocol under the hood. If you understand this protocol, you can make any object work with these syntaxes — lazy infinite sequences, paginated API results, file streams, tree traversals. You control how data flows out, one value at a time.
Think of an iterator like a bookmark in a book. The book (iterable) contains all the content, and the bookmark (iterator) tracks where you are. Each time you call next(), the bookmark advances to the next page and gives you that page's content. When you reach the end, it tells you the book is done. Multiple bookmarks can exist in the same book at different positions. And importantly — the book doesn't need to have all its pages printed in advance. It could be generating them as you read (lazy evaluation).
The Iterator Protocol
An object is an iterator if it has a next() method that returns { value, done }. An object is an iterable if it has a [Symbol.iterator]() method that returns an iterator.
class Range {
constructor(private start: number, private end: number) {}
[Symbol.iterator](): Iterator<number> {
let current = this.start;
const end = this.end;
return {
next(): IteratorResult<number> {
if (current <= end) {
return { value: current++, done: false };
}
return { value: undefined, done: true };
},
};
}
}
const range = new Range(1, 5);
for (const num of range) {
console.log(num); // 1, 2, 3, 4, 5
}
console.log([...range]); // [1, 2, 3, 4, 5]
const [a, b, c] = range; // a=1, b=2, c=3
The Range class is iterable. It works with for...of, spread, destructuring, and Array.from() — all because it implements [Symbol.iterator]().
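You can also drive the protocol by hand. A minimal inline iterable (no class required) makes the mechanics visible; the `counter` name here is just for illustration:

```typescript
// An inline iterable that counts 1..3 (illustrative name: counter).
const counter = {
  [Symbol.iterator]() {
    let n = 0;
    return {
      next: () =>
        n < 3
          ? { value: ++n, done: false }
          : { value: undefined, done: true },
    };
  },
};

// Driving the iterator manually, which is exactly what for...of does:
const it = counter[Symbol.iterator]();
console.log(it.next()); // { value: 1, done: false }
console.log(it.next()); // { value: 2, done: false }
console.log(it.next()); // { value: 3, done: false }
console.log(it.next()); // { value: undefined, done: true }
```

Each call to `[Symbol.iterator]()` hands out a fresh bookmark, so two loops over `counter` never interfere with each other.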
Generators — Iterators Made Easy
Writing iterators manually is verbose. Generators are syntactic sugar for creating iterators with much less boilerplate:
function* range(start: number, end: number): Generator<number> {
for (let i = start; i <= end; i++) {
yield i;
}
}
for (const num of range(1, 5)) {
console.log(num); // 1, 2, 3, 4, 5
}
The function* syntax and yield keyword do all the state management for you. Each yield pauses the function, returns a value, and resumes from that exact point on the next next() call. The execution context is preserved between yields — local variables, loop counters, everything.
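Calling next() by hand shows the pause/resume behavior directly; this small countdown generator is illustrative:

```typescript
function* countdown(from: number): Generator<number> {
  for (let i = from; i > 0; i--) {
    yield i; // execution pauses here; i and the loop state are preserved
  }
}

const gen = countdown(3);
console.log(gen.next()); // { value: 3, done: false }
console.log(gen.next()); // { value: 2, done: false }
console.log(gen.next()); // { value: 1, done: false }
console.log(gen.next()); // { value: undefined, done: true }
```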
Lazy Evaluation
Generators are lazy — they produce values one at a time, only when asked. This means you can create infinite sequences without blowing up memory:
function* fibonacci(): Generator<number> {
let a = 0;
let b = 1;
while (true) {
yield a;
[a, b] = [b, a + b];
}
}
function take<T>(iterable: Iterable<T>, count: number): T[] {
const result: T[] = [];
for (const item of iterable) {
result.push(item);
if (result.length >= count) break;
}
return result;
}
take(fibonacci(), 10);
// [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
The fibonacci generator runs forever, but take only pulls 10 values. No array of infinite size is ever created. This is the power of lazy evaluation — computation happens on demand.
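One way to verify the laziness is to count how many times the generator body actually runs. The `pulls` counter below is a made-up instrument for this sketch, and `take` is reproduced from above so the snippet stands alone:

```typescript
let pulls = 0;

function* naturals(): Generator<number> {
  let n = 1;
  while (true) {
    pulls++; // counts how many values the generator actually computed
    yield n++;
  }
}

// take as defined above
function take<T>(iterable: Iterable<T>, count: number): T[] {
  const result: T[] = [];
  for (const item of iterable) {
    result.push(item);
    if (result.length >= count) break;
  }
  return result;
}

console.log(take(naturals(), 3)); // [1, 2, 3]
console.log(pulls); // 3: the fourth value was never computed
```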
Composing Generators
Generators compose beautifully. You can build complex data pipelines from simple generator functions:
function* map<T, U>(iterable: Iterable<T>, fn: (item: T) => U): Generator<U> {
for (const item of iterable) {
yield fn(item);
}
}
function* filter<T>(iterable: Iterable<T>, predicate: (item: T) => boolean): Generator<T> {
for (const item of iterable) {
if (predicate(item)) yield item;
}
}
function* flatMap<T, U>(iterable: Iterable<T>, fn: (item: T) => Iterable<U>): Generator<U> {
for (const item of iterable) {
yield* fn(item);
}
}
function* chunk<T>(iterable: Iterable<T>, size: number): Generator<T[]> {
let batch: T[] = [];
for (const item of iterable) {
batch.push(item);
if (batch.length >= size) {
yield batch;
batch = [];
}
}
if (batch.length > 0) yield batch;
}
Now chain them into a lazy pipeline:
const numbers = range(1, 1_000_000);
const result = take(
map(
filter(numbers, n => n % 2 === 0),
n => n * n
),
5
);
// [4, 16, 36, 64, 100]
// Only 10 numbers were actually generated — not 1,000,000!
Each generator in the chain only produces values when the consumer asks. take(5) pulls from map, which pulls from filter, which pulls from range. The million-element range is never materialized.
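To make the pull order concrete, here is a two-stage chain instrumented with a trace array (the stage names are illustrative):

```typescript
const trace: string[] = [];

function* source(): Generator<number> {
  for (let i = 1; i <= 100; i++) {
    trace.push(`source:${i}`);
    yield i;
  }
}

function* double(items: Iterable<number>): Generator<number> {
  for (const n of items) {
    trace.push(`double:${n}`);
    yield n * 2;
  }
}

const chain = double(source());
chain.next(); // pulls value 1 through both stages
chain.next(); // pulls value 2 through both stages
console.log(trace);
// ["source:1", "double:1", "source:2", "double:2"]
```

The stages interleave per value: source never advances past 2, let alone 100.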
Async Generators — Streaming Data
Async generators combine generators with async/await for streaming data from APIs, WebSockets, or files:
async function* fetchPages<T>(
baseUrl: string,
pageSize: number
): AsyncGenerator<T[]> {
let page = 1;
let hasMore = true;
while (hasMore) {
const response = await fetch(
`${baseUrl}?page=${page}&limit=${pageSize}`
);
const data = await response.json();
yield data.items as T[];
hasMore = data.items.length === pageSize;
page++;
}
}
async function* flattenPages<T>(
pages: AsyncGenerator<T[]>
): AsyncGenerator<T> {
for await (const page of pages) {
yield* page;
}
}
Usage with for await...of:
const users = fetchPages<{ id: string; name: string }>("/api/users", 20);
for await (const page of users) {
console.log(`Got ${page.length} users`);
renderUsers(page);
}
// Or flatten and process individually:
const allUsers = flattenPages(fetchPages("/api/users", 20));
for await (const user of allUsers) {
console.log(user.name);
}
The beauty: pages are fetched one at a time, on demand. If the consumer breaks out of the loop after 2 pages, the third page is never fetched. No wasted network requests.
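That early-exit behavior is worth seeing concretely. When for await...of exits via break, it calls the generator's return() method, which runs any pending finally block. A sketch with a simulated page source (fakePages is a stand-in for fetchPages, no network involved):

```typescript
let cleanedUp = false;

// Simulated endless paginated source; stands in for a real fetch loop.
async function* fakePages(): AsyncGenerator<number[]> {
  let page = 0;
  try {
    while (true) {
      page++;
      yield [page * 10, page * 10 + 1];
    }
  } finally {
    cleanedUp = true; // runs via return() when the consumer breaks
  }
}

async function main(): Promise<void> {
  const received: number[][] = [];
  for await (const page of fakePages()) {
    received.push(page);
    if (received.length === 2) break; // page 3 is never produced
  }
  console.log(received);  // [[10, 11], [20, 21]]
  console.log(cleanedUp); // true: finally ran on early exit
}

main();
```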
Production Scenario: Event Stream Processing
Async generators are perfect for processing real-time event streams:
async function* fromWebSocket<T>(url: string): AsyncGenerator<T> {
const ws = new WebSocket(url);
const queue: T[] = [];
let resolve: (() => void) | null = null;
let done = false;
ws.onmessage = (event) => {
queue.push(JSON.parse(event.data));
resolve?.();
};
ws.onclose = () => {
done = true;
resolve?.();
};
try {
await new Promise<void>((r) => { ws.onopen = () => r(); });
while (!done) {
if (queue.length === 0) {
await new Promise<void>((r) => { resolve = r; });
resolve = null;
}
while (queue.length > 0) {
yield queue.shift()!;
}
}
} finally {
ws.close();
}
}
async function* throttle<T>(
source: AsyncGenerator<T>,
ms: number
): AsyncGenerator<T> {
let lastYield = 0;
for await (const item of source) {
const now = Date.now();
if (now - lastYield >= ms) {
yield item;
lastYield = now;
}
}
}
const messages = fromWebSocket<{ type: string; data: unknown }>(
"wss://stream.example.com"
);
const throttled = throttle(messages, 100);
for await (const msg of throttled) {
updateUI(msg);
}
Tree Traversal with Generators
Generators make tree traversal elegant — recursive traversal with yield*:
interface TreeNode<T> {
value: T;
children: TreeNode<T>[];
}
function* depthFirst<T>(node: TreeNode<T>): Generator<T> {
yield node.value;
for (const child of node.children) {
yield* depthFirst(child);
}
}
function* breadthFirst<T>(root: TreeNode<T>): Generator<T> {
const queue: TreeNode<T>[] = [root];
while (queue.length > 0) {
const node = queue.shift()!;
yield node.value;
queue.push(...node.children);
}
}
const tree: TreeNode<string> = {
value: "root",
children: [
{ value: "a", children: [
{ value: "a1", children: [] },
{ value: "a2", children: [] },
]},
{ value: "b", children: [
{ value: "b1", children: [] },
]},
],
};
console.log([...depthFirst(tree)]);
// ["root", "a", "a1", "a2", "b", "b1"]
console.log([...breadthFirst(tree)]);
// ["root", "a", "b", "a1", "a2", "b1"]
The yield* syntax delegates to another generator, making recursive structures trivial to iterate.
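Note that yield* is not limited to generators; it delegates to any iterable, including strings and arrays. A tiny illustration (letters is a made-up name):

```typescript
function* letters(): Generator<string> {
  yield* "ab";       // delegates to the string's iterator
  yield* ["c", "d"]; // and to the array's
}

console.log([...letters()]); // ["a", "b", "c", "d"]
```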
| What developers do | What they should do |
|---|---|
| Collecting an entire async generator into an array before processing. This defeats the purpose of streaming: if you await all pages of a paginated API into an array, you waste memory and make the user wait for ALL pages before seeing ANY data | Process items as they arrive with for await...of to maintain backpressure, better UX, and memory efficiency |
| Forgetting to handle generator cleanup (the finally block). When a consumer breaks out of a for...of loop early, the generator's return() method is called. If your generator holds resources, the finally block runs during return(), giving you a chance to clean up. Without it, WebSocket connections and file handles leak | Use try/finally in generators that hold resources like WebSocket connections or file handles |
| Using generators for simple array transformations. Array methods are optimized in V8 and produce readable code for in-memory data. Generators add overhead (context switching, iterator protocol) that is only justified when you need laziness (huge/infinite datasets) or async streaming | Use array methods (map, filter, reduce) for in-memory arrays. Use generators for lazy evaluation, infinite sequences, or async streams |
Challenge
Build a lazy pipeline class that chains map, filter, and take operations on any iterable, executing lazily only when consumed.
Try to solve it before peeking at the answer.
// Requirements:
// 1. LazyPipeline wraps any iterable
// 2. .map(), .filter(), .take() return new LazyPipeline instances
// 3. Nothing executes until .toArray() or for...of
// 4. Each step is lazy — only processes what the next step needs
// const result = LazyPipeline.from(range(1, 1_000_000))
// .filter(n => n % 3 === 0)
// .map(n => n * n)
// .take(5)
// .toArray();
// // [9, 36, 81, 144, 225]
// // Only ~15 numbers generated from range, not 1,000,000

Key Takeaways
1. The iterator protocol (Symbol.iterator + next) powers for...of, spread, destructuring, and Array.from — implement it to make any object work with these syntaxes
2. Generators (function* with yield) are syntactic sugar for creating iterators — they preserve local state between yields automatically
3. Generators are lazy by default — values are produced one at a time, on demand, making infinite sequences and huge datasets memory-efficient
4. Use async generators (async function* with yield) for streaming data: paginated APIs, WebSocket messages, file reading
5. yield* delegates to another iterable, making recursive structures like tree traversal clean and composable
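One possible solution to the challenge, sketched here as one design among several. It reuses the generator-based range from earlier (reproduced so the snippet stands alone); each chaining method wraps the current source in a new generator, so nothing runs until the pipeline is consumed:

```typescript
function* range(start: number, end: number): Generator<number> {
  for (let i = start; i <= end; i++) yield i;
}

// Sketch: each stage defers work by wrapping the source in a generator.
class LazyPipeline<T> implements Iterable<T> {
  private constructor(private source: Iterable<T>) {}

  static from<U>(source: Iterable<U>): LazyPipeline<U> {
    return new LazyPipeline(source);
  }

  map<U>(fn: (item: T) => U): LazyPipeline<U> {
    const source = this.source;
    return LazyPipeline.from(
      (function* () {
        for (const item of source) yield fn(item);
      })()
    );
  }

  filter(predicate: (item: T) => boolean): LazyPipeline<T> {
    const source = this.source;
    return LazyPipeline.from(
      (function* () {
        for (const item of source) if (predicate(item)) yield item;
      })()
    );
  }

  take(count: number): LazyPipeline<T> {
    const source = this.source;
    return LazyPipeline.from(
      (function* () {
        if (count <= 0) return;
        let taken = 0;
        for (const item of source) {
          yield item;
          if (++taken >= count) return; // stop without pulling an extra value
        }
      })()
    );
  }

  toArray(): T[] {
    return [...this.source]; // consuming here is what finally runs the chain
  }

  [Symbol.iterator](): Iterator<T> {
    return this.source[Symbol.iterator]();
  }
}

const result = LazyPipeline.from(range(1, 1_000_000))
  .filter((n) => n % 3 === 0)
  .map((n) => n * n)
  .take(5)
  .toArray();

console.log(result); // [9, 36, 81, 144, 225]
```

One caveat of this design: because it wraps single-use generator objects, each pipeline instance can only be consumed once.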