Generator functions pause at each yield and resume when the caller requests the next value. Nothing runs until you consume the iterator: a .next() call, a for...of loop (for await...of for async generators), or delegation via yield*. Each value is produced on demand, which means you can walk a huge directory tree without holding all the paths in memory at once.
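
To see the laziness concretely, here's a minimal sketch with a throwaway generator; nothing in the body executes until the first .next() call:

async function* numbers() {
  console.log('started'); // only runs once a value is requested
  yield 1;
  yield 2;
}

const it = numbers(); // nothing logged yet
await it.next();      // logs 'started', resolves to { value: 1, done: false }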

Directory traversal is a natural fit. The generator recurses into subdirectories with yield* and yields only the files matching a given extension filter:

import { readdir } from 'node:fs/promises';
import { extname, resolve } from 'node:path';

async function* walkDir(dir: string): AsyncGenerator<string> {
  // withFileTypes returns Dirent objects, so no extra stat call per entry
  const entries = await readdir(dir, { withFileTypes: true });
  for (const entry of entries) {
    const name = resolve(dir, entry.name);
    if (entry.isDirectory()) {
      yield* walkDir(name); // delegate: re-yield every match from the subtree
    } else if (filterFile(entry.name)) {
      yield name;
    }
  }
}

const filterFile = (file: string): boolean => {
  return ['.css', '.js', '.html', '.xml', '.cjs', '.mjs', '.svg', '.txt'].includes(
    extname(file),
  );
};

Because it’s async, the event loop stays unblocked between directory reads. You can also wrap consumption in try...catch, stop early by breaking out of the loop (shown in the next section), or pipe results directly into processing logic.

Consuming the generator

A for await...of loop is the most readable way to consume it:

for await (const file of walkDir('./src')) {
  console.log(file);
}
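
Breaking out of the loop ends the walk early: the runtime calls the generator's return() method, so no further directories are read. A quick sketch, stopping at the first HTML file:

for await (const file of walkDir('./src')) {
  if (file.endsWith('.html')) {
    console.log('first match:', file);
    break; // closes the generator; the rest of the tree is never visited
  }
}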

If you want to collect all results into an array first, Array.fromAsync works in Node 22+:

const files = await Array.fromAsync(walkDir('./src'));

Or fall back to the manual version for older runtimes:

const files: string[] = [];
for await (const file of walkDir('./src')) {
  files.push(file);
}

Error handling

readdir rejects if a directory isn’t readable, and await turns the rejection into a thrown error. That error bubbles up through the generator and can be caught at the call site:

try {
  for await (const file of walkDir('./src')) {
    handleFile(file); // stand-in for your per-file logic
  }
} catch (err) {
  console.error('Failed to walk directory:', err);
}

To skip unreadable subdirectories instead of stopping entirely, catch inside the generator:

async function* walkDir(dir: string): AsyncGenerator<string> {
  let entries;
  try {
    entries = await readdir(dir, { withFileTypes: true });
  } catch {
    return; // unreadable directory: skip this subtree, keep walking the rest
  }
  for (const entry of entries) {
    const name = resolve(dir, entry.name);
    if (entry.isDirectory()) {
      yield* walkDir(name);
    } else if (filterFile(entry.name)) {
      yield name;
    }
  }
}

Why not just readdir recursive?

You could collect everything eagerly; since Node 18.17/20.1, readdir even accepts a recursive: true option, so no manual recursion is needed (sketched below). The difference shows up at scale: a tree with tens of thousands of files is loaded into memory in full before you can process any of them, while the generator version handles each file as soon as it’s yielded, so memory usage stays flat regardless of tree size. For most projects this distinction doesn’t matter. For build tools or code analysis scripts running over large repos, it does.
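
For comparison, a sketch of the eager approach (assuming Node 20.1+, where readdir accepts recursive: true and returns paths relative to the starting directory; note the listing includes directory entries too, which the extension filter happens to weed out):

import { readdir } from 'node:fs/promises';
import { resolve } from 'node:path';

const relative = await readdir('./src', { recursive: true }); // buffers every path up front
const files = relative
  .filter(filterFile) // reuse the extension filter from above
  .map((p) => resolve('./src', p));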

The MDN docs on iterators and generators cover the underlying mechanics in detail.