Node.js Interview Prep
Asynchronous Patterns

Promises & async/await -- Modern Async in Node.js

LinkedIn Hook

"Your Node.js script takes 30 seconds to fetch 10 URLs. It should take 3."

Here is the bug I see in production code at least once a week:

for (const url of urls) {
  const data = await fetch(url);   // waits for each one!
  results.push(data);
}

That await inside a for loop runs every request sequentially. Ten 300ms requests become a 3-second wall-clock delay -- when they could all fly in parallel and finish in 300ms total.

The fix is one line: await Promise.all(urls.map(url => fetch(url))). (Passing fetch directly to .map would also forward the array index as its second argument.) But the deeper lesson is understanding when to await serially (when each step depends on the previous) and when to fan out in parallel (when the steps are independent).

Node.js gives you four combinators -- Promise.all, Promise.allSettled, Promise.race, Promise.any -- and each one solves a different problem. Knowing which to reach for is the difference between a slow API and a fast one.

In Lesson 5.2, I walk through promise-based core APIs (fs.promises, dns.promises, timers/promises), async/await error handling, util.promisify, and the parallel-vs-sequential decision tree.

Read the full lesson -> [link]

#NodeJS #JavaScript #AsyncAwait #Promises #BackendDevelopment #InterviewPrep




What You'll Learn

  • How promise-based core APIs (fs.promises, dns.promises, timers/promises) replace callback versions
  • async/await syntax and how it desugars to .then() chains
  • The difference between sequential await and parallel Promise.all
  • When to use Promise.all vs Promise.allSettled vs Promise.race vs Promise.any
  • The most common performance mistake: awaiting inside a for loop
  • Error handling with try/catch around await
  • Why unhandled promise rejections crash modern Node.js
  • How util.promisify converts legacy callback APIs to promises

The Restaurant Kitchen Analogy -- Serial vs Parallel Cooking

Imagine you order a burger, fries, and a milkshake at a restaurant. There are two ways the kitchen can prepare your meal.

Serial kitchen (one cook, one task at a time): The cook grills the burger for 5 minutes. Then starts the fries -- 4 minutes. Then makes the milkshake -- 2 minutes. Total: 11 minutes. You stare at an empty table.

Parallel kitchen (one cook, three burners): The cook puts the burger on the grill, drops the fries in the fryer, and starts the milkshake blender -- all within seconds of each other. They cook at the same time. Total: 5 minutes (the longest single task). You eat much sooner.

The cook is your Node.js process's single-threaded event loop. The grill, fryer, and blender are I/O operations -- file reads, HTTP requests, database queries -- that the operating system handles in the background. The cook does not have to stand in front of the grill the whole time: they start the task and walk away.

await in a for loop is the serial kitchen. You start the burger, wait until it is done, then start the fries. The fryer sits idle. The blender sits idle. You waste the parallelism that the operating system is offering you for free.

Promise.all is the parallel kitchen. You start all three tasks immediately and wait for the slowest one to finish. The total time becomes the time of the longest task, not the sum of all tasks.

+---------------------------------------------------------------+
|           SERIAL await IN A LOOP (slow)                       |
+---------------------------------------------------------------+
|                                                                |
|  Task A  |##########|                                          |
|  Task B              |########|                                |
|  Task C                       |######|                         |
|                                                                |
|  Total wall time: A + B + C  =  24 units                       |
|                                                                |
+---------------------------------------------------------------+

+---------------------------------------------------------------+
|           PARALLEL Promise.all (fast)                         |
+---------------------------------------------------------------+
|                                                                |
|  Task A  |##########|                                          |
|  Task B  |########|                                            |
|  Task C  |######|                                              |
|                                                                |
|  Total wall time: max(A, B, C)  =  10 units                    |
|                                                                |
+---------------------------------------------------------------+

Napkin AI Visual Prompt: "Dark gradient (#0a1a0a -> #0d2e16). Two horizontal Gantt-style timelines stacked vertically. TOP timeline labeled 'await in for loop' shows three red bars placed end-to-end summing to 24 units. BOTTOM timeline labeled 'Promise.all' shows three green bars stacked at the same start time, the longest being 10 units. A vertical amber line marks the 10-unit point. Title: '14 units wasted'. White monospace labels."


Promise-Based Core APIs -- The Modern Way

Every Node.js core module that does I/O ships a promise-based version. These live alongside the callback versions and return promises directly, so you can await them without writing wrappers.

fs.promises -- Filesystem Without Callbacks

// fs-read.js
// Import the promise-based fs API directly.
// Note: 'node:fs/promises' is the modern, recommended import path.
const fs = require('node:fs/promises');

async function readConfig() {
  try {
    // await pauses this function until the file is read.
    // The event loop keeps running other work in the meantime.
    const data = await fs.readFile('./config.json', 'utf8');

    // JSON.parse is synchronous -- but the slow part (disk I/O) was async.
    const config = JSON.parse(data);

    console.log('Loaded config:', config);
    return config;
  } catch (err) {
    // A single try/catch handles both the file read failure
    // AND the JSON parse failure -- much cleaner than nested callbacks.
    console.error('Failed to load config:', err.message);
    throw err;
  }
}

readConfig();

Compare this to the callback version from Lesson 5.1: no nested indentation, no if (err) at every step, errors propagate up the call stack like normal exceptions.

dns.promises and timers/promises

// dns-and-timers.js
const dns = require('node:dns/promises');
const { setTimeout: sleep } = require('node:timers/promises');

async function lookupWithDelay() {
  // Resolve the A record for a hostname -- returns a promise.
  const addresses = await dns.resolve4('nodejs.org');
  console.log('Found IPs:', addresses);

  // timers/promises gives you a promise-returning setTimeout.
  // No more wrapping setTimeout in `new Promise(resolve => ...)`.
  console.log('Sleeping for 1 second...');
  await sleep(1000);

  console.log('Done sleeping.');
}

lookupWithDelay();

The timers/promises module is one of the most underused gems in Node.js. Before it existed, every project had its own sleep helper. Now you import it from core.
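For contrast, here is the hand-rolled helper that every project used to reinvent -- wrapping setTimeout in a promise manually:

```javascript
// manual-sleep.js
// The pre-timers/promises pattern: wrap setTimeout yourself.
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

sleep(50).then(() => console.log('awake after 50ms'));
```

One copy of this boilerplate per project adds up; require('node:timers/promises') deletes it.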


async/await -- Sugar Over Promises

async and await are syntactic sugar over .then() chains. An async function always returns a promise. await unwraps a promise inside an async function and pauses execution until it settles.

// async-basics.js
const fs = require('node:fs/promises');

// These two functions are functionally equivalent.

// Version 1: explicit .then() chain
function loadUserChain(id) {
  return fs.readFile(`./users/${id}.json`, 'utf8')
    .then(text => JSON.parse(text))
    .then(user => {
      console.log('User:', user.name);
      return user;
    })
    .catch(err => {
      console.error('Load failed:', err.message);
      throw err;
    });
}

// Version 2: async/await
async function loadUserAsync(id) {
  try {
    const text = await fs.readFile(`./users/${id}.json`, 'utf8');
    const user = JSON.parse(text);
    console.log('User:', user.name);
    return user;
  } catch (err) {
    console.error('Load failed:', err.message);
    throw err;
  }
}

The async/await version reads top-to-bottom like synchronous code, but the runtime behavior is identical. Under the hood, the JavaScript engine transforms each await into a .then() continuation.


The Big Mistake -- Serial Awaits in a Loop

This is the single most common performance bug in Node.js code. It looks correct. It produces the right output. It runs 10x slower than it should.

// serial-vs-parallel.js
const fs = require('node:fs/promises');

const files = ['a.json', 'b.json', 'c.json', 'd.json', 'e.json'];

// BAD: each readFile waits for the previous one to finish.
// If each read takes 100ms, this takes 500ms total.
async function loadSerial() {
  console.time('serial');
  const results = [];
  for (const file of files) {
    // The await here pauses the loop.
    // Disk reads happen one after another, not in parallel.
    const text = await fs.readFile(file, 'utf8');
    results.push(JSON.parse(text));
  }
  console.timeEnd('serial');  // ~500ms
  return results;
}

// GOOD: start all reads immediately, then wait for all of them together.
// Total time = the slowest single read, not the sum.
async function loadParallel() {
  console.time('parallel');
  // Step 1: kick off every readFile RIGHT NOW.
  // .map() returns an array of promises that are already running.
  const promises = files.map(file => fs.readFile(file, 'utf8'));

  // Step 2: wait for all of them to settle, then parse each result.
  const texts = await Promise.all(promises);
  const results = texts.map(text => JSON.parse(text));

  console.timeEnd('parallel');  // ~100ms
  return results;
}

loadSerial().then(() => loadParallel());

When should you actually use serial awaits? Only when each step depends on the result of the previous step. For example: fetching a user, then using the user's accountId to fetch their account, then using the account to fetch transactions. Each call needs the previous answer, so you cannot parallelize.

If the calls are independent -- like reading 5 unrelated files -- always parallelize.
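The dependent case can be sketched like this -- getUser, getAccount, and getTransactions are hypothetical stand-ins for real database or API calls:

```javascript
// dependent-chain.js
// Hypothetical async lookups standing in for real API calls.
async function getUser(id) {
  return { id, accountId: `acct-${id}` };
}
async function getAccount(accountId) {
  return { accountId, balance: 100 };
}
async function getTransactions(account) {
  return [{ accountId: account.accountId, amount: -20 }];
}

async function loadTransactions(userId) {
  // Each step needs the previous step's result,
  // so serial awaits are correct here -- there is nothing to parallelize.
  const user = await getUser(userId);
  const account = await getAccount(user.accountId);
  return getTransactions(account);
}

loadTransactions(42).then(txns => console.log(txns.length, 'transactions'));
```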


Promise.allSettled -- When Some Failures Are OK

Promise.all is fail-fast: if any promise rejects, the whole Promise.all rejects immediately and you lose the results of the ones that succeeded. Sometimes that is wrong. Sometimes you want every result, success or failure.

// allsettled-example.js
// Imagine fetching 5 third-party APIs. If one is down, you still
// want the data from the other 4 -- not a total failure.

async function fetchAllStatuses(urls) {
  // allSettled NEVER rejects. It returns one entry per input promise:
  //   { status: 'fulfilled', value: ... }   on success
  //   { status: 'rejected',  reason: ... }  on failure
  const results = await Promise.allSettled(
    urls.map(url => fetch(url).then(r => r.json()))
  );

  const successes = [];
  const failures = [];

  results.forEach((result, i) => {
    if (result.status === 'fulfilled') {
      successes.push({ url: urls[i], data: result.value });
    } else {
      failures.push({ url: urls[i], error: result.reason.message });
    }
  });

  console.log(`Got ${successes.length} successes, ${failures.length} failures`);
  return { successes, failures };
}

const apis = [
  'https://api.example.com/users',
  'https://api.example.com/orders',
  'https://api.broken.invalid/data',  // this one will fail
];

fetchAllStatuses(apis);

Use Promise.all when all-or-nothing is the right semantic (e.g. a database transaction). Use Promise.allSettled when partial success is acceptable (e.g. fetching widgets for a dashboard).


Promise.race -- Timeouts Made Easy

Promise.race settles as soon as the first promise settles, whether it fulfills or rejects. The classic use case is adding a timeout to an operation that does not natively support one.

// race-timeout.js
const { setTimeout: sleep } = require('node:timers/promises');

// Helper that rejects after `ms` milliseconds.
function timeout(ms, message = 'Operation timed out') {
  return sleep(ms).then(() => {
    throw new Error(message);
  });
}

// Wrap any promise with a timeout using Promise.race.
async function withTimeout(promise, ms) {
  // Whichever settles first wins:
  //  - the real work fulfills -> we get its value
  //  - the timeout fires      -> we throw a Timeout error
  return Promise.race([promise, timeout(ms)]);
}

async function slowFetch() {
  // Pretend this is a real network call that sometimes hangs.
  await sleep(5000);
  return { data: 'finally arrived' };
}

async function main() {
  try {
    // Give slowFetch only 1 second to complete.
    const result = await withTimeout(slowFetch(), 1000);
    console.log('Got:', result);
  } catch (err) {
    console.error('Failed:', err.message);  // "Operation timed out"
  }
}

main();

A close cousin is Promise.any, which settles on the first fulfillment (ignoring rejections) and only rejects if every input rejects. Use Promise.any when you have multiple sources for the same data and you want the fastest successful response -- for example, racing three CDN mirrors.


util.promisify -- Modernizing Legacy Callbacks

Not every API in the Node.js ecosystem has a promise-based version. Older libraries and your own legacy code may still use the (err, result) callback convention. util.promisify converts them in one line.

// promisify-example.js
const util = require('node:util');
const fs = require('node:fs');
const dns = require('node:dns');

// Convert a callback-style function into a promise-returning function.
// Requirement: the callback must follow the (err, result) convention.
const readFileAsync = util.promisify(fs.readFile);
const lookupAsync = util.promisify(dns.lookup);

async function main() {
  // Now you can await these, even though fs.readFile and dns.lookup
  // are originally callback-based.
  const data = await readFileAsync('./package.json', 'utf8');
  console.log('package.json length:', data.length);

  const { address } = await lookupAsync('nodejs.org');
  console.log('nodejs.org ->', address);
}

// Custom callback functions work too:
function legacyApi(input, callback) {
  setTimeout(() => {
    if (!input) return callback(new Error('input required'));
    callback(null, input.toUpperCase());
  }, 100);
}

const legacyAsync = util.promisify(legacyApi);

legacyAsync('hello').then(result => console.log(result));  // "HELLO"

main();

For modules that follow the standard (err, result) callback signature, promisify is a one-line upgrade. For modules with non-standard callbacks (multiple results, success-first instead of error-first), you need to write a manual wrapper around new Promise().
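For a non-standard signature, the manual wrapper looks like this -- legacySuccessFirst is an invented example of a success-first callback that util.promisify would mishandle:

```javascript
// manual-wrap.js
// Invented legacy API with a SUCCESS-FIRST callback: (result, err).
// util.promisify assumes (err, result) and would get this backwards.
function legacySuccessFirst(input, callback) {
  setTimeout(() => {
    if (!input) return callback(null, new Error('input required'));
    callback(input.toUpperCase(), null);
  }, 10);
}

// Manual wrapper: map the non-standard callback onto resolve/reject.
function wrapped(input) {
  return new Promise((resolve, reject) => {
    legacySuccessFirst(input, (result, err) => {
      if (err) return reject(err);
      resolve(result);
    });
  });
}

wrapped('hello').then(out => console.log(out));  // "HELLO"
```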


Error Handling and Unhandled Rejections

Errors thrown inside an async function become rejected promises. You handle them with try/catch -- the same way you handle synchronous exceptions.

// error-handling.js
const fs = require('node:fs/promises');

async function loadAndParse(path) {
  try {
    const text = await fs.readFile(path, 'utf8');
    return JSON.parse(text);  // can throw SyntaxError
  } catch (err) {
    // ENOENT (file missing), EACCES (permission), or SyntaxError (bad JSON)
    // all land here. You can branch on err.code if you need to.
    if (err.code === 'ENOENT') {
      console.warn(`Config not found at ${path}, using defaults.`);
      return { defaults: true };
    }
    // Re-throw anything we cannot handle so the caller knows.
    throw err;
  }
}

// THE DANGER: forgetting to handle a rejection.
// In modern Node.js (>= 15), an unhandled promise rejection
// crashes the entire process by default.
async function dangerous() {
  // The promise is dropped: no await, no .catch(), no try/catch.
  // Its rejection becomes an unhandled rejection -> crash.
  fs.readFile('./does-not-exist.json');
}

// Safety net: a global handler for anything that slipped through.
process.on('unhandledRejection', (reason, promise) => {
  console.error('Unhandled rejection:', reason);
  // In production: log to your error tracker, then exit gracefully.
  process.exit(1);
});

loadAndParse('./config.json').catch(err => {
  console.error('Fatal:', err);
  process.exit(1);
});

Two rules: (1) every top-level promise must end with either an await inside a try/catch or a .catch() handler. (2) Treat process.on('unhandledRejection') as a last-resort safety net for logging, not as your primary error handling.


Common Mistakes

1. await inside a for loop when the iterations are independent. This is the headline bug. If await fetchUser(id) does not depend on the previous iteration's result, you are paying a serial cost for nothing. Replace the loop with await Promise.all(ids.map(fetchUser)) and watch the wall-clock time collapse. The only valid reason to await serially is when each step needs the previous step's output.

2. Forgetting to await a promise. Calling an async function without await returns a pending promise that runs in the background. Errors from it become unhandled rejections. The function appears to "work" because the next line runs immediately, but you have lost the result and any error. Add the await -- or explicitly .catch() if you really mean to fire-and-forget.

3. Using Promise.all when you actually need Promise.allSettled. Promise.all rejects on the first failure and discards every successful result. If you are fetching 10 widgets for a dashboard and one API is down, Promise.all gives you nothing. Promise.allSettled gives you 9 widgets and one error. Pick the one that matches your failure semantics.

4. Mixing callbacks and promises in the same flow. Calling a callback API from inside an async function without wrapping it loses the error path -- the callback's err argument never makes it back into the promise chain. Always promisify legacy APIs at the boundary, then use only awaits inside your async functions.

5. Catching too broadly and swallowing errors. A try/catch that wraps 50 lines and just console.logs the error hides bugs. Either handle the error meaningfully (retry, return a default, transform it) or let it propagate. Logging-and-continuing is almost always wrong.
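Mistake 2 in miniature -- a deliberate fire-and-forget made safe with an explicit .catch (auditLog is a hypothetical non-critical write):

```javascript
// fire-and-forget.js
// Hypothetical stand-in for a non-critical write (e.g. analytics).
async function auditLog(event) {
  if (!event) throw new Error('event required');
  return `logged: ${event}`;
}

async function handleRequest() {
  // BAD: dropped promise -- a rejection here would be unhandled.
  // auditLog(undefined);

  // OK: intentionally fire-and-forget, with the error path handled.
  auditLog('user.login').catch(err => console.error('audit failed:', err));

  return 'response sent';
}

handleRequest().then(r => console.log(r));  // "response sent"
```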


Interview Questions

1. "What is the difference between awaiting in a for loop and using Promise.all? When would each be correct?"

await inside a for loop runs each iteration sequentially -- iteration N+1 does not start until iteration N has fully resolved. Total wall time is the sum of all iterations. Promise.all starts every operation immediately and waits for all of them to settle in parallel. Total wall time is the time of the slowest single operation. The serial form is correct when each iteration depends on the previous one's result -- for example, paginating an API where the next request needs a cursor returned by the previous one. The parallel form is correct when the operations are independent -- reading multiple unrelated files, fetching multiple unrelated URLs, or running multiple unrelated database queries. The mistake is reaching for a for loop out of habit when independence allows parallelism, which is one of the most common Node.js performance bugs in the wild.

2. "What is the difference between Promise.all, Promise.allSettled, Promise.race, and Promise.any?"

Promise.all waits for every input to fulfill and rejects immediately on the first rejection -- all-or-nothing semantics. Promise.allSettled waits for every input to settle (fulfill or reject) and never rejects itself; it returns an array of {status, value} or {status, reason} objects so you can handle partial success. Promise.race settles -- fulfilled or rejected -- as soon as the first input settles, whatever its outcome; classic use is adding a timeout. Promise.any settles on the first fulfillment, ignoring rejections, and only rejects with an AggregateError if every input rejects; classic use is racing redundant data sources for the fastest success. Picking the right combinator is a clarity-of-thought question, not a performance one.

3. "How does async/await actually work under the hood? Is it different from promises?"

async/await is syntactic sugar over promises -- there is no separate runtime mechanism. An async function always returns a promise, regardless of what you return from it. The compiler transforms each await expression into the equivalent of a .then() callback that resumes the function with the awaited value. When the function hits an await, it suspends, control returns to the event loop, and other tasks run. When the awaited promise resolves, the function is resumed where it left off. Errors from rejected promises become thrown exceptions inside the function, which is why try/catch works around await. The semantics are identical to a .then() chain; only the syntax is friendlier.

4. "What happens when a promise rejects and nothing handles it? How has this changed across Node.js versions?"

In Node.js 14 and earlier, an unhandled promise rejection only printed a warning and the process kept running -- a notorious source of silent bugs. Starting with Node.js 15, the default --unhandled-rejections mode changed to throw, which crashes the process on any unhandled rejection (after emitting the unhandledRejection event). This is now the recommended behavior because a rejected promise represents an unrecoverable error in your code -- swallowing it leaves the process in an inconsistent state. You should always end every top-level promise chain with await inside try/catch, or with an explicit .catch() handler. The process.on('unhandledRejection') event should be a last-resort logging hook, not your primary error strategy.

5. "When and why would you use util.promisify instead of writing a manual promise wrapper?"

util.promisify converts any function that follows the Node.js (err, result) callback convention into a promise-returning function. Its strengths are conciseness (one line vs ten), correctness (it handles edge cases like multiple callback invocations), and the special util.promisify.custom symbol that lets module authors provide their own promise version. Use it for any standard callback-style API -- it is how the older parts of the Node.js core were designed to be modernized before fs.promises and friends existed. You need a manual new Promise() wrapper only when the callback signature is non-standard: success-first callbacks, multiple result arguments, EventEmitter-based APIs, or callbacks that fire more than once. For the common case, util.promisify is faster to write, easier to read, and harder to get wrong.


Quick Reference -- Promise Combinators Cheat Sheet

+---------------------------------------------------------------+
|           PROMISE COMBINATORS CHEAT SHEET                     |
+---------------------------------------------------------------+
|                                                                |
|  Promise.all([p1, p2, p3])                                     |
|    Fulfills: when ALL fulfill -> [v1, v2, v3]                  |
|    Rejects:  on FIRST rejection (fail-fast)                    |
|    Use for:  parallel work where any failure = total failure  |
|                                                                |
|  Promise.allSettled([p1, p2, p3])                              |
|    Fulfills: when ALL settle (always fulfills)                 |
|    Returns:  [{status, value|reason}, ...]                     |
|    Use for:  partial success (dashboards, batch jobs)          |
|                                                                |
|  Promise.race([p1, p2, p3])                                    |
|    Settles:  on FIRST settlement (fulfill OR reject)           |
|    Use for:  timeouts, cancellation, "first one wins"          |
|                                                                |
|  Promise.any([p1, p2, p3])                                     |
|    Fulfills: on FIRST fulfillment (ignores rejections)         |
|    Rejects:  only if ALL reject (AggregateError)               |
|    Use for:  redundant sources, fastest mirror wins            |
|                                                                |
+---------------------------------------------------------------+

+---------------------------------------------------------------+
|           SERIAL vs PARALLEL DECISION                         |
+---------------------------------------------------------------+
|                                                                |
|  Does iteration N+1 need data from iteration N?                |
|                                                                |
|    YES -> serial: for (const x of xs) await step(x)            |
|    NO  -> parallel: await Promise.all(xs.map(step))            |
|                                                                |
|  When in doubt, ask: "could these run at the same time         |
|  without affecting each other?" If yes, parallelize.           |
|                                                                |
+---------------------------------------------------------------+

+---------------------------------------------------------------+
|           PROMISE-BASED CORE MODULES                          |
+---------------------------------------------------------------+
|                                                                |
|  require('node:fs/promises')        -> readFile, writeFile... |
|  require('node:dns/promises')       -> resolve4, lookup...    |
|  require('node:timers/promises')    -> setTimeout, setImmed.. |
|  require('node:stream/promises')    -> pipeline, finished     |
|  require('node:readline/promises')  -> question (interactive) |
|                                                                |
|  Legacy callback API? -> util.promisify(fn)                    |
|                                                                |
+---------------------------------------------------------------+
Combinator          Settles When     Returns                              Rejects When
Promise.all         All fulfill      Array of values                      First rejection
Promise.allSettled  All settle       Array of {status, value/reason}      Never
Promise.race        First settles    First value, or throws first reason  First rejection
Promise.any         First fulfills   First value                          All rejected (AggregateError)

Prev: Lesson 5.1 -- The Callback Pattern | Next: Lesson 5.3 -- Error Handling Strategy


This is Lesson 5.2 of the Node.js Interview Prep Course -- 10 chapters, 42 lessons.
