JavaScript Interview Prep
Functions Deep Dive

Memoization

Cache Your Answers, Never Recompute

LinkedIn Hook

Your recursive fibonacci recalculates the same values thousands of times.

Your "find user by id" helper hits the API for the same id on every render.

Your data transformation runs the same heavy computation on the same input, over and over, across components.

Every one of these is the same problem — and it has the same fix. Memoization. Compute once, remember the answer, return it instantly next time.

Memoized fibonacci(40) is roughly 12,000x faster than the naive version. Not "a bit faster" — twelve thousand times. And the code that gets you there is about 8 lines.

In this lesson I build a generic memoize utility from scratch, cover WeakMap-based caching for object arguments, and walk through TTL, LRU, and manual invalidation — because caching forever is a bug, not a feature.

Read the full lesson -> [link]

#JavaScript #InterviewPrep #Memoization #Performance #Caching #Frontend #WebDevelopment




What You'll Learn

  • What memoization is and the four conditions where it actually helps
  • How to implement a generic memoize utility with multi-argument support and a custom resolver
  • When to use WeakMap over Map — and why it matters for memory
  • Cache invalidation strategies: TTL, max-size (LRU), and manual clear

The Student-with-a-Notebook Analogy

Think of memoization like a student who writes down answers to homework problems. The first time they solve fibonacci(40), it takes work. But they write the answer in their notebook. Next time the teacher asks for fibonacci(40) — they just look it up. Instant.


What is Memoization?

Memoization is a technique where you cache the results of expensive function calls and return the cached result when the same inputs occur again.

// Without memoization — recalculates every time
function square(n) {
  console.log("Computing...");
  return n * n;
}

square(4); // "Computing..." -> 16
square(4); // "Computing..." -> 16 (computed again!)
square(4); // "Computing..." -> 16 (and again!)

// With memoization — computes once, caches forever
const memoizedSquare = memoize(square);

memoizedSquare(4); // "Computing..." -> 16
memoizedSquare(4); // 16 (from cache — instant!)
memoizedSquare(4); // 16 (from cache — instant!)

Implement a Generic Memoize Utility

// Basic memoize — single argument
function memoize(fn) {
  const cache = new Map();

  return function (arg) {
    if (cache.has(arg)) {
      return cache.get(arg);
    }

    const result = fn.call(this, arg);
    cache.set(arg, result);
    return result;
  };
}

But real-world functions often take multiple arguments. We need a better solution:

// Generic memoize — handles multiple arguments
function memoize(fn, resolver) {
  const cache = new Map();

  function memoized(...args) {
    // Create a cache key — use custom resolver or default to JSON
    const key = resolver ? resolver(...args) : JSON.stringify(args);

    if (cache.has(key)) {
      return cache.get(key);
    }

    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  }

  // Expose cache for debugging/clearing
  memoized.cache = cache;

  return memoized;
}

// Usage with multiple args
const add = memoize((a, b) => {
  console.log("Computing...");
  return a + b;
});

add(1, 2); // "Computing..." -> 3
add(1, 2); // 3 (cached!)
add(2, 3); // "Computing..." -> 5

// Usage with custom resolver
const getUserData = memoize(
  (id, options) => fetch(`/api/users/${id}`),
  (id, options) => id // only cache by id, ignore options
);
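One subtlety worth tracing: with a resolver keyed only on id, calls that differ only in options hit the same cache entry, and the later options objects never reach the function. A quick standalone check (the generic memoize is repeated so the snippet runs on its own; `getConfig` and the counter are illustrative):

```javascript
// Generic memoize with optional resolver, as above
function memoize(fn, resolver) {
  const cache = new Map();
  return function memoized(...args) {
    const key = resolver ? resolver(...args) : JSON.stringify(args);
    if (cache.has(key)) return cache.get(key);
    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  };
}

let calls = 0;
const getConfig = memoize(
  (id, options) => {
    calls++;
    return { id, verbose: options?.verbose ?? false };
  },
  (id) => id // cache key ignores options entirely
);

getConfig(1, { verbose: true });  // computed
getConfig(1, { verbose: false }); // cache hit: the verbose:false variant is never computed
console.log(calls); // 1
```

This is exactly the trade-off a custom resolver buys you: cheaper, coarser keys at the cost of treating some distinct inputs as identical.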

Memoized Fibonacci vs Normal Fibonacci

This is the classic example that shows the dramatic performance difference:

// Normal recursive fibonacci — O(2^n) time
function fibonacci(n) {
  if (n <= 1) return n;
  return fibonacci(n - 1) + fibonacci(n - 2);
}

// fibonacci(40) makes ~300 million recursive calls!
console.time("Normal");
console.log(fibonacci(40)); // 102334155
console.timeEnd("Normal"); // ~1200ms

// Memoized fibonacci — O(n) time
function fibonacciMemo(n, memo = {}) {
  if (n in memo) return memo[n];
  if (n <= 1) return n;

  memo[n] = fibonacciMemo(n - 1, memo) + fibonacciMemo(n - 2, memo);
  return memo[n];
}

console.time("Memoized");
console.log(fibonacciMemo(40)); // 102334155
console.timeEnd("Memoized"); // ~0.1ms

That's roughly a 12,000x speedup for n=40, and the gap widens exponentially as n grows.

Now using our generic memoize utility:

// Using the generic memoize with fibonacci
const fib = memoize(function (n) {
  if (n <= 1) return n;
  return fib(n - 1) + fib(n - 2);
});

console.log(fib(100)); // 354224848179262000000
// Note: fib(100) exceeds Number.MAX_SAFE_INTEGER, so the value is imprecise; use BigInt for exact results.
// Without memoization, this call would take thousands of years.



WeakMap-Based Memoization for Object Arguments

JSON.stringify is slow for large objects and throws on circular references. For object arguments, use WeakMap — it caches by object reference and allows garbage collection of unused entries:

// WeakMap-based memoize for object arguments
function memoizeWeak(fn) {
  const cache = new WeakMap();

  return function (obj) {
    if (cache.has(obj)) {
      return cache.get(obj);
    }

    const result = fn.call(this, obj);
    cache.set(obj, result);
    return result;
  };
}

// Usage
const processUser = memoizeWeak((user) => {
  console.log("Processing...");
  return { ...user, processed: true, timestamp: Date.now() };
});

const user = { name: "Rakibul", id: 1 };

processUser(user); // "Processing..." -> { name: "Rakibul", ... }
processUser(user); // From cache — instant! (same object reference)

// But a NEW object with same data = cache miss
processUser({ name: "Rakibul", id: 1 }); // "Processing..." (different reference!)

Why WeakMap?

  • Keys must be objects (perfect for object-arg memoization)
  • Keys are held weakly — when the object is garbage collected, the cache entry is too
  • No memory leaks from indefinitely growing caches

Cache Invalidation Strategies

Caching forever isn't always the right move. Here are common strategies:

// 1. Time-based expiration (TTL)
function memoizeWithTTL(fn, ttl = 60000) {
  const cache = new Map();

  return function (...args) {
    const key = JSON.stringify(args);
    const cached = cache.get(key);

    if (cached && Date.now() - cached.timestamp < ttl) {
      return cached.value;
    }

    const result = fn.apply(this, args);
    cache.set(key, { value: result, timestamp: Date.now() });
    return result;
  };
}

// Cache API results for 5 minutes
const fetchUser = memoizeWithTTL(
  (id) => fetch(`/api/users/${id}`).then((r) => r.json()),
  5 * 60 * 1000
);

// 2. Max size (LRU-style)
function memoizeWithMaxSize(fn, maxSize = 100) {
  const cache = new Map();

  return function (...args) {
    const key = JSON.stringify(args);

    if (cache.has(key)) {
      // Move to end (most recently used)
      const value = cache.get(key);
      cache.delete(key);
      cache.set(key, value);
      return value;
    }

    const result = fn.apply(this, args);

    if (cache.size >= maxSize) {
      // Delete oldest entry (first key in Map)
      const firstKey = cache.keys().next().value;
      cache.delete(firstKey);
    }

    cache.set(key, result);
    return result;
  };
}

// 3. Manual invalidation
function memoizeWithClear(fn) {
  const cache = new Map();

  function memoized(...args) {
    const key = JSON.stringify(args);
    if (cache.has(key)) return cache.get(key);

    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  }

  memoized.clear = () => cache.clear();
  memoized.delete = (...args) => cache.delete(JSON.stringify(args));
  memoized.has = (...args) => cache.has(JSON.stringify(args));

  return memoized;
}
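The eviction order of the max-size variant is worth tracing once: a cache hit re-inserts the key at the back of the Map's insertion order, so the key evicted on overflow really is the least recently used one. A standalone trace (the memoize is repeated so the snippet runs on its own):

```javascript
// Max-size (LRU-style) memoize, as above
function memoizeWithMaxSize(fn, maxSize = 100) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args);
    if (cache.has(key)) {
      const value = cache.get(key);
      cache.delete(key); // re-insert to mark as most recently used
      cache.set(key, value);
      return value;
    }
    const result = fn.apply(this, args);
    if (cache.size >= maxSize) {
      cache.delete(cache.keys().next().value); // evict least recently used
    }
    cache.set(key, result);
    return result;
  };
}

let computes = 0;
const double = memoizeWithMaxSize((n) => { computes++; return n * 2; }, 2);

double(1); // compute, cache order: [1]
double(2); // compute, cache order: [1, 2]
double(1); // hit — 1 moves to the back, cache order: [2, 1]
double(3); // compute — cache full, evicts 2 (least recently used), cache order: [1, 3]
double(2); // compute again — 2 was evicted
console.log(computes); // 4
```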

Common Mistakes

  • Memoizing an impure function (one with side effects, randomness, or dependence on outside state). The cache will happily return a stale result and you'll spend hours debugging "but it worked the first time".
  • Using JSON.stringify(args) as the cache key when arguments contain functions, undefined, circular references, or large nested objects — either the serialization fails, collides, or becomes slower than the function you're trying to speed up.
  • Using a plain Map for object-keyed memoization in long-running apps. Nothing ever gets evicted, the cache grows forever, and you leak memory. Use WeakMap for object keys, or add TTL/max-size for primitive keys.
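The first mistake on this list, memoizing a function that reads outside state, is easy to reproduce. `locale` and `getGreeting` below are hypothetical names for illustration:

```javascript
let locale = "en";
const greetings = { en: "Hello", fr: "Bonjour" };

// Impure: depends on `locale`, which is not part of the arguments
function getGreeting(name) {
  return `${greetings[locale]}, ${name}!`;
}

// Naive memoize, keyed on arguments only
function memoize(fn) {
  const cache = new Map();
  return (...args) => {
    const key = JSON.stringify(args);
    if (!cache.has(key)) cache.set(key, fn(...args));
    return cache.get(key);
  };
}

const memoGreet = memoize(getGreeting);

console.log(memoGreet("Ada")); // "Hello, Ada!"
locale = "fr";
console.log(memoGreet("Ada")); // still "Hello, Ada!" — the locale change is invisible to the cache
```

The fix is to make the hidden dependency an explicit argument, e.g. `memoize((name, locale) => ...)`, or simply not memoize.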

Interview Questions

Q: What is memoization and when would you use it?

Memoization is an optimization technique that caches function results based on inputs. Use it when the function is pure (same inputs always produce same outputs), expensive (heavy computation or recursive calls), called repeatedly with the same arguments, and the range of inputs is bounded (otherwise cache grows unbounded).

Q: Implement memoize for a function that takes objects as arguments.

Use a WeakMap for object keys — it avoids memory leaks and is faster than serializing objects. See the memoizeWeak implementation above.

Q: What are the downsides of memoization?

  • Memory usage — every unique input stores a result in memory.
  • Stale data — cached results can become outdated.
  • Cache key generation — complex arguments (objects, functions) are hard to serialize.
  • Only works with pure functions — memoizing impure functions (side effects, randomness) gives wrong results.

Q: Can you memoize an async function?

Yes, but you cache the Promise itself, not the resolved value:

function memoizeAsync(fn) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args);
    if (cache.has(key)) return cache.get(key);
    const promise = fn.apply(this, args).catch((err) => {
      cache.delete(key); // remove failed promises from cache
      throw err;
    });
    cache.set(key, promise);
    return promise;
  };
}
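A useful side effect of caching the Promise rather than the value is in-flight deduplication: a second call issued before the first resolves gets the very same Promise, so only one request goes out. A standalone sketch with a fake async fetch (`fakeFetchUser` and the counter are illustrative):

```javascript
// Async memoize, as above: caches the Promise, evicts on rejection
function memoizeAsync(fn) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args);
    if (cache.has(key)) return cache.get(key);
    const promise = fn.apply(this, args).catch((err) => {
      cache.delete(key); // don't cache rejections
      throw err;
    });
    cache.set(key, promise);
    return promise;
  };
}

let requests = 0;
const fakeFetchUser = (id) =>
  new Promise((resolve) => {
    requests++;
    setTimeout(() => resolve({ id, name: `user-${id}` }), 10);
  });

const getUser = memoizeAsync(fakeFetchUser);

// Both calls start before the first resolves — they share one in-flight Promise
const p1 = getUser(1);
const p2 = getUser(1);
console.log(p1 === p2); // true

Promise.all([p1, p2]).then(() => {
  console.log(requests); // 1 — a single round trip served both callers
});
```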

Q: What happens if you memoize an impure function?

You get stale/wrong results because memoization assumes same inputs = same outputs.

Q: What's the time complexity of memoized fibonacci?

O(n) time, O(n) space — each value is computed exactly once.
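The "computed exactly once" claim can be checked directly by counting the work each version does; the counters below are added purely for illustration:

```javascript
// Count how much work each version actually does, for n = 25
let naiveCalls = 0;
function fib(n) {
  naiveCalls++;
  if (n <= 1) return n;
  return fib(n - 1) + fib(n - 2);
}

let memoComputes = 0;
function fibMemo(n, memo = {}) {
  if (n in memo) return memo[n];
  if (n <= 1) return n;
  memoComputes++; // counts only real computations, not cache hits
  memo[n] = fibMemo(n - 1, memo) + fibMemo(n - 2, memo);
  return memo[n];
}

console.log(fib(25));      // 75025
console.log(naiveCalls);   // 242785 — the exponential blow-up
console.log(fibMemo(25));  // 75025
console.log(memoComputes); // 24 — each value from 2..25 computed exactly once
```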

Q: What's the difference between Map and WeakMap for memoization?

Map holds strong references — the cache grows forever unless you clean it up. WeakMap holds weak references to object keys — entries are garbage collected automatically when the key object is no longer referenced elsewhere. WeakMap keys must be objects.


Quick Reference — Cheat Sheet

MEMOIZATION — QUICK MAP

  Input -> [Cache check] -> Hit?  -> Return cached
                             \
                              Miss -> Compute -> Store -> Return

Cache backends:
  Map       -> any key type, strong refs, manual cleanup
  WeakMap   -> object keys only, weak refs, auto GC
  TTL       -> time-based expiry, good for API data
  LRU       -> size-limited, evicts least recently used

Key rules:
  - Function MUST be pure (no side effects, no randomness)
  - Same inputs MUST yield same output
  - Input range SHOULD be bounded, or add TTL/LRU
  - Primitive args   -> JSON.stringify(args) or join key
  - Object args      -> WeakMap keyed on object reference
  - Async functions  -> cache the Promise, not the value

Previous: Debouncing & Throttling -> Rate-Limiting Your Event Handlers
Next: Generator Functions -> Pausable Functions with yield


This is Lesson 6.8 of the JavaScript Interview Prep Course — 14 chapters, 87 lessons.
