
Cache API & Service Worker Strategies

Advanced · 22 min read

Why the Network is the Enemy

Your app works great on your MacBook Pro with fiber internet. Then a user opens it on a train going through a tunnel. Or in a building with one bar of signal. Or on a flight. The network is the single most unreliable dependency your app has — and without a strategy for handling that, your app is useless the moment connectivity drops.

Service workers sit between your app and the network. They intercept every fetch request and let you decide: should this come from the network, from a local cache, or both? The Cache API is the storage mechanism purpose-built for this — it stores complete HTTP request/response pairs and matches them by URL.

Together, they are how you build apps that work offline, load instantly on repeat visits, and degrade gracefully when the network is flaky.

Mental Model

Think of a service worker as a programmable proxy server that lives inside the browser. Every network request your app makes passes through this proxy first. You write the rules: "If the user asks for this page and I have a cached copy, serve the cache instantly. If not, fetch from the network and save a copy for next time." The Cache API is the proxy's local storage — a filing cabinet of HTTP responses, organized by URL, that the proxy checks before forwarding requests to the real network.

The Service Worker Lifecycle

Understanding the lifecycle is non-negotiable. Service workers do not install and activate instantly — they go through a precise sequence, and getting this wrong causes bugs that are infuriatingly hard to debug.

Registration

if ("serviceWorker" in navigator) {
  navigator.serviceWorker.register("/sw.js", { scope: "/" })
    .then(reg => console.log("SW registered, scope:", reg.scope))
    .catch(err => console.error("SW registration failed:", err));
}

The scope controls which pages the service worker can intercept. A service worker at /sw.js with scope / controls all pages on the origin. A service worker at /app/sw.js with default scope only controls pages under /app/.

Installation: Precaching

The install event is your chance to cache everything the app needs to work offline. This is called precaching — storing critical resources before the user needs them.

const CACHE_NAME = "app-v3";
const PRECACHE_URLS = [
  "/",
  "/index.html",
  "/styles.css",
  "/app.js",
  "/offline.html",
];

self.addEventListener("install", (event) => {
  event.waitUntil(
    caches.open(CACHE_NAME)
      .then(cache => cache.addAll(PRECACHE_URLS))
      .then(() => self.skipWaiting())
  );
});

event.waitUntil() tells the browser "do not finish installation until this promise resolves." If any precached resource fails to download, the entire installation fails — the old service worker stays active.

self.skipWaiting() activates the new service worker immediately instead of waiting for all tabs to close. Use it when you want instant updates, but be careful — it means the new service worker may serve pages that were loaded with the old version's assets.

Activation: Cache Cleanup

The activate event is where you delete old caches. Without this, cache versions accumulate and eat storage quota.

self.addEventListener("activate", (event) => {
  event.waitUntil(
    caches.keys()
      .then(names => Promise.all(
        names
          .filter(name => name !== CACHE_NAME)
          .map(name => caches.delete(name))
      ))
      .then(() => self.clients.claim())
  );
});

self.clients.claim() makes the newly activated service worker take control of all open tabs immediately, instead of waiting for the next navigation.
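When you combine skipWaiting() with clients.claim(), the page that is already open can end up controlled by a new service worker mid-session. A common page-side counterpart is to reload once when the controller changes, so the HTML and its assets stay consistent. A minimal sketch — setupUpdateReload is an illustrative name, not a platform API:

```javascript
// Page-side sketch (this goes in your app code, not in sw.js): reload
// exactly once when a new service worker takes control of the page.
function setupUpdateReload() {
  let refreshing = false; // guard against reload loops
  navigator.serviceWorker.addEventListener("controllerchange", () => {
    if (refreshing) return;
    refreshing = true;
    window.location.reload();
  });
}

// Only wire it up in a browser that supports service workers.
if (typeof navigator !== "undefined" && "serviceWorker" in navigator) {
  setupUpdateReload();
}
```

The `refreshing` flag matters: clients.claim() and a subsequent update can each fire controllerchange, and without the guard you can trigger a reload loop.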

Quiz
A user has your app open in two tabs. You deploy a new service worker. What happens if the new SW does NOT call skipWaiting()?

The Cache API

The Cache API is a simple key-value store where keys are Request objects (or URL strings) and values are Response objects.

const cache = await caches.open("api-data-v1");

// Store a response
await cache.put("/api/products", new Response(
  JSON.stringify(products),
  { headers: { "Content-Type": "application/json" } }
));

// Retrieve a response
const response = await cache.match("/api/products");
if (response) {
  const data = await response.json();
}

// Store by fetching
await cache.add("/api/categories"); // fetches and caches in one call

// Delete
await cache.delete("/api/products");

Cache Matching

cache.match() matches by URL by default. You can control matching with options:

// Ignore query string — /api/data?v=1 and /api/data?v=2 match the same entry
const response = await cache.match(url, { ignoreSearch: true });

// Ignore HTTP method — the Cache API only stores GET responses, so this
// lets a request with another method (e.g. POST) match a cached GET entry
const response = await cache.match(request, { ignoreMethod: true });

// Search all caches, not just one
const response = await caches.match(url); // searches every named cache
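Conceptually, ignoreSearch compares cache keys with the query string stripped. A runnable sketch of that normalization — stripSearch is a hypothetical helper for illustration, not part of the Cache API:

```javascript
// Normalize a URL the way ignoreSearch does for matching: drop the query
// string and compare only the path. The base URL handles relative inputs.
function stripSearch(url) {
  const u = new URL(url, "https://example.com");
  u.search = "";
  return u.pathname;
}

// Both of these normalize to the same key, "/api/products":
stripSearch("/api/products?page=1");
stripSearch("/api/products?page=2");
```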

Quiz
You cache /api/products?page=1. Later you call cache.match('/api/products?page=2'). Does it return a result?

Caching Strategies

This is where strategy meets implementation. Each pattern has trade-offs between freshness, speed, and offline capability.

1. Cache First (Cache Falling Back to Network)

Check the cache first. If the resource is cached, serve it immediately. If not, fetch from the network and cache the response.

Best for: Static assets (CSS, JS, images, fonts) that change infrequently and are versioned by filename.

self.addEventListener("fetch", (event) => {
  event.respondWith(
    caches.match(event.request)
      .then(cached => {
        if (cached) return cached;

        return fetch(event.request).then(response => {
          if (!response || response.status !== 200 || response.type !== "basic") {
            return response;
          }

          const clone = response.clone();
          caches.open(CACHE_NAME).then(cache => cache.put(event.request, clone));
          return response;
        });
      })
  );
});

Why clone? A Response body can only be consumed once. If you pass the response to the cache, you cannot also return it to the page. response.clone() creates a copy — one for the cache, one for the page.
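You can see the one-shot body behavior directly — this demonstration runs in Node 18+ or any browser, where Response is a global:

```javascript
// A Response body is a one-shot stream: clone BEFORE the first read.
async function demo() {
  const original = new Response("hello");
  const copy = original.clone(); // must happen before the body is consumed

  const fromOriginal = await original.text(); // consumes the original body
  // A second original.text() here would reject — the body is already used.
  const fromCopy = await copy.text(); // the clone has its own independent body

  return { fromOriginal, fromCopy, bodyUsed: original.bodyUsed };
}
```

After the first read, `original.bodyUsed` is true and the original can no longer be read, but the clone still can — which is exactly why the fetch handler clones before handing one copy to the cache.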

2. Network First (Network Falling Back to Cache)

Try the network first. If it succeeds, cache the fresh response and return it. If the network fails (offline, timeout), fall back to the cached version.

Best for: API data, dynamic content that should be fresh when possible but available offline.

self.addEventListener("fetch", (event) => {
  event.respondWith(
    fetch(event.request)
      .then(response => {
        // Only cache successful responses — a cached 500 would otherwise
        // overwrite a good entry and be served as the offline fallback.
        if (response.ok) {
          const clone = response.clone();
          caches.open(CACHE_NAME).then(cache => cache.put(event.request, clone));
        }
        return response;
      })
      .catch(() => caches.match(event.request))
  );
});

3. Stale-While-Revalidate

Serve the cached version immediately for instant response, then fetch from the network in the background and update the cache. The user sees stale data this time but fresh data next time.

Best for: Data that is acceptable to be slightly stale — user avatars, product listings, article content, non-critical API responses.

self.addEventListener("fetch", (event) => {
  event.respondWith(
    caches.open(CACHE_NAME).then(cache =>
      cache.match(event.request).then(cached => {
        const networkFetch = fetch(event.request)
          .then(response => {
            cache.put(event.request, response.clone());
            return response;
          })
          // Swallow network errors: when offline, the cached copy has already
          // been returned, and the background revalidation failing would
          // otherwise surface as an unhandled promise rejection.
          .catch(() => cached);

        return cached || networkFetch;
      })
    )
  );
});

4. Network Only

Always go to the network. No caching at all.

Best for: Non-idempotent requests (POST, PUT, DELETE), real-time data (stock prices, chat messages), authentication endpoints.

self.addEventListener("fetch", (event) => {
  event.respondWith(fetch(event.request));
});

5. Cache Only

Only serve from cache. Never hit the network.

Best for: Precached resources that you know will always be in the cache (app shell files added during install).

self.addEventListener("fetch", (event) => {
  event.respondWith(caches.match(event.request));
});

Strategy Selection Guide

| Strategy | Speed | Freshness | Offline | Best For |
| --- | --- | --- | --- | --- |
| Cache First | Instant (if cached) | Stale until cache busted | Yes | Versioned static assets |
| Network First | Network speed | Always fresh (if online) | Yes (fallback) | API data, dynamic pages |
| Stale-While-Revalidate | Instant (if cached) | Fresh on next visit | Yes | Avatars, listings, articles |
| Network Only | Network speed | Always fresh | No | POST/PUT, real-time, auth |
| Cache Only | Instant | Stale forever | Yes | Precached app shell |

Quiz
Your PWA displays a product catalog. Users expect up-to-date prices but the app must work offline. Which caching strategy is most appropriate?

Advanced Patterns

Routing by Request Type

In production, you do not use one strategy for everything. Different resources get different strategies:

self.addEventListener("fetch", (event) => {
  const url = new URL(event.request.url);

  if (url.origin !== location.origin) return;

  if (event.request.destination === "image") {
    event.respondWith(cacheFirst(event.request, "images-v1"));
    return;
  }

  if (url.pathname.startsWith("/api/")) {
    event.respondWith(networkFirst(event.request, "api-v1"));
    return;
  }

  if (event.request.mode === "navigate") {
    event.respondWith(networkFirst(event.request, "pages-v1"));
    return;
  }

  event.respondWith(cacheFirst(event.request, "static-v1"));
});

async function cacheFirst(request, cacheName) {
  const cache = await caches.open(cacheName);
  const cached = await cache.match(request);
  if (cached) return cached;

  const response = await fetch(request);
  if (response.ok) cache.put(request, response.clone());
  return response;
}

async function networkFirst(request, cacheName) {
  const cache = await caches.open(cacheName);
  try {
    const response = await fetch(request);
    if (response.ok) cache.put(request, response.clone());
    return response;
  } catch {
    const cached = await cache.match(request);
    return cached || new Response("Offline", { status: 503 });
  }
}

Network Timeout Fallback

Network First is slow on bad connections. Add a timeout to fall back to cache faster:

async function networkFirstWithTimeout(request, cacheName, timeoutMs) {
  const cache = await caches.open(cacheName);

  try {
    const response = await Promise.race([
      fetch(request),
      new Promise((_, reject) =>
        setTimeout(() => reject(new Error("timeout")), timeoutMs)
      ),
    ]);
    if (response.ok) cache.put(request, response.clone());
    return response;
  } catch {
    const cached = await cache.match(request);
    return cached || new Response("Offline", { status: 503 });
  }
}
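One caveat with Promise.race: the losing fetch is abandoned, not canceled, so the request keeps consuming bandwidth in the background. A variant using AbortController cancels it for real. This is a sketch — fetchWithAbortTimeout and the injectable fetchFn parameter are illustrative, not standard APIs:

```javascript
// Cancel the network request itself on timeout instead of just racing past
// it. fetchFn is injectable for testing and defaults to the global fetch.
async function fetchWithAbortTimeout(request, timeoutMs, fetchFn = fetch) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    // The signal makes the fetch reject with an AbortError on timeout.
    return await fetchFn(request, { signal: controller.signal });
  } finally {
    clearTimeout(timer); // avoid a stray abort firing after success
  }
}
```

Dropped into networkFirst, the catch branch then falls back to cache for both genuine network failures and timeouts, with no orphaned request left running.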

Offline Fallback Page

When a navigation request fails and no cached version exists, show a custom offline page:

self.addEventListener("fetch", (event) => {
  if (event.request.mode === "navigate") {
    event.respondWith(
      fetch(event.request).catch(() =>
        caches.match("/offline.html")
      )
    );
  }
});

Quiz
Your service worker uses stale-while-revalidate. A user opens your app offline. They have a cached version from yesterday. What happens?

Cache Versioning and Cleanup

Without versioning, you have no way to invalidate stale caches. Without cleanup, old caches accumulate forever.

const CURRENT_CACHES = {
  static: "static-v4",
  api: "api-v2",
  images: "images-v1",
};

self.addEventListener("activate", (event) => {
  const expectedNames = new Set(Object.values(CURRENT_CACHES));

  event.waitUntil(
    caches.keys().then(names =>
      Promise.all(
        names
          .filter(name => !expectedNames.has(name))
          .map(name => caches.delete(name))
      )
    )
  );
});

Common Mistakes

Mistake: Not calling event.waitUntil() in install/activate handlers.
The browser can terminate a service worker at any time when it is idle. Without event.waitUntil(), your cache.addAll() or caches.delete() might not finish before the browser kills the worker. The promise passed to waitUntil() keeps the worker alive until the operation completes.
Fix: Always wrap async operations in event.waitUntil() to prevent the SW from terminating mid-operation.

Mistake: Caching opaque responses (cross-origin without CORS) without understanding the cost.
For security, browsers report opaque responses as consuming ~7MB of quota regardless of actual size. Caching 100 opaque image responses eats ~700MB of quota. Either use CORS headers on the CDN or be very selective about what you cache cross-origin.
Fix: Be aware that opaque responses are padded to ~7MB each in the storage quota.

Mistake: Using skipWaiting() without considering the consequences.
skipWaiting() activates the new SW immediately, but existing tabs still have HTML loaded from the old version. If the new SW serves updated CSS or JS to an old HTML page, things break. Either use skipWaiting() with clients.claim() and ensure backwards compatibility, or prompt the user to refresh.
Fix: Only use skipWaiting() when you handle the case where a new SW serves a page loaded with old assets.

Mistake: Not cleaning up old caches in the activate handler.
Without cleanup, every deployment adds a new cache version while old ones persist. On a weekly deploy cadence, you accumulate 52 caches per year, each potentially holding megabytes of data. This silently eats the user's storage quota.
Fix: Delete all caches that do not match the current version names during activation.

Mistake: Caching POST requests or responses with error status codes.
POST requests are not idempotent — caching them can lead to replaying side effects. Error responses (404, 500) should not be cached because they would prevent the browser from fetching the correct resource when the server recovers.
Fix: Only cache GET requests with 200 status codes and basic response types.
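Quota pressure also comes from runtime caches that grow without bound — an image cache under stale-while-revalidate, for instance. A common mitigation is trimming a cache to a maximum entry count. A browser-only sketch (trimCache is a hypothetical helper; it assumes cache.keys() returns entries in insertion order, oldest first, which is how the Cache API maintains its request list):

```javascript
// Trim a named cache down to maxEntries by evicting the oldest entries.
// Call after cache.put() in a fetch handler, wrapped in event.waitUntil()
// so the browser keeps the worker alive until trimming finishes.
async function trimCache(cacheName, maxEntries) {
  const cache = await caches.open(cacheName);
  const keys = await cache.keys(); // insertion order: oldest entries first
  if (keys.length <= maxEntries) return;
  const excess = keys.slice(0, keys.length - maxEntries);
  await Promise.all(excess.map(request => cache.delete(request)));
}
```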

Challenge: Build a Multi-Strategy Service Worker


Try to solve it before peeking at the answer.

// Build a service worker that applies different caching strategies
// based on the request type:
//
// 1. Navigation requests (HTML pages): Network First with 3s timeout
// 2. Static assets (JS, CSS, fonts): Cache First
// 3. Images: Stale-While-Revalidate
// 4. API requests (/api/*): Network First, no timeout
// 5. Everything else: Network Only
//
// Requirements:
// - Precache the app shell during install
// - Clean up old caches during activate
// - Show /offline.html when a navigation request fails
// - Never cache POST requests

Key Rules

  1. Service workers intercept fetch requests — they are programmable proxies between your app and the network. No SW means no offline capability.
  2. The SW lifecycle is Register → Install → Wait → Activate → Fetch. Changes to caches happen in install (precache) and activate (cleanup).
  3. Always call event.waitUntil() in install and activate handlers. Without it, the browser may kill the worker before async operations finish.
  4. Clone responses before caching — a Response body can only be consumed once. One clone for the cache, the original for the page.
  5. Route different request types to different strategies: Cache First for static assets, Network First for API data, Stale-While-Revalidate for images and non-critical content.
  6. Delete old caches in the activate handler. Without cleanup, stale caches accumulate and eat quota on every deployment.
  7. Never cache POST requests, error responses, or opaque cross-origin responses (they cost ~7MB each in quota).