
Dynamic Imports and Code Splitting

Intermediate · 17 min read

Stop Shipping Code Your Users Don't Need

Picture this: a user visits your homepage. Their browser downloads JavaScript for the homepage, the dashboard, the settings page, the admin panel, the analytics view, the user profile editor, and the 404 page. They only wanted the homepage. They just paid for seven pages' worth of JavaScript to see one.

Code splitting fixes this by breaking your monolithic bundle into smaller chunks loaded on demand. The user downloads code for the page they're on, and everything else loads when (and if) they navigate there.

Mental Model

Think of code splitting like chapters in an e-book vs. printing the entire encyclopedia. A monolithic bundle is like downloading the entire encyclopedia before you can read chapter 1. Code splitting downloads just the chapter you're reading, and pre-downloads the next chapter while you're still reading the current one. You get fast access to what you need now without waiting for everything.

The import() Expression

Dynamic import() is the mechanism that makes code splitting possible. It's an ES2020 feature that loads a module asynchronously and returns a Promise:

// Static import — bundled into the main chunk at build time
import { HeavyChart } from './components/HeavyChart';

// Dynamic import — creates a separate chunk, loaded on demand
const { HeavyChart } = await import('./components/HeavyChart');

When the bundler (webpack, Vite, Rollup) encounters a dynamic import(), it creates a split point: everything reachable from that import goes into a separate chunk file. At runtime, calling import() triggers an HTTP request for that chunk.
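Because import() returns a Promise, a common pattern is to cache that Promise so each chunk is requested at most once, no matter how many callers need it. A minimal sketch (it uses Node's built-in node:path as a stand-in target; the caching pattern, not the module, is the point):

```javascript
// Cache the Promise (not the resolved module) so concurrent callers
// share a single in-flight request for the same chunk.
const moduleCache = new Map();

function loadOnce(specifier) {
  if (!moduleCache.has(specifier)) {
    // import() returns a Promise for the module namespace object.
    moduleCache.set(specifier, import(specifier));
  }
  return moduleCache.get(specifier);
}

// 'node:path' stands in for an app chunk like './components/HeavyChart'.
async function demo() {
  const path = await loadOnce('node:path');
  return path.posix.join('pages', 'dashboard');
}
```

Calling loadOnce twice with the same specifier returns the same Promise, so the network is hit at most once per chunk.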

Build output:
  main-abc123.js          → 85KB  (shell, routing, shared components)
  chunk-dashboard-def.js  → 62KB  (dashboard page + its dependencies)
  chunk-settings-ghi.js   → 28KB  (settings page)
  chunk-chart-jkl.js      → 95KB  (charting library + wrapper)

Quiz: What happens at the bundler level when it sees const mod = await import('./HeavyModule')?

Route-Based Splitting (Next.js Automatic)

The highest-impact splitting strategy. Each route gets its own chunk, loaded only when the user navigates to that route.

In Next.js App Router, this happens automatically. Each page.tsx in the app/ directory becomes a separate chunk:

app/
  page.tsx           → main chunk
  dashboard/
    page.tsx         → dashboard chunk (loaded on navigation)
  settings/
    page.tsx         → settings chunk
  admin/
    page.tsx         → admin chunk

You don't need to configure anything; Next.js creates route-level chunks out of the box.

For non-Next.js React apps with React Router:

import { lazy, Suspense } from 'react';
import { Routes, Route } from 'react-router-dom';

const Dashboard = lazy(() => import('./pages/Dashboard'));
const Settings = lazy(() => import('./pages/Settings'));
const Admin = lazy(() => import('./pages/Admin'));

function App() {
  return (
    <Suspense fallback={<LoadingSkeleton />}>
      <Routes>
        <Route path="/dashboard" element={<Dashboard />} />
        <Route path="/settings" element={<Settings />} />
        <Route path="/admin" element={<Admin />} />
      </Routes>
    </Suspense>
  );
}

React.lazy() wraps the dynamic import and works with Suspense to show a fallback while the chunk loads.

Component-Level Splitting

Route-level splitting is the minimum. Component-level splitting goes further — heavy components within a page are loaded on demand.

import { lazy, Suspense, useState } from 'react';

const ChartDashboard = lazy(() => import('./components/ChartDashboard'));
const CodeEditor = lazy(() => import('./components/CodeEditor'));
const MarkdownPreview = lazy(() => import('./components/MarkdownPreview'));

function EditorPage() {
  const [showPreview, setShowPreview] = useState(false);

  return (
    <div>
      <Suspense fallback={<EditorSkeleton />}>
        <CodeEditor />
      </Suspense>

      <button onClick={() => setShowPreview(true)}>Show Preview</button>

      {showPreview && (
        <Suspense fallback={<PreviewSkeleton />}>
          <MarkdownPreview />
        </Suspense>
      )}
    </div>
  );
}

MarkdownPreview doesn't load until the user clicks "Show Preview." If they never click it, they never download that chunk. For a component that imports a heavy markdown parsing library (50KB+), this saves real bandwidth.

What to Split

Split components that are:

  • Heavy — imports large dependencies (chart libraries, code editors, rich text editors, PDF renderers)
  • Conditional — only shown under certain conditions (modals, expanded sections, admin panels)
  • Below the fold — not visible on initial page load (tabbed content, scrolled sections)

Don't split components that are:

  • Small — a 2KB component split creates more overhead (chunk request) than it saves
  • Always visible — if it's on every page load, splitting just adds a network request
  • Critical path — components needed for LCP (Largest Contentful Paint) should load immediately
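The heuristics above can be sketched as a rough back-of-the-envelope calculation. The numbers here (a ~1KB per-request overhead, a guessed probability that the user ever triggers the component) are illustrative assumptions, not measurements:

```javascript
// Rough model: splitting removes chunkSizeKB from the initial load, but
// costs an extra request (~overheadKB) whenever the chunk IS loaded.
// Expected KB saved per visit:
//   chunkSize - probabilityUsed * (chunkSize + overhead)
function expectedSavingsKB(chunkSizeKB, probabilityUsed, overheadKB = 1) {
  return chunkSizeKB - probabilityUsed * (chunkSizeKB + overheadKB);
}

// A 50KB chart shown to 30% of visitors: clearly worth splitting (≈ 34.7KB saved).
// A 3KB widget that 90% of visitors see: splitting costs slightly more than it saves (≈ -0.6KB).
```

By this model, splitting pays off when the component is large, rarely used, or both, which matches the bullet lists above.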

Quiz: You split a 3KB component into its own chunk. The chunk has HTTP overhead of ~1KB (headers, request/response). What's the net effect?

Prefetching Strategies

Code splitting solves the "too much JavaScript upfront" problem but introduces a new one: the user clicks a link and has to wait for the chunk to download before the page renders. Prefetching solves this by loading chunks in the background before the user needs them.

Start loading the chunk when the user hovers over a navigation link. There's usually a 200-400ms window between hover and click — enough to start (or complete) the chunk download.

import { Link } from 'react-router-dom';

function NavLink({ to, children }) {
  const prefetch = () => {
    if (to === '/dashboard') import('./pages/Dashboard');
    if (to === '/settings') import('./pages/Settings');
  };

  return (
    <Link
      to={to}
      onMouseEnter={prefetch}
      onFocus={prefetch}
    >
      {children}
    </Link>
  );
}

Next.js does this automatically with its Link component — it prefetches the linked page's chunk when the link enters the viewport.

Viewport Prefetching

Load chunks when the user scrolls a trigger element into view:

import { useEffect, useRef, useState } from 'react';

function LazySection({ importFn, fallback }) {
  const [Component, setComponent] = useState(null);
  const ref = useRef(null);

  useEffect(() => {
    const observer = new IntersectionObserver(
      ([entry]) => {
        if (entry.isIntersecting) {
          importFn().then(mod => setComponent(() => mod.default));
          observer.disconnect();
        }
      },
      { rootMargin: '200px' }
    );

    if (ref.current) observer.observe(ref.current);
    return () => observer.disconnect();
  }, [importFn]);

  return (
    <div ref={ref}>
      {Component ? <Component /> : fallback}
    </div>
  );
}

The rootMargin: '200px' starts loading 200px before the element scrolls into view — by the time the user sees it, the chunk is likely already loaded.

Idle Prefetching

Load lower-priority chunks during browser idle time:

function prefetchOnIdle(importFn) {
  if ('requestIdleCallback' in window) {
    requestIdleCallback(() => importFn());
  } else {
    setTimeout(() => importFn(), 2000);
  }
}

prefetchOnIdle(() => import('./pages/Settings'));
prefetchOnIdle(() => import('./pages/Profile'));

Webpack Magic Comments

Webpack supports special comments that control chunk behavior:

const Dashboard = lazy(() =>
  import(
    /* webpackChunkName: "dashboard" */
    /* webpackPrefetch: true */
    './pages/Dashboard'
  )
);

  • webpackChunkName — names the output chunk file (useful for debugging)
  • webpackPrefetch — adds <link rel="prefetch"> to the document head, loading the chunk during browser idle time
  • webpackPreload — adds <link rel="preload">, loading the chunk immediately in parallel with the current chunk
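What webpackPrefetch emits can be approximated by hand in framework-free code: inject the same hint yourself at runtime. A hedged sketch (assumes a browser environment; the chunk URL is hypothetical):

```javascript
// Mimics what webpackPrefetch arranges at build time: a low-priority
// prefetch hint so the browser fetches the chunk during idle time.
function addPrefetchHint(chunkUrl) {
  const link = document.createElement('link');
  link.rel = 'prefetch';
  link.as = 'script';
  link.href = chunkUrl;
  document.head.appendChild(link);
  return link;
}

// Usage (hypothetical chunk name from the build output above):
// addPrefetchHint('/chunk-dashboard-def.js');
```

Because rel="prefetch" is low priority, the hint doesn't compete with the current page's critical resources, unlike rel="preload".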

Quiz: You add /* webpackPrefetch: true */ to a dynamic import. What does webpack generate in the HTML?

Measuring Split Effectiveness

Splitting without measuring is guessing. Here's how to verify your splits are actually helping:

1. Check the Coverage Tab in DevTools

Chrome DevTools → Sources → Coverage (or Ctrl+Shift+P → "Show Coverage"). This shows how much of each loaded JavaScript file is actually executed:

File                        Size     Used    Unused
main.js                    120KB     95KB    25KB (79% used)
chunk-dashboard.js          62KB     58KB     4KB (93% used)
vendor-charts.js            95KB     12KB    83KB (12% used) ← problem!

If a chunk has low usage percentage, it might be:

  • Too coarse — split it further
  • Loaded too early — defer it
  • Including unused dependencies — check for barrel file issues

2. Compare Initial Load Size

Before splitting:   main.js = 450KB (everything in one chunk)
After splitting:    main.js = 120KB, loaded on demand = 330KB across 8 chunks

Initial load: 450KB → 120KB = 73% reduction
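The arithmetic behind that comparison, as a quick sanity-check helper:

```javascript
// Percentage reduction in the initial payload after splitting.
function reductionPercent(beforeKB, afterKB) {
  return Math.round(((beforeKB - afterKB) / beforeKB) * 100);
}

// reductionPercent(450, 120) → 73
```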

3. Monitor Core Web Vitals

Track LCP (Largest Contentful Paint) and INP (Interaction to Next Paint) before and after splitting. If LCP improves but interactions feel sluggish (because chunks are loading on demand), you may need to adjust your prefetching strategy.

Avoiding Over-Splitting

More splits aren't always better. Each split adds:

  • An HTTP request (latency)
  • Chunk metadata in the main bundle (bytes)
  • Complexity in loading orchestration

Signs of over-splitting:

  • Dozens of tiny chunks (under 10KB each)
  • Waterfall loading chains (chunk A loads chunk B loads chunk C)
  • Visible loading delays on common user paths
  • Coverage tab shows most chunks at 90%+ usage (nothing to gain from splitting further)

The sweet spot: split at natural boundaries (routes, heavy components, conditional features) and prefetch aggressively.

What developers do vs. what they should do:

  • Splitting every component into its own chunk. Tiny chunks have more HTTP overhead than they save; the bundler's minSize threshold exists for a reason. Instead: only split heavy (20KB+), conditional, or below-fold components.
  • Code splitting without prefetching. Without prefetching, every navigation requires downloading a chunk, creating visible delays. Instead: combine splitting with hover, viewport, or idle prefetching.
  • Using webpackPreload instead of webpackPrefetch for next-page chunks. Preload downloads at high priority immediately, so overusing it competes with critical resources for the current page; prefetch downloads at low priority during idle time. Instead: use prefetch for anticipated navigation, preload for resources needed on the current page.
  • Wrapping every lazy component in its own Suspense boundary. Too many independent Suspense boundaries create jarring UX, with multiple loading spinners appearing and disappearing at different times. Instead: group related lazy components under shared Suspense boundaries.
Key Rules
  1. Dynamic import() creates split points — the bundler extracts everything reachable from that import into a separate chunk loaded on demand.
  2. Route-based splitting is the highest-impact strategy. In Next.js, it's automatic. In React Router, use React.lazy().
  3. Split components that are heavy (20KB+), conditional (modals, toggles), or below the fold. Don't split small or always-visible components.
  4. Always pair code splitting with prefetching — hover, viewport, idle, or link rel='prefetch'. Loading without prefetching creates visible delays.
  5. Measure split effectiveness with the DevTools Coverage tab, initial load size comparison, and Core Web Vitals monitoring.
  6. Avoid over-splitting: dozens of tiny chunks cause more overhead than they save. Split at natural boundaries.