Concurrent Mode & Time Slicing

Advanced · 22 min read

The Synchronous Rendering Problem

Picture this: you click a button and your app freezes for 200ms. You can't type, you can't scroll, you can't do anything. In synchronous rendering, when React starts processing an update, it runs to completion. Every component in the tree is called, every fiber is processed, every DOM mutation is applied — all in one uninterruptible block.

User clicks button
  ↓
React starts rendering (200ms of work)
  ├─ Component A renders (5ms)
  ├─ Component B renders (15ms)
  ├─ Component C renders (50ms)     ← user types in an input here
  ├─ Component D renders (80ms)     ← browser cannot respond to keystroke
  └─ Component E renders (50ms)
  ↓
DOM updated
  ↓
Browser finally processes the keystroke (200ms late)

For 200ms, the main thread is blocked. The browser cannot process input events, run animations, or paint frames. The user perceives the UI as frozen.

Mental Model

Synchronous rendering is like a single-lane highway with no exits. Once you enter, you must drive to the end — even if an ambulance (user input) needs to pass. Concurrent rendering adds exit ramps every 5ms. React can pull off the highway, let the ambulance through, and then get back on.

Concurrent Rendering: Interruptible Work

This is React's biggest architectural bet — and it paid off. Concurrent rendering breaks the render phase into small units of work (one fiber per unit) and yields to the browser between units. This is possible because Fiber's linked-list architecture allows React to pause at any fiber and resume later.

User clicks button
  ↓
React starts rendering (same 200ms total work)
  ├─ Process fibers for 5ms → yield to browser
  │   Browser: nothing pending → resume
  ├─ Process fibers for 5ms → yield to browser
  │   Browser: nothing pending → resume
  ├─ Process fibers for 5ms → yield to browser
  │   Browser: USER TYPED! → process keystroke → React resumes
  ├─ Process fibers for 5ms → yield
  │   ...continues in chunks...
  └─ All fibers processed
  ↓
DOM updated

The total render time may even increase slightly (scheduling overhead), but the user never perceives a freeze because the browser gets regular chances to respond to events and paint frames.

Time Slicing: The 5ms Budget

You might wonder why 5ms specifically. It's not arbitrary — it is calibrated to keep frame budgets intact. At 60fps, each frame has 16.7ms. If React uses 5ms, the browser has 11.7ms for layout, paint, compositing, and event processing.

// From React's SchedulerFeatureFlags
const frameYieldMs = 5;

// The shouldYield function (simplified)
function shouldYield() {
  const currentTime = getCurrentTime();
  return currentTime >= deadline; // deadline = startTime + 5ms
}

// The work loop checks between every fiber
function workLoopConcurrent() {
  while (workInProgress !== null && !shouldYield()) {
    performUnitOfWork(workInProgress);
  }
}

When shouldYield() returns true, React stops processing fibers. It schedules a continuation via MessageChannel and returns control to the browser's event loop. It avoids setTimeout because browsers clamp nested timeouts to a minimum of roughly 4ms, which would waste most of each 5ms budget just waiting.

// React uses MessageChannel for scheduling, not setTimeout
const channel = new MessageChannel();
channel.port1.onmessage = performWorkUntilDeadline;

function requestHostCallback(callback) {
  scheduledCallback = callback;
  channel.port2.postMessage(null); // Fires a task on the next event-loop turn (no timer clamping)
}
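The slicing mechanics above can be exercised outside React. The sketch below is a toy model, not React's actual scheduler: it reuses the frameYieldMs budget and the shouldYield check, but injects a fake clock (each hypothetical unit of work "costs" 2ms) so the slice boundaries are deterministic and visible.

```javascript
// Toy model of the 5ms slice budget (not React's actual code).
// The clock is injected so slice boundaries are deterministic.
const frameYieldMs = 5;

function createWorkLoop(getCurrentTime) {
  let deadline = 0;
  return {
    startSlice() { deadline = getCurrentTime() + frameYieldMs; },
    shouldYield() { return getCurrentTime() >= deadline; },
  };
}

// Fake clock: each unit of work advances time by 2ms.
let now = 0;
const loop = createWorkLoop(() => now);

const slices = []; // which fibers ran in each 5ms slice
let current = [];

loop.startSlice();
for (let fiber = 0; fiber < 8; fiber++) {
  if (loop.shouldYield()) { // checked between fibers, never mid-fiber
    slices.push(current);   // yield point: the browser would get the thread here
    current = [];
    loop.startSlice();      // continuation starts a fresh 5ms budget
  }
  current.push(fiber);      // stand-in for performUnitOfWork
  now += 2;
}
slices.push(current);

console.log(slices); // → [[0, 1, 2], [3, 4, 5], [6, 7]]
```

Three slices for 16ms of total work: between each slice, a real scheduler would post a MessageChannel message and let the browser handle events before resuming.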
Quiz
Why does React use MessageChannel instead of setTimeout(fn, 0) for scheduling work chunks?

Priority Lanes

Here's the thing most people miss about concurrent React: it's not just about yielding — it's about prioritizing. Not all updates are equally important. React assigns each update a "lane" — a priority level encoded as a bitmask.

Lane Hierarchy (highest to lowest priority):

SyncLane              → Discrete user input (click, keypress)
InputContinuousLane   → Continuous input (drag, scroll, mousemove)
DefaultLane           → Normal updates (setState from fetch, initial render)
TransitionLane(s)     → startTransition updates (16 transition lanes available)
RetryLane(s)          → Suspense retries after promise resolution
IdleLane              → Offscreen, prefetch, low-priority background work

How Lanes Work as Bitmasks

// Lanes are powers of 2 — each occupies one bit position
const SyncLane            = 0b0000000000000000000000000000001;
const InputContinuousLane = 0b0000000000000000000000000000100;
const DefaultLane         = 0b0000000000000000000000000010000;
const TransitionLane1     = 0b0000000000000000000001000000000;
const TransitionLane2     = 0b0000000000000000000010000000000;
const IdleLane            = 0b0100000000000000000000000000000;

// Merge lanes: which priorities have pending work?
const pendingLanes = SyncLane | TransitionLane1;
// = 0b0000000000000000000001000000001

// Check: is there sync work pending?
if (pendingLanes & SyncLane) {
  // Yes — process sync lane first
}

// Find highest priority pending lane:
const nextLane = pendingLanes & -pendingLanes; // Isolates lowest set bit
// = SyncLane (highest priority = lowest bit value)

Lane Assignment

React assigns lanes based on the context of the update:

Update Source                        Lane Assigned
onClick handler calling setState     SyncLane
onScroll handler calling setState    InputContinuousLane
fetch().then(() => setState())       DefaultLane
startTransition(() => setState())    TransitionLane
useDeferredValue internal update     TransitionLane
Offscreen/hidden content             IdleLane
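The priority ordering can be made concrete with the bitmask trick from the previous section. This is a small sketch with hypothetical lane constants mirroring the binary values above, not React's actual exports:

```javascript
// Hypothetical lane constants mirroring the bitmask scheme above.
const SyncLane            = 0b0000000000000000000000000000001; // 1
const InputContinuousLane = 0b0000000000000000000000000000100; // 4
const DefaultLane         = 0b0000000000000000000000000010000; // 16
const TransitionLane1     = 0b0000000000000000000001000000000; // 512

function getHighestPriorityLane(lanes) {
  return lanes & -lanes; // isolate the lowest set bit = highest priority
}

// Drain pending lanes in priority order, the way the scheduler
// picks which lane to render next.
function drainLanes(pendingLanes) {
  const order = [];
  while (pendingLanes !== 0) {
    const lane = getHighestPriorityLane(pendingLanes);
    order.push(lane);
    pendingLanes &= ~lane; // mark this lane's work as finished
  }
  return order;
}

// A click and a startTransition update arrive together: sync wins.
console.log(drainLanes(TransitionLane1 | SyncLane)); // → [1, 512]
```

Because lanes are powers of two, "find the most urgent work" is a single two's-complement AND rather than a loop over priority levels.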
Quiz
A component receives a click event (SyncLane) and a startTransition update (TransitionLane) in the same event handler. Which runs first?

useTransition

Now let's see how you actually use all of this. useTransition lets you mark a state update as non-urgent. React processes it at TransitionLane priority — interruptible and yieldable.

function SearchPage() {
  const [query, setQuery] = useState('');
  const [results, setResults] = useState([]);
  const [isPending, startTransition] = useTransition();

  function handleChange(e) {
    const value = e.target.value;
    setQuery(value);  // Urgent: update input immediately (SyncLane)

    startTransition(() => {
      setResults(filterLargeDataset(value));  // Non-urgent: can be interrupted
    });
  }

  return (
    <div>
      <input value={query} onChange={handleChange} />
      {isPending && <Spinner />}
      <ResultsList results={results} />
    </div>
  );
}

What happens when the user types "abc" quickly:

  1. Type "a": setQuery("a") at SyncLane — input updates instantly. startTransition schedules setResults(filterLargeDataset("a")) at TransitionLane.
  2. Type "b" before transition finishes: React interrupts the "a" transition. setQuery("ab") at SyncLane — input updates instantly. New transition for "ab" replaces "a".
  3. Type "c": Same pattern. Input always reflects latest keystroke. Only the final "abc" transition completes.

The user sees instant input feedback. The expensive list rendering happens in the background and is automatically debounced by interruption. No debounce utility needed — React handles it natively through interruption.
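The "debounce by interruption" behavior can be modeled in a few lines of plain JavaScript. This is a toy sketch (createTransitionQueue is invented for illustration, not a React API): each newly scheduled transition replaces the pending one, so only the latest value ever reaches the render function.

```javascript
// Toy model of debounce-by-interruption (not a React API):
// a newer transition replaces the pending one before it can render.
function createTransitionQueue(renderFn) {
  let pending = null;
  return {
    schedule(value) { pending = value; }, // interrupt: replace older work
    flush() {                             // runs when the thread finally goes idle
      if (pending === null) return null;
      const out = renderFn(pending);
      pending = null;
      return out;
    },
  };
}

const rendered = [];
const queue = createTransitionQueue(v => { rendered.push(v); return v; });

// User types "a", "b", "c" faster than each transition can finish:
queue.schedule('a');
queue.schedule('b'); // replaces the "a" transition
queue.schedule('c'); // replaces the "b" transition
queue.flush();

console.log(rendered); // → ['c'] (only the final transition completed)
```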

How interruption works internally

When React is in the middle of rendering the TransitionLane tree and a SyncLane update arrives:

  1. The work loop's shouldYield() returns true (higher priority work detected).
  2. React sets workInProgress = null, discarding the partially-built transition tree.
  3. React processes the SyncLane update: full render → commit → DOM update.
  4. The browser handles events and paints.
  5. React restarts the transition render from the root with the latest state (including the new SyncLane state).

The interrupted transition work is thrown away. This is safe because the render phase has no side effects — no DOM was modified. Fiber's architecture makes the cost of abandoning work proportional to the work done, not the total tree size.
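The discard-and-restart sequence can be sketched as a toy simulation. Nothing below is React code; render is a stand-in that builds a list of "fiber outputs" and bails out (returning null, like setting workInProgress = null) when a sync update arrives between fibers:

```javascript
// Toy model of discard-and-restart (not React internals): a transition
// render is abandoned when a sync update lands mid-way, then restarted
// from the root with the latest state.
function render(tree, state, interruptAfter, onInterrupt) {
  const work = [];
  for (let i = 0; i < tree.length; i++) {
    if (i === interruptAfter && onInterrupt) {
      onInterrupt();  // sync update detected between fibers
      return null;    // workInProgress = null: discard the partial tree
    }
    work.push(`${tree[i]}:${state}`);
  }
  return work;        // complete tree, ready to commit
}

const tree = ['A', 'B', 'C', 'D'];
let state = 'v1';

// Transition render starts with v1, interrupted after 2 fibers by a
// sync update that changes state to v2.
const first = render(tree, state, 2, () => { state = 'v2'; });
console.log(first); // → null (partial work discarded, nothing committed)

// Restart from the root with the latest state: no mixed v1/v2 output.
const second = render(tree, state, Infinity, null);
console.log(second); // → ['A:v2', 'B:v2', 'C:v2', 'D:v2']
```

Discarding is safe precisely because nothing was committed: the user never sees a tree that mixes old and new state.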

useDeferredValue

useDeferredValue defers an individual value rather than wrapping a state update. It returns a copy of the value that "lags behind" — React first renders with the old deferred value, then re-renders in the background with the new value.

function SearchResults({ query }) {
  const deferredQuery = useDeferredValue(query);
  const isStale = query !== deferredQuery;

  // First render: deferredQuery is still the old value (fast, reuses cached render)
  // Background render: deferredQuery updates to new value (interruptible)
  const results = useMemo(() => filterResults(deferredQuery), [deferredQuery]);

  return (
    <div style={{ opacity: isStale ? 0.7 : 1 }}>
      <ResultsList results={results} />
    </div>
  );
}

useTransition vs useDeferredValue

                       useTransition                        useDeferredValue
Controls               The state update (wraps setState)    The consumed value (wraps the value)
When to use            You own the state update             You receive a value as a prop (can't control how it's set)
Returns                [isPending, startTransition]         The deferred value
Who decides priority   The update producer                  The update consumer
Quiz
A parent component passes a query prop that changes on every keystroke. The child component does expensive filtering based on query. The child cannot modify how the parent sets state. Which hook should the child use?

How React Decides What to Interrupt

You might think React is making ad-hoc decisions about when to interrupt. It's not. The decision follows specific rules:

  1. Only render-phase work can be interrupted. The commit phase (DOM mutations) is always synchronous.
  2. Only lower-priority work yields to higher-priority work. TransitionLane yields to SyncLane. SyncLane never yields — it runs to completion.
  3. Interruption granularity is one fiber. React checks shouldYield() after processing each fiber, not mid-fiber.
  4. Interrupted work is discarded, not paused. The partially-built workInProgress tree is abandoned. React starts over with the latest state.
Priority escalation example:

t=0ms:   TransitionLane render starts
t=3ms:   Processing fiber #47... shouldYield()? No (< 5ms)
t=5ms:   Processing fiber #52... shouldYield()? Yes (≥ 5ms)
           → React yields to browser
t=5.1ms: Browser processes queued events
           → User clicked a button! setState at SyncLane
t=5.2ms: React detects SyncLane work pending
           → Abandons transition render (workInProgress = null)
           → Processes SyncLane render (uninterruptible)
t=8ms:   SyncLane commit complete, DOM updated
t=8.1ms: React restarts TransitionLane render from scratch

Concurrent Features Dependency Chain

All concurrent features are built on Fiber + Lanes:

Fiber (linked-list, interruptible traversal)
  ↓
Work Loop (process one fiber, check shouldYield)
  ↓
Lanes (assign priority, determine render order)
  ↓
Built on these primitives:
  ├── useTransition (mark update as TransitionLane)
  ├── useDeferredValue (split value into sync + deferred render)
  ├── Suspense (throw promise → show fallback → retry at same lane)
  ├── React.lazy (code-split components + Suspense)
  └── startTransition (lower-level API, no isPending state)
Quiz
During a concurrent render at TransitionLane priority, React has processed 200 fibers out of 500. A SyncLane update arrives. What happens to the 200 already-processed fibers?

Practical Patterns

Keeping an Input Responsive

function FilterableList({ items }) {
  const [text, setText] = useState('');
  const [isPending, startTransition] = useTransition();

  const [filteredItems, setFilteredItems] = useState(items);

  function handleChange(e) {
    const value = e.target.value;  // Capture before the transition callback runs
    setText(value);
    startTransition(() => {
      setFilteredItems(items.filter(item =>
        item.name.toLowerCase().includes(value.toLowerCase())
      ));
    });
  }

  return (
    <>
      <input value={text} onChange={handleChange} />
      {isPending ? <Spinner /> : null}
      {filteredItems.map(item => <ListItem key={item.id} item={item} />)}
    </>
  );
}

Tab Switching Without Jank

function Dashboard() {
  const [tab, setTab] = useState('overview');
  const [isPending, startTransition] = useTransition();

  function selectTab(nextTab) {
    startTransition(() => {
      setTab(nextTab);
    });
  }

  return (
    <div>
      <TabBar selected={tab} onSelect={selectTab} isPending={isPending} />
      <Suspense fallback={<TabSkeleton />}>
        {tab === 'overview' && <Overview />}
        {tab === 'analytics' && <Analytics />}  {/* Expensive */}
        {tab === 'reports' && <Reports />}      {/* Expensive */}
      </Suspense>
    </div>
  );
}

If the user clicks "Analytics" and then quickly clicks "Reports" before Analytics finishes rendering, React interrupts the Analytics transition and starts rendering Reports. The user never sees a half-rendered Analytics tab.

Common Trap

Concurrent rendering is not parallel rendering. React still runs on the main thread. "Concurrent" means React can interleave multiple render passes (yielding between them), not that they run simultaneously. There is exactly one fiber being processed at any moment. The concurrency is cooperative — React voluntarily yields, the browser gets control, then React resumes.

Key Rules
  1. Concurrent rendering breaks work into 5ms chunks. Between chunks, the browser can process events and paint frames.
  2. Lanes assign priority: SyncLane (clicks) > InputContinuousLane (drag) > DefaultLane > TransitionLane > IdleLane.
  3. Higher-priority updates interrupt lower-priority renders. Interrupted work is discarded and restarted.
  4. useTransition wraps setState to mark it as non-urgent. Use when you own the state update.
  5. useDeferredValue defers a received value. Use when you consume a value you don't control.
  6. Only the render phase is interruptible. The commit phase (DOM mutations) is always synchronous.
  7. Concurrent mode is cooperative, single-threaded concurrency — not parallelism.
Interview Question

Q: How does React's concurrent rendering keep the UI responsive without Web Workers or multiple threads?

A strong answer covers: Fiber's linked-list structure enabling pause/resume (no call stack dependency), the work loop yielding every 5ms via shouldYield(), MessageChannel-based scheduling (not setTimeout), lane-based priority system where user input (SyncLane) interrupts background rendering (TransitionLane), abandoned in-progress work being safely discarded because the render phase is pure (no side effects), and the distinction between concurrent (interleaved on one thread) vs parallel (simultaneous on multiple threads).