
Virtual DOM: The Secret of React's Speed
Why direct DOM manipulation is slow. How Virtual DOM works (Diffing algorithm). React Fiber and incremental rendering explained.


Back in 2015, I was building a bulletin board with jQuery. To delete one item from a list of 100, I did this:
// Directly find and remove the DOM
$('#item-42').remove();
It was intuitive and fast. But as the data grew to 1,000 items, with filtering features and like buttons, hell broke loose. The data (Array) changed but the UI (DOM) didn't update, or vice versa. Sync bugs were everywhere.
Then React appeared. "Don't touch the DOM directly. Just change the state. The UI will update itself."
I was skeptical. "Won't re-rendering everything be slow?" But React was fast. The secret was Virtual DOM.
In 2017, I introduced React to a product for the first time. It was a customer management dashboard displaying a list of around 500 customers, each row with an "Edit" button. During development, I tested with only 10 dummy records, so everything seemed fine. But after deployment, disaster struck.
Every single keystroke in the search box froze the screen for a second. Users complained, "Is this thing broken?" I panicked. React was supposed to be "fast," so what went wrong?
I opened Chrome DevTools' Performance tab and witnessed something shocking. Every time the search query changed, all 500 list items re-rendered. No matter how fast Virtual DOM is, re-rendering 500 items every keystroke will kill performance.
I traced the issue to my use of index as the key prop.
// 🔴 Wrong code
{customers.map((customer, index) => (
  <CustomerRow key={index} data={customer} />
))}
When you use the index as the key, React identifies rows only by position: "item 0, item 1, item 2...". When the data gets filtered, every index shifts, so React thinks almost every row changed. What I should have done was use the unique ID (customer.id) as the key and wrap the component with React.memo.
// ✅ Correct code
const CustomerRow = React.memo(({ data }) => {
  return <div>{data.name}</div>;
});

{customers.map(customer => (
  <CustomerRow key={customer.id} data={customer} />
))}
After this fix, the typing lag disappeared. That's when I realized Virtual DOM isn't magic. It's just a tool, and using it correctly is the developer's job.
To understand React, you first need to know how browsers render pages. I initially thought, "Just read the HTML file and display it, right?" But it's way more complex.
Roughly: the browser parses HTML into the DOM tree and CSS into the CSSOM tree, combines them into a Render Tree (skipping invisible nodes like display: none elements), computes the Layout of every node, and finally Paints pixels to the screen. The key here is Reflow: re-running that Layout step. It is extremely expensive because it requires recalculating the coordinates and sizes of elements based on CSS properties like position, width, height, and margin.
// 🔴 Bad example: Touching the DOM 100 times
for (let i = 0; i < 100; i++) {
  document.body.innerHTML += `<div>${i}</div>`;
}
This code theoretically triggers Reflow 100 times. (Browsers do optimize this, but it's still slow). It's like reorganizing your bookshelf every time you insert a single book.
This analogy clicked for me. "Ah, it's not that DOM manipulation itself is slow, but triggering Reflow too frequently is the problem." That's why old-school developers used DocumentFragment to do all the work in memory and attach it to the DOM once.
// ✅ Good example: Attach once
const fragment = document.createDocumentFragment();
for (let i = 0; i < 100; i++) {
  const div = document.createElement('div');
  div.textContent = i;
  fragment.appendChild(div);
}
document.body.appendChild(fragment); // Reflow happens just once
But writing code this way is too cumbersome. React's Virtual DOM automates this "work in memory and attach once" pattern.
Virtual DOM is a fake DOM that exists in memory. It's just a JavaScript object. When I first heard "Virtual DOM," I thought it was some advanced tech. But in reality, it's this simple:
// What Virtual DOM actually is
const vdom = {
  type: 'div',
  props: {
    className: 'container',
    children: [
      { type: 'h1', props: { children: 'Hello' } },
      { type: 'p', props: { children: 'World' } }
    ]
  }
};
"That's it?" I thought. But this simple structure brought massive performance gains.
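To see how little magic is involved, here is a minimal sketch of my own (not React's code) that serializes such an object into an HTML string. Children handling and props are heavily simplified:

```javascript
// Minimal sketch: turn a Virtual DOM object into an HTML string.
// Only handles `className` and `children`; real renderers do far more.
function renderToString(node) {
  if (typeof node === 'string' || typeof node === 'number') {
    return String(node); // text nodes are plain strings
  }
  const { children, className } = node.props || {};
  const attrs = className ? ` class="${className}"` : '';
  // children may be a single value or an array; normalize to an array
  const kids = [].concat(children ?? []).map(renderToString).join('');
  return `<${node.type}${attrs}>${kids}</${node.type}>`;
}
```

Feeding the vdom object above into this function produces the exact markup the browser would build, which is the point: the "virtual" tree is just data you can transform.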
When you call setState(newState), React builds a new Virtual DOM tree, diffs it against the previous one, and applies only the differences to the real DOM in a single batch. Many people compare this to "Double Buffering" in game development. In games, if you draw directly to the visible screen (Front Buffer), you get flickering. So you draw to an invisible buffer (Back Buffer) first, then swap when it's done. Virtual DOM works the same way.
Analogy: "Instead of printing every edit (Real DOM), you finish all edits in a word processor (Virtual DOM) and hit print once (Batch Update)."
Once I accepted this analogy, Virtual DOM's purpose became crystal clear. "Minimize expensive operations (Reflow)."
Perfectly comparing two trees is normally O(n³). With 1,000 nodes, that's a billion operations. Way too slow. React used bold heuristics to reduce this to O(n).
React's diffing rests on two heuristics. First, elements of different types produce different trees: if a <div> changes to a <span>, don't compare their children at all — just destroy the subtree and rebuild it. Second, the developer uses the key prop to tell React which list children are stable across renders.
At first, I thought, "Isn't this imperfect comparison? Won't it cause bugs?" But in practice, <div> rarely suddenly becomes <span>. React chose "practicality" over "perfection", and it worked.
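The first heuristic fits in a few lines. This is a simplification of my own, not React's source, but it captures the O(1)-per-node decision:

```javascript
// Sketch of React's type heuristic: different type => replace the subtree.
function diff(oldNode, newNode) {
  if (oldNode.type !== newNode.type) {
    // Don't recurse into children — tear down and rebuild.
    return { op: 'REPLACE', node: newNode };
  }
  // Same type: keep the DOM node, patch props, and recurse into children.
  return { op: 'UPDATE', props: newNode.props };
}
```

By refusing to compare children across type changes, the algorithm never has to consider moving subtrees between arbitrary positions, which is what makes the naive comparison cubic.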
<!-- Before -->
<ul>
  <li>Apple</li>
  <li>Banana</li>
</ul>

<!-- After: "Orange" added at the top -->
<ul>
  <li>Orange</li>
  <li>Apple</li>
  <li>Banana</li>
</ul>
Without keys, React compares position by position: slot 0 (Apple → Orange: text update), slot 1 (Banana → Apple: text update), slot 2 (empty → Banana: create). 3 DOM mutations happen.
With keys, the picture changes:

<ul>
  <li key="orange">Orange</li>
  <li key="apple">Apple</li>
  <li key="banana">Banana</li>
</ul>
React: "Oh? key="apple" and key="banana" are still there. Just moved positions!"
Just 1 DOM insertion (Orange is created); the existing Apple and Banana nodes are reused as-is. This difference becomes huge as the list grows.
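The key-matching idea can be sketched like this (my simplification — React's real algorithm also tracks moves and ordering):

```javascript
// Sketch: match list children by key to decide what to create vs reuse.
function diffChildren(oldChildren, newChildren) {
  const oldByKey = new Map(oldChildren.map(c => [c.key, c]));
  const ops = [];
  for (const child of newChildren) {
    if (oldByKey.has(child.key)) {
      ops.push({ op: 'REUSE', key: child.key }); // existing DOM node kept
    } else {
      ops.push({ op: 'CREATE', key: child.key }); // only truly new items built
    }
  }
  return ops;
}
```

Run it on the fruit example and only "orange" triggers a CREATE; "apple" and "banana" are found in the map and reused. With index keys, the lookup would match by position instead, and every shifted row would look new.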
How did I find that "search box typing lag" issue I mentioned earlier? Thanks to React DevTools' Profiler. Surprisingly, many React developers don't know about this tool. I didn't know about it for a whole year.
In my case, CustomerRow was rendering 500 times, and the reason was "Props changed". But the props didn't actually change. Why?
The parent component was creating a new object every time.
// 🔴 Problem: Creating a new object every render
<CustomerRow data={{ ...customer, timestamp: Date.now() }} />
React compares object references. Even if the content is the same, different addresses mean "different objects". After fixing this, performance normalized.
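That reference comparison behaves roughly like the following shallow-equality sketch (my own simplification, not React's source):

```javascript
// Sketch of a shallow props comparison, similar in spirit to what
// React.memo does by default.
function shallowEqual(a, b) {
  if (Object.is(a, b)) return true;
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  if (keysA.length !== keysB.length) return false;
  // Each value is compared by reference, never deeply.
  return keysA.every(k => Object.is(a[k], b[k]));
}
```

This is exactly why data={{ ...customer }} defeats memoization: the spread creates a brand-new object every render, so the reference check fails even when every field is identical.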
React.memo memoizes components. If props don't change, it reuses the previous render result. I initially thought, "So I should wrap all components with React.memo?" Big mistake.
Memoization isn't free: React has to store the previous props and compare them on every render. For a trivial component like <div>Hello</div>, it is faster to just re-render it. After accepting this principle, I stopped "wrapping everything with React.memo". Instead, I use Profiler to find bottlenecks and apply it only where it actually matters.
Up to React 15, React used the Stack Reconciler: once rendering started, it couldn't stop. If comparing a huge tree took longer than 16ms (the 60fps frame budget), the screen would jank. React 16's Fiber architecture fixed this by splitting rendering into small units of work that can be paused, resumed, or thrown away.
I compared this to "OS Scheduler". OS appears to run multiple processes simultaneously, but actually uses time slicing to switch between them. Fiber works the same way.
Thanks to this, React provides smoother UX, especially noticeable in large lists or complex animations.
Even if Virtual DOM is fast, rendering too frequently still slows things down. That's where useMemo and useCallback come in.
// Parent component
function Parent() {
  const [count, setCount] = useState(0);

  // 🔴 Problem: This function is treated as "new" every render
  const handleClick = () => { console.log("Click"); };

  return (
    <>
      <button onClick={() => setCount(count + 1)}>Count: {count}</button>
      {/* Child thinks props changed, causing unnecessary re-render */}
      <Child onClick={handleClick} />
    </>
  );
}

// Child component
const Child = React.memo(({ onClick }) => {
  console.log("Child Rendered!"); // keeps logging :(
  return <button onClick={onClick}>Child</button>;
});
handleClick is created as a new function every render. In JavaScript, functions are reference types, so () => {} gets a different address each time. So even React.memo-wrapped Child thinks "props changed" and re-renders.
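You can verify this reference behavior in plain JavaScript, no React required:

```javascript
// Two function expressions with identical bodies are still distinct objects.
const a = () => console.log('Click');
const b = () => console.log('Click');
console.log(a === b); // false — functions compare by reference, not by body
```

Every render of Parent evaluates the arrow function expression again, producing a fresh reference just like b above, so React.memo's shallow comparison sees a changed prop.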
// ✅ Solution: Reuse function if dependencies ([]) don't change
const handleClick = useCallback(() => {
  console.log("Click");
}, []);
Now even when Parent re-renders, handleClick maintains the same reference, so Child doesn't re-render.
function ExpensiveComponent({ data }) {
  // 🔴 Problem: Recalculates every render even if data didn't change
  const result = computeExpensiveValue(data);
  return <div>{result}</div>;
}

// ✅ Solution: Recalculate only when data changes
function ExpensiveComponent({ data }) {
  const result = useMemo(() => computeExpensiveValue(data), [data]);
  return <div>{result}</div>;
}
useMemo caches values, useCallback caches functions. It took me a while to understand this difference. The bottom line: "Prevent unnecessary recalculations/recreations."
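Conceptually, both hooks boil down to the same trick: cache a result and invalidate it when the dependencies change. A toy sketch of that idea (assumption: a single cache slot, unlike React's per-hook slots tracked per component):

```javascript
// Toy sketch of dependency-based caching — the idea behind useMemo/useCallback.
function createMemo() {
  let lastDeps = null;
  let lastValue;
  return function memo(compute, deps) {
    const sameDeps = lastDeps !== null &&
      deps.length === lastDeps.length &&
      deps.every((d, i) => Object.is(d, lastDeps[i]));
    if (!sameDeps) {
      lastValue = compute(); // deps changed: recompute and remember
      lastDeps = deps;
    }
    return lastValue; // deps unchanged: reuse the cached value
  };
}
```

useMemo caches the return value of compute; useCallback is the same mechanism where the cached value happens to be the function itself. Notice that the deps are compared with Object.is, the same reference semantics that bit me with the object-literal prop.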
One question kept nagging me: "Is Virtual DOM really necessary?" Svelte challenges this assumption entirely.
Svelte is a compiler, not a runtime framework. It analyzes your code at build time and generates optimized vanilla JavaScript. There's no Virtual DOM.
<!-- Svelte component -->
<script>
  let count = 0;
</script>

<button on:click={() => count++}>
  Count: {count}
</button>
When compiled, this becomes:
// Simplified compiled output
function update(value) {
  count = value;
  button.textContent = `Count: ${count}`; // Direct DOM update
}
Svelte knows exactly which DOM node to update at compile time. No diffing needed. This can be faster than Virtual DOM in many scenarios.
There's no absolute winner. Svelte wins in bundle size and raw speed. React wins in ecosystem and flexibility. For my projects, I still use React because the ecosystem helps me ship faster.
React 18 introduced Concurrent Rendering, fundamentally changing how React works. It's not just about Virtual DOM anymore.
Before React 18, state updates inside event handlers were batched, but updates in promises/setTimeout weren't.
// React 17: Causes 2 re-renders
setTimeout(() => {
  setCount(c => c + 1);
  setFlag(f => !f);
}, 1000);
// React 18: Causes 1 re-render (automatic batching)
This was a pain point for years. React 18 fixed it by batching everywhere.
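The batching idea itself is simple enough to model synchronously. This is a toy model of mine, not React's scheduler:

```javascript
// Toy model of update batching: queue state updates, render once per flush.
let renderCount = 0;
const queue = [];

function setState(updater) {
  queue.push(updater); // no render yet — just enqueue the update
}

function flushBatch(state) {
  // Apply every queued updater in order, then render exactly once.
  const next = queue.splice(0).reduce((s, fn) => fn(s), state);
  renderCount++;
  return next;
}
```

Two setState calls followed by one flush produce one render with both updates applied. React 18's change was essentially to guarantee that the flush happens once per tick no matter where the setState calls came from — event handler, promise, or setTimeout.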
import { useTransition } from 'react';
function SearchPage() {
  const [query, setQuery] = useState('');
  const [results, setResults] = useState([]);
  const [isPending, startTransition] = useTransition();

  function handleChange(e) {
    setQuery(e.target.value); // Urgent: update input immediately
    startTransition(() => {
      // Non-urgent: filter can wait
      setResults(filterResults(e.target.value));
    });
  }

  return (
    <>
      <input value={query} onChange={handleChange} />
      {isPending && <Spinner />}
      <Results data={results} />
    </>
  );
}
The input stays responsive even while filtering thousands of items. React prioritizes the input update and defers the results update.
import { useDeferredValue } from 'react';
function SearchResults({ query }) {
  const deferredQuery = useDeferredValue(query);
  // This list re-renders with the deferred value
  // React can skip intermediate updates if user types fast
  return <ExpensiveList query={deferredQuery} />;
}
This is like built-in debouncing. React intelligently skips intermediate renders if the value changes rapidly.
The React team is working on React Compiler (codename: React Forget). The goal? Automatic memoization without useMemo/useCallback.
// We have to manually optimize
const memoizedValue = useMemo(() => computeExpensiveValue(a, b), [a, b]);
const memoizedCallback = useCallback(() => doSomething(a, b), [a, b]);
This is tedious and error-prone. You might forget dependencies or over-memoize.
// You write this
function Component({ a, b }) {
  const value = computeExpensiveValue(a, b);
  const callback = () => doSomething(a, b);
  // ...
}

// Compiler automatically transforms to this
function Component({ a, b }) {
  const value = useMemo(() => computeExpensiveValue(a, b), [a, b]);
  const callback = useCallback(() => doSomething(a, b), [a, b]);
  // ...
}
The compiler analyzes your code and inserts optimizations automatically. This is similar to how Svelte works, but at a different level.
I'm excited about this. It means React could get Svelte-like performance while keeping its declarative API and ecosystem.
If I had to summarize Virtual DOM in one sentence: "A strategy to minimize expensive DOM operations by pre-calculating in memory and applying changes in one batch."
But the landscape is evolving. "Is Virtual DOM truly the best approach?" is a valid question. Svelte proves you don't need it. React Compiler shows React is moving beyond manual optimization.
Technology keeps evolving. But understanding "Why was this technology created?" helps you embrace the next one. Virtual DOM became that reference point for me.
Here's what I learned:
I wasted hours debugging performance issues that could've been avoided with proper keys and memoization. But those painful lessons taught me more than any tutorial ever could. Now when I write React code, I think about the Virtual DOM diff, the reconciliation process, and whether this component truly needs to re-render.
That shift in mindset made me a better developer. And that's the real value of understanding the fundamentals.