
Lazy Loading and Code Splitting: Cut Initial Load Time in Half
2MB bundle meant 5-second initial load. Lazy loading and code splitting load only what's needed, cutting it to 2 seconds.


As my project grew, I kept adding features one by one. Chart library, rich text editor, PDF viewer, image editor. All necessary features, so naturally I added them. Then one day, a user said, "Your site is too slow."
I opened the developer tools and was shocked. The main bundle file was 2.3MB. Initial load took 5 seconds. Users had to wait 5 seconds just to see the first screen. Even on fast internet.
The problem was clear. When users landed on the dashboard, they didn't need the PDF viewer code. If they weren't viewing the admin page, there was no reason to download admin code. But everything was packed into a single bundle. It was like carrying every item in your house in a bag just to go to the corner store.
I applied lazy loading and code splitting, changing the app to load only the necessary code at the necessary time. The result was dramatic: initial load dropped to 2 seconds, a reduction of more than half. This post documents what I understood through that process.
The first realization that hit me was this: You don't need to fetch all the code upfront. Just like Netflix streams movies instead of downloading them entirely, code can be fetched in pieces as needed.
The first application was route-based splitting. I was using Next.js, which already automatically splits bundles by page. But I was importing all pages in the main layout.
Before: Loading all pages at once

```tsx
// app/layout.tsx - Wrong approach
import Dashboard from './dashboard/page';
import AdminPanel from './admin/page';
import ReportViewer from './reports/page';

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html>
      <body>{children}</body>
    </html>
  );
}
```
I realized the problem. I was breaking Next.js's automatic code splitting. Instead of directly importing each page, I should let Next.js routing handle it.
After: Letting Next.js routing handle it

```tsx
// app/dashboard/page.tsx - Each page independent
export default function Dashboard() {
  return <div>Dashboard content</div>;
}

// app/admin/page.tsx - Split into separate bundle
export default function AdminPanel() {
  return <div>Admin panel</div>;
}
```
This alone reduced the initial bundle from 2.3MB to 800KB. Each page loaded only when needed.
Next was components. The dashboard had a large chart library. But 80% of users never opened the chart tab. Yet the chart library (600KB) was always downloaded when the dashboard loaded.
I used React.lazy(). Changed it to load the chart only when needed.
Before: Always loading the chart

```tsx
import { useState } from 'react';
import { Chart } from 'recharts';

function Dashboard() {
  const [showChart, setShowChart] = useState(false);

  return (
    <div>
      <button onClick={() => setShowChart(true)}>Show Chart</button>
      {showChart && <Chart data={data} />}
    </div>
  );
}
```
This code imports Chart even when showChart is false. It downloads 600KB even if the user never clicks the button.
After: Loading only when needed

```tsx
import { lazy, Suspense, useState } from 'react';

const Chart = lazy(() => import('recharts').then(module => ({
  default: module.Chart
})));

function Dashboard() {
  const [showChart, setShowChart] = useState(false);

  return (
    <div>
      <button onClick={() => setShowChart(true)}>Show Chart</button>
      {showChart && (
        <Suspense fallback={<div>Loading chart...</div>}>
          <Chart data={data} />
        </Suspense>
      )}
    </div>
  );
}
```
The chart library downloaded only when the button was clicked. Most users saved 600KB.
The more powerful pattern was dynamic import. Loading code only under specific conditions.
```tsx
// Load admin tools only for admins
async function loadAdminTools() {
  if (user.role === 'admin') {
    const { AdminTools } = await import('./AdminTools');
    return AdminTools;
  }
  return null;
}

// Load PDF viewer only for PDF files
async function openFile(file: File) {
  if (file.type === 'application/pdf') {
    const { PDFViewer } = await import('./PDFViewer');
    return <PDFViewer file={file} />;
  }
  return <DefaultViewer file={file} />;
}
```
This pattern was a game changer. Non-admin users never download admin code. If you don't open a PDF, you don't download the PDF viewer.
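One caveat worth noting: the native `import()` caches resolved modules, so a second call for the same path is cheap. But when you wrap loads in helper functions like the ones above, caching the promise yourself avoids duplicate in-flight requests when several components ask for the same module at once. A minimal sketch (the helper name and cache are my own, not from any library):

```typescript
// Cache each loader's promise so concurrent callers share one request.
const moduleCache = new Map<string, Promise<unknown>>();

function loadOnce<T>(key: string, loader: () => Promise<T>): Promise<T> {
  if (!moduleCache.has(key)) {
    // First caller kicks off the load; everyone else reuses the promise.
    moduleCache.set(key, loader());
  }
  return moduleCache.get(key) as Promise<T>;
}
```

Calling `loadOnce('pdf', () => import('./PDFViewer'))` from several components triggers a single download.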
Initially, I got excited and made everything lazy. Even small components. The result backfired.
```tsx
// Bad example: Split too granularly
const Button = lazy(() => import('./Button'));
const Icon = lazy(() => import('./Icon'));
const Text = lazy(() => import('./Text'));
```
The problem was waterfall requests. Loading Button, then Icon, then Text created sequential requests. Downloading 3 small files sequentially was actually slower.
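A toy latency model makes the difference concrete: sequential requests pay the round-trip cost once per file, while parallel (or bundled) loads pay it roughly once overall. The numbers below are illustrative, not measurements:

```typescript
// Sequential requests: each waits for the previous one, so latencies add up.
function sequentialLatencyMs(requests: number[]): number {
  return requests.reduce((total, ms) => total + ms, 0);
}

// Parallel requests: all in flight at once, so the slowest one dominates.
function parallelLatencyMs(requests: number[]): number {
  return Math.max(...requests);
}

// Three tiny chunks at 100ms each: 300ms loaded one by one,
// but only ~100ms when fetched together.
```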
I established a principle: group related components into one chunk.

```tsx
// Improved: Related components in one chunk
// components/ui/index.ts
export { Button } from './Button';
export { Icon } from './Icon';
export { Text } from './Text';

// Usage: each lazy import resolves from the same shared chunk,
// so the bundler emits one file instead of three
const Button = lazy(() => import('./components/ui').then(m => ({ default: m.Button })));
```
I judged which libraries were large purely by guessing. Big mistake. When I actually measured, reality differed from expectations.
I installed @next/bundle-analyzer, the Next.js wrapper around webpack-bundle-analyzer.

```bash
npm install --save-dev @next/bundle-analyzer
```

```js
// next.config.js
const withBundleAnalyzer = require('@next/bundle-analyzer')({
  enabled: process.env.ANALYZE === 'true',
});

module.exports = withBundleAnalyzer({
  // existing config
});
```

```bash
ANALYZE=true npm run build
```
The visualized results shocked me. Libraries I had assumed were large turned out to be small, while moment.js, which I had casually added, was taking up 200KB. I replaced it with date-fns, importing only the functions I needed, which cut it to 10KB.
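The swap looked roughly like this (a sketch; `format` stands in for whichever functions a project actually uses):

```typescript
// Before: a default import pulls in all of moment.js (~200KB, not tree-shakable)
import moment from 'moment';
const label = moment().format('YYYY-MM-DD');

// After: named imports from date-fns let the bundler keep only what's used
import { format } from 'date-fns';
const label2 = format(new Date(), 'yyyy-MM-dd');
```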
Lesson: optimization without measurement is gambling.

Lazy loading had a downside: the download only started when the user clicked the button. On slow networks, users stared at loading spinners.
The solution was prefetching. Start downloading before the user clicks.
```tsx
import { lazy, Suspense, useEffect, useState } from 'react';

const LazyChart = lazy(() => import('recharts').then(module => ({
  default: module.Chart
})));

function Dashboard() {
  const [showChart, setShowChart] = useState(false);

  // Prefetch 3 seconds after component mount
  useEffect(() => {
    const timer = setTimeout(() => {
      import('recharts'); // prefetch
    }, 3000);
    return () => clearTimeout(timer);
  }, []);

  return (
    <div>
      <button onClick={() => setShowChart(true)}>Show Chart</button>
      {showChart && (
        <Suspense fallback={<div>Loading...</div>}>
          <LazyChart />
        </Suspense>
      )}
    </div>
  );
}
```
The smarter approach was hover prefetch.
```tsx
function ChartButton({ onClick }: { onClick: () => void }) {
  const [prefetched, setPrefetched] = useState(false);

  const handleMouseEnter = () => {
    if (!prefetched) {
      import('recharts'); // Prefetch on hover
      setPrefetched(true);
    }
  };

  return (
    <button
      onClick={onClick}
      onMouseEnter={handleMouseEnter}
    >
      Show Chart
    </button>
  );
}
```
The download starts the moment the user hovers over the button. On average, 200-300ms pass between hover and click, and most of the download completes in that window. Users see the chart instantly.
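Bundlers offer a declarative version of the same idea. With webpack (which Next.js uses under the hood), a "magic comment" asks the browser to prefetch the chunk during idle time. This is a sketch against the same chart import; whether it fires depends on your bundler setup:

```typescript
import { lazy } from 'react';

// The webpackPrefetch magic comment makes webpack emit
// <link rel="prefetch"> for this chunk, so the browser downloads
// it at idle priority before the user ever clicks.
const LazyChart = lazy(() =>
  import(/* webpackPrefetch: true */ 'recharts').then((m) => ({
    default: m.Chart,
  }))
);
```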
It wasn't just code that needed lazy loading. Images too.
The blog list page had 30 thumbnails. All images loading at once slowed initial load. Users only see 3-4 initially. The rest require scrolling.
HTML's native lazy loading:

```html
<img
  src="/images/thumbnail.jpg"
  alt="thumbnail"
  loading="lazy"
  width="300"
  height="200"
/>
```
This alone delayed image loading until they approached the viewport. Simple and effective.
For finer control, I used Intersection Observer.
```tsx
import { useEffect, useRef, useState } from 'react';

function LazyImage({ src, alt }: { src: string; alt: string }) {
  const [isLoaded, setIsLoaded] = useState(false);
  const imgRef = useRef<HTMLImageElement>(null);

  useEffect(() => {
    const observer = new IntersectionObserver(
      (entries) => {
        entries.forEach(entry => {
          if (entry.isIntersecting) {
            setIsLoaded(true);
            observer.disconnect();
          }
        });
      },
      { rootMargin: '100px' } // Start loading 100px before the viewport
    );
    if (imgRef.current) {
      observer.observe(imgRef.current);
    }
    return () => observer.disconnect();
  }, []);

  return (
    <img
      ref={imgRef}
      src={isLoaded ? src : '/images/placeholder.jpg'}
      alt={alt}
    />
  );
}
```
In Next.js, certain components shouldn't render on the server. The map component using browser APIs was one.
```tsx
import dynamic from 'next/dynamic';

const Map = dynamic(() => import('./Map'), {
  ssr: false, // Only render on client
  loading: () => <div>Loading map...</div>
});

export default function LocationPage() {
  return (
    <div>
      <h1>Location</h1>
      <Map />
    </div>
  );
}
```
`ssr: false` skips rendering the component on the server; it loads only on the client. Essential for libraries that touch `window` or `document`.
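For individual functions (rather than whole components) that touch browser globals, a small runtime guard achieves the same safety. This is a generic sketch of my own, not something from the post:

```typescript
// True only in a browser; during SSR, `window` is undefined.
function isBrowser(): boolean {
  return typeof window !== 'undefined';
}

// Example: read the viewport width, with a server-safe fallback value.
function getViewportWidth(fallback = 1024): number {
  return isBrowser() ? window.innerWidth : fallback;
}
```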
Google Analytics, chat widgets, ad scripts. These third-party scripts hindered initial loading.
```tsx
'use client';
// app/layout.tsx
import { useEffect } from 'react';

export default function RootLayout({ children }: { children: React.ReactNode }) {
  useEffect(() => {
    // Load analytics 3 seconds after page fully loads
    const timer = setTimeout(() => {
      const script = document.createElement('script');
      script.src = 'https://www.googletagmanager.com/gtag/js?id=GA_ID';
      script.async = true;
      document.body.appendChild(script);
    }, 3000);
    return () => clearTimeout(timer);
  }, []);

  return (
    <html>
      <body>{children}</body>
    </html>
  );
}
```
Users see content first, analytics tools load later. Priorities became clear.
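Next.js also has a built-in way to express this priority: the `next/script` component with the `lazyOnload` strategy defers a script until browser idle time, without hand-rolling timers. A sketch (GA_ID is a placeholder for a real measurement ID):

```tsx
import Script from 'next/script';

// Renders nothing visible; the script is fetched at idle priority
// after the page has finished loading.
export default function Analytics() {
  return (
    <Script
      src="https://www.googletagmanager.com/gtag/js?id=GA_ID"
      strategy="lazyOnload"
    />
  );
}
```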
I measured before and after optimization with Lighthouse, and real user metrics improved alongside the scores. Bounce rate dropped from 35% to 18%. Average time on page increased from 1.2 minutes to 2.4 minutes. Fast loading translated directly into user experience.
The core of lazy loading and code splitting was simple: Fetch only what's needed, when it's needed.
It's like not piling all the food on your plate at once at a buffet. You get what you want to eat as you go. Code works the same way.
Practical guide:
- Let route-based splitting do the first pass; Next.js already splits bundles by page.
- Lazy-load heavy, rarely used components with React.lazy and Suspense.
- Don't over-split: group small related components into one chunk to avoid waterfall requests.
- Measure with a bundle analyzer before optimizing; guesses about library size are usually wrong.
- Prefetch on hover or during idle time so lazy loads feel instant.
Initial load dropped from 5 seconds to 2 seconds. Users stayed longer and the bounce rate halved. It wasn't complex technology; it was a shift in perspective: from "prepare everything upfront" to "only what's needed, when it's needed."