The 32blog.com build started slowing down as the article count grew. Vercel deployments crept past 2 minutes. Running `next build` locally before checking a change felt like waiting in line.
Bundle size was becoming an issue too. Lighthouse scores were dipping in ways that traced back to JavaScript that didn't need to be there.
This guide covers the practical techniques I used to cut Next.js build times and reduce bundle size. I'll include specific config details and real numbers where I have them.
## Start with Bundle Analyzer to Find What's Heavy
Optimization without measurement is guesswork. First, visualize what's making your bundle large.
```bash
npm install --save-dev @next/bundle-analyzer
```
```typescript
// next.config.ts
import type { NextConfig } from "next";
import withBundleAnalyzerInit from "@next/bundle-analyzer";

const withBundleAnalyzer = withBundleAnalyzerInit({
  enabled: process.env.ANALYZE === "true",
});

const nextConfig: NextConfig = {
  // your config here
};

export default withBundleAnalyzer(nextConfig);
```
```bash
# Launch the analyzer
ANALYZE=true npm run build
```
This opens an interactive treemap in your browser. Larger blocks mean larger bundle contributions. Focus on the biggest offenders first.
Common heavy dependencies you'll find:
- `moment.js` (3MB+; replace with `date-fns` or `dayjs`)
- `lodash` (when tree-shaking isn't working)
- `@mui/material` (only import what you use)
- Icon libraries (when importing the entire set)
### Quick size checks without a full build
For checking individual package sizes before installing, bundlephobia.com is invaluable — paste the package name and it shows install size, gzip size, and tree-shakeable status.
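If you prefer the terminal, the npm registry exposes a package's unpacked size directly; `dist.unpackedSize` is a standard registry metadata field, and the package names below are just examples:

```shell
# Query the registry's size metadata before adding a dependency
npm view moment dist.unpackedSize   # unpacked bytes on disk
npm view dayjs dist.unpackedSize
```

Note this is install size, not what ships to the browser; for gzip bundle cost you still want bundlephobia or the analyzer.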
## Tree-Shaking: Stop Bundling Code You Don't Use
The most effective way to cut bundle size is to stop including code that never runs.
### Switch lodash to named imports
```typescript
// Bad: bundles all of lodash (70KB+)
import _ from "lodash";
const result = _.groupBy(items, "category");

// Good: only bundles groupBy (a few KB)
import groupBy from "lodash/groupBy";
const result = groupBy(items, "category");
```
Or use `lodash-es`, which ships as ES modules and tree-shakes naturally.
```bash
npm install lodash-es
npm install --save-dev @types/lodash-es
```

```typescript
// lodash-es tree-shakes automatically
import { groupBy, sortBy } from "lodash-es";
```
### Migrate from moment.js to date-fns
If you're still using `moment.js`, migrating to `date-fns` is one of the highest-ROI changes you can make.
```bash
npm uninstall moment
npm install date-fns
```

```typescript
// moment.js — ~70KB after bundling
import moment from "moment";
const formatted = moment().format("YYYY/MM/DD");

// date-fns — only bundles what you import (a few KB)
import { format } from "date-fns";
import { enUS } from "date-fns/locale";
const formatted = format(new Date(), "yyyy/MM/dd", { locale: enUS });
```
### Icon library imports
```typescript
// Bad: imports all icons via wildcard (several MB)
import * as FaIcons from "react-icons/fa";

// Good: named imports — tree-shaking removes unused icons
import { FaGithub, FaTwitter } from "react-icons/fa";
```
`lucide-react` supports tree-shaking natively, so named imports work fine:
```typescript
// lucide-react: named imports are fine
import { Github, Twitter, ExternalLink } from "lucide-react";
```
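Since Next.js 13.5 there's also `experimental.optimizePackageImports`, which rewrites barrel-file imports into direct ones at build time. Many popular packages, `lucide-react` included, are covered by default; the entries below are illustrative:

```typescript
// next.config.ts — sketch: rewrite barrel imports to direct ones at build time
const nextConfig = {
  experimental: {
    optimizePackageImports: ["react-icons", "lodash-es"],
  },
};

export default nextConfig;
```

This gives you wildcard-free bundles even when a library's own entry point isn't tree-shake friendly.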
## Image Optimization with next/image
`next/image` handles a lot of optimization automatically, but the right configuration multiplies the impact.
### Core image config
```typescript
// next.config.ts
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  images: {
    remotePatterns: [
      {
        protocol: "https",
        hostname: "images.unsplash.com",
      },
    ],
    // Prefer AVIF, fall back to WebP
    formats: ["image/avif", "image/webp"],
    // Responsive image breakpoints
    deviceSizes: [640, 750, 828, 1080, 1200, 1920],
  },
};

export default nextConfig;
```
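The deviceSizes list only pays off when the browser knows how wide an image will actually render; the `sizes` prop supplies that hint. A sketch, where the breakpoints are assumptions about your layout:

```tsx
import Image from "next/image";

// sizes lets the browser pick the smallest sufficient deviceSizes variant:
// full-width on small screens, one of three columns on larger ones
export function ArticleThumbnail() {
  return (
    <Image
      src="/article-thumbnail.webp"
      alt="Article thumbnail"
      width={400}
      height={300}
      sizes="(max-width: 768px) 100vw, 33vw"
    />
  );
}
```

Without `sizes`, the browser assumes the image may span the full viewport and downloads a larger variant than needed.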
### Set priority on LCP images
Images in the viewport at page load should have the `priority` prop to get preloaded.
```tsx
// app/page.tsx
import Image from "next/image";

export default function HeroSection() {
  return (
    <section>
      {/* LCP image: use priority to preload */}
      <Image
        src="/hero.webp"
        alt="Hero image"
        width={1200}
        height={600}
        priority
      />
    </section>
  );
}
```

```tsx
// Images below the fold: no priority needed; lazy loading is the default
<Image
  src="/article-thumbnail.webp"
  alt="Article thumbnail"
  width={400}
  height={300}
/>
```
### Import local images for automatic sizing
```tsx
import heroImage from "@/public/hero.webp";
import Image from "next/image";

export function Hero() {
  // width/height inferred automatically from the imported image
  return <Image src={heroImage} alt="Hero" priority />;
}
```
## Turbopack for Faster Local Development
Turbopack became stable in Next.js 15. Switching local development to Turbopack makes a noticeable difference in startup time and HMR speed.
```json
{
  "scripts": {
    "dev": "next dev --turbopack",
    "build": "next build",
    "start": "next start"
  }
}
```
Note: `next build` still uses Webpack by default (Turbopack for builds is experimental). But switching just `next dev` to Turbopack is enough for a major developer-experience improvement.
Common issues when migrating to Turbopack:
```bash
# Fix a corrupted Turbopack cache
rm -rf .next
npm run dev
```

```typescript
// next.config.ts
// webpack-specific config won't apply when using Turbopack
const nextConfig = {
  webpack: (config, { isServer }) => {
    // This block won't run under Turbopack
    if (!isServer) {
      config.resolve.fallback = { fs: false };
    }
    return config;
  },
};

export default nextConfig;
```
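If you do need a Turbopack-specific equivalent, Turbopack reads its own config key in next.config (top-level `turbopack` in recent Next.js 15 releases; older 15.x versions used `experimental.turbo`). The alias below is an illustrative sketch, not a drop-in replacement for the webpack fallback above:

```typescript
// next.config.ts — Turbopack reads its own key, not `webpack`
const nextConfig = {
  turbopack: {
    resolveAlias: {
      // map Node-only modules to browser-safe stubs here as needed (illustrative)
    },
  },
};

export default nextConfig;
```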
## Rendering Strategy and Cache Configuration
Choosing the right rendering approach and being explicit about cache behavior directly affects page load performance.
### PPR (Partial Prerendering) — experimental in Next.js 15
```typescript
// next.config.ts (Next.js 15+; note that enabling ppr requires a canary release)
const nextConfig = {
  experimental: {
    ppr: true,
  },
};

export default nextConfig;
```
```tsx
// app/blog/page.tsx
import { Suspense } from "react";

export default function BlogPage() {
  return (
    <main>
      {/* Static parts render instantly */}
      <h1>Articles</h1>
      <StaticArticleList />
      {/* Dynamic parts stream in */}
      <Suspense fallback={<div>Loading...</div>}>
        <DynamicUserRecommendations />
      </Suspense>
    </main>
  );
}
```
### Be explicit about fetch cache behavior
```typescript
// lib/api.ts

// Slow-changing data (revalidate: 3600 = 1 hour)
export async function getCategories() {
  const res = await fetch("https://api.example.com/categories", {
    next: { revalidate: 3600, tags: ["categories"] },
  });
  return res.json();
}

// Frequently-changing data (revalidate: 60 = 1 minute)
export async function getLatestPosts() {
  const res = await fetch("https://api.example.com/posts?limit=10", {
    next: { revalidate: 60, tags: ["posts"] },
  });
  return res.json();
}

// Real-time data (no cache)
export async function getLivePrice() {
  const res = await fetch("https://api.example.com/price", {
    cache: "no-store",
  });
  return res.json();
}
```
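The `tags` set above become useful with on-demand revalidation: `revalidateTag` from `next/cache` purges every fetch tagged with that name, without waiting for the revalidate window. A sketch of a revalidation endpoint, where the URL shape and the `REVALIDATE_SECRET` env var are assumptions:

```typescript
// app/api/revalidate/route.ts — on-demand cache purge by tag
import { revalidateTag } from "next/cache";
import { NextRequest, NextResponse } from "next/server";

export async function POST(request: NextRequest) {
  // Shared-secret check so only your CMS webhook can purge the cache
  if (request.nextUrl.searchParams.get("secret") !== process.env.REVALIDATE_SECRET) {
    return NextResponse.json({ error: "Invalid secret" }, { status: 401 });
  }

  const { tag } = await request.json(); // e.g. { "tag": "posts" }
  revalidateTag(tag);
  return NextResponse.json({ revalidated: true, tag });
}
```

Wired to a CMS publish webhook, this lets you keep long revalidate windows while still serving fresh content immediately after edits.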
### Pre-generate dynamic routes at build time
```tsx
// app/blog/[slug]/page.tsx
export async function generateStaticParams() {
  const posts = await getAllPosts();
  return posts.map((post) => ({
    slug: post.slug,
  }));
}

// Allow on-demand generation for paths not returned above
export const dynamicParams = true;

export default async function BlogPost({
  params,
}: {
  params: Promise<{ slug: string }>;
}) {
  const { slug } = await params;
  const post = await getPost(slug);
  return <article>{/* article content */}</article>;
}
```
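Fetch-level `revalidate` also has a page-level counterpart: route-segment config. Exporting `revalidate` from a page sets a revalidation default for the whole segment (the one-hour value here is illustrative):

```typescript
// app/blog/[slug]/page.tsx — segment-level cache config
// Regenerate this page at most once per hour (ISR)
export const revalidate = 3600;
```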
## Shorten Build Times with Caching
### Separate type checking from the build
`next build` includes type checking by default. In CI, splitting them enables parallel execution.
```json
{
  "scripts": {
    "build": "next build",
    "type-check": "tsc --noEmit",
    "ci": "npm run type-check && npm run build"
  }
}
```
Run them as parallel GitHub Actions jobs:
```yaml
# .github/workflows/ci.yml
jobs:
  type-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: "20"
          cache: "npm"
      - run: npm ci
      - run: npm run type-check
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: "20"
          cache: "npm"
      - run: npm ci
      - run: npm run build
```
### Cache the Next.js build in CI
```yaml
# .github/workflows/ci.yml
- name: Cache Next.js build
  uses: actions/cache@v4
  with:
    path: |
      .next/cache
    key: ${{ runner.os }}-nextjs-${{ hashFiles('package-lock.json') }}-v1
    restore-keys: |
      ${{ runner.os }}-nextjs-${{ hashFiles('package-lock.json') }}-
```
Caching `.next/cache` in CI dramatically reduces build time when only a few files have changed. On 32blog.com, this cut CI build time from 2 minutes 30 seconds to around 50 seconds.
## Wrapping Up
Here's a summary of Next.js build optimizations ranked by impact-to-effort ratio:
| Technique | Impact | Effort |
|---|---|---|
| Bundle analyzer (measure first) | Visibility | Low |
| moment.js → date-fns | -70KB+ bundle | Medium |
| lodash named imports | -50KB+ bundle | Low |
| next/image priority on LCP | Better LCP | Low |
| Turbopack (dev only) | -70% HMR time | Low |
| Explicit fetch revalidate | Fewer unnecessary SSR calls | Medium |
| CI build cache | -60% CI build time | Medium |
| generateStaticParams | Better TTFB | Medium |
Never optimize without measuring first. Start with the bundle analyzer, find the biggest items, and fix them in order of size. Low-effort, high-impact wins first — Turbopack and CI caching are good starting points since they're quick to set up and the improvement is immediately visible.