Motivation
I needed a blog template, and Vercel had the best ones. I wanted to host it on GitHub Pages; I could have hosted it on Vercel, but I just didn't want to. Of course, one could avoid all of this friction by picking a template designed for a static generator like Jekyll or Hugo. But I didn't want to do that either. I liked the features and aesthetic of the dynamic template, and I was curious to see if it could be tamed.
This guide is the product of that process. It's a technical walkthrough for any developer who wants to convert a dynamic, server-centric Next.js application into a static site for a host like GitHub Pages. It outlines the necessary steps to re-architect server-dependent features, manage dependencies, and configure a seamless, automated deployment pipeline.
1. The Core Challenge: Dynamic vs. Static Architecture
A dynamic host, such as Vercel, runs a live Node.js server. This allows a Next.js application to perform server-side rendering (SSR), where pages are generated on-demand for each request. It can also run API routes, which are server-side functions that can execute code, query databases, and generate dynamic content like RSS feeds or social media preview images.
A static host, like GitHub Pages, does not run a server. It simply serves pre-built HTML, CSS, and JavaScript files. This makes it incredibly fast, secure, and cost-effective, but it means that any operation requiring a server will fail.
To bridge this gap, we must instruct Next.js to pre-build the entire application into a collection of static files. This is achieved by adding the output: 'export' property to the next.config.js file.
Action: Modify `next.config.js` to include the `output: 'export'` property.

```js
/** @type {import('next').NextConfig} */
const nextConfig = {
  output: 'export',
}

module.exports = nextConfig
```
This single line is the catalyst for the entire process. It shifts Next.js's mode of operation from running a server to producing a static /out directory, which enforces a static-only architecture and reveals any server-dependent code through build errors.
2. Re-architecting for a Static Environment
With static export enabled, the build process will now act as our guide, throwing errors for every piece of code that is incompatible with a static environment.
2.1. Eliminating Dynamic API Routes
The first errors were related to dynamic API routes used for generating assets like RSS feeds and sitemaps. These routes are server-side functions and cannot run in a static build.
Action: Delete the dynamic route files.
- `app/rss/route.ts`
- `app/og/route.tsx`
- `app/sitemap.ts`
- `app/robots.ts`
Action: Create their static equivalents in the `public` directory. Anything in this folder is served as-is. For example, `public/robots.txt` contained:

```
User-agent: *
Allow: /
Sitemap: https://ossa-ma.github.io/sitemap.xml
```
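The old dynamic sitemap route can likewise be replaced by a hand-written static file. A minimal `public/sitemap.xml` along these lines would work (the blog post URL here is illustrative, not from the actual site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://ossa-ma.github.io/</loc>
  </url>
  <url>
    <loc>https://ossa-ma.github.io/blog/example-post</loc>
  </url>
</urlset>
```

The tradeoff is manual upkeep: new posts have to be added by hand (or by a small build script) rather than generated on request.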
2.2. Migrating the MDX Pipeline to Contentlayer
The next build error was the most difficult to resolve. The template used a library, next-mdx-remote, that caused a recurring dependency conflict:

```
A React Element from an older version of React was rendered
```

This error indicated a deep React version mismatch that could not be easily fixed.
Solution: The entire content handling pipeline was migrated to Contentlayer, a modern library that transforms content into type-safe JSON data during the build process.
The migration involved:
- Removing `next-mdx-remote`.
- Installing `contentlayer` and `next-contentlayer`.
- Creating a `contentlayer.config.ts` to define the schema for our posts (`title`, `publishedAt`, etc.).
- Wrapping `next.config.js` with the `withContentlayer` wrapper function.
- Updating all data-fetching logic from the old system to the new, much simpler Contentlayer API.
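To preview where that last step lands: with Contentlayer, data fetching collapses to importing a generated array and sorting it. A sketch of the newest-first helper follows; `allPosts` is stubbed with sample data here so it runs standalone, whereas in the real app it is imported from `contentlayer/generated`.

```typescript
// In the real app: import { allPosts } from 'contentlayer/generated'
// Stubbed here with sample data so the helper is self-contained.
type Post = { title: string; publishedAt: string; slug: string }

const allPosts: Post[] = [
  { title: 'Older post', publishedAt: '2023-01-01', slug: 'older-post' },
  { title: 'Newer post', publishedAt: '2024-06-01', slug: 'newer-post' },
]

// Newest-first list for the blog index page.
function getSortedPosts(posts: Post[] = allPosts): Post[] {
  return [...posts].sort(
    (a, b) => new Date(b.publishedAt).getTime() - new Date(a.publishedAt).getTime()
  )
}
```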
Action: Replace the dependencies.
```shell
pnpm remove next-mdx-remote sugar-high
pnpm add contentlayer next-contentlayer rehype-pretty-code shiki
pnpm add -D unified
```
Action: Create `contentlayer.config.ts` to define the post schema.

```ts
import { defineDocumentType, makeSource } from 'contentlayer/source-files'

export const Post = defineDocumentType(() => ({
  name: 'Post',
  filePathPattern: `**/*.mdx`,
  contentType: 'mdx',
  fields: {
    title: { type: 'string', required: true },
    publishedAt: { type: 'date', required: true },
    summary: { type: 'string' },
    image: { type: 'string' },
  },
  computedFields: {
    slug: {
      type: 'string',
      resolve: (doc) => doc._raw.flattenedPath,
    },
  },
}))

export default makeSource({
  contentDirPath: 'app/blog/posts',
  documentTypes: [Post],
})
```
Action: Wrap the `next.config.js` export with `withContentlayer`.

```js
const { withContentlayer } = require('next-contentlayer')

/** @type {import('next').NextConfig} */
const nextConfig = {
  output: 'export',
}

module.exports = withContentlayer(nextConfig)
```
Action: Update `tsconfig.json` to recognize the generated types.

```jsonc
{
  "compilerOptions": {
    // ...
    "paths": {
      "contentlayer/generated": ["./.contentlayer/generated"]
    }
  },
  "include": [
    // ...
    ".contentlayer/generated"
  ]
}
```
2.3. Client vs. Server Component Architecture
The migration to Contentlayer introduced a new, more subtle architectural error: `cannot use both "use client" and export function "generateStaticParams()"`.
This error highlights a core concept of the Next.js App Router.
- Server Components run exclusively on the server (or at build time). They are ideal for data fetching and can use functions like `generateStaticParams` to tell Next.js which pages to pre-build.
- Client Components are interactive and run in the user's browser. They require the `'use client'` directive at the top of the file and can use client-side hooks like `useState` or, in this case, `useMDXComponent`.
The error occurred because the blog post page was trying to be both at the same time.
- Solution: The page was refactored to separate its client and server concerns. The main page file (`app/blog/[slug]/page.tsx`) was kept as a Server Component to handle data fetching and `generateStaticParams`. The part that actually renders the MDX content, which requires the client-side `useMDXComponent` hook, was extracted into its own dedicated Client Component (`app/components/mdx-client.tsx`).
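The Server Component half of that split can be sketched as follows. The `allPosts` array is stubbed with sample data here so the logic runs standalone; in the real `app/blog/[slug]/page.tsx` it is imported from `contentlayer/generated`.

```typescript
// The real page file has NO 'use client' directive,
// so Next.js treats it as a Server Component.
type Post = { slug: string; title: string }

// Stub for Contentlayer's generated allPosts array.
const allPosts: Post[] = [
  { slug: 'hello-world', title: 'Hello World' },
  { slug: 'taming-nextjs', title: 'Taming Next.js' },
]

// Called at build time; with output: 'export', every returned slug
// becomes a pre-rendered static HTML page.
function generateStaticParams(): Array<{ slug: string }> {
  return allPosts.map((post) => ({ slug: post.slug }))
}
```

The rendering of the MDX body itself happens in the separate Client Component, which receives the post's compiled code as a plain prop.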
3. Automating Deployment with GitHub Actions
To streamline the deployment process, a GitHub Actions workflow was created at .github/workflows/deploy.yml. This workflow automates all the steps required to get the site live on every push to the main branch.
The workflow performs the following sequence:
- Checks out the code from the repository.
- Sets up the environment with the correct versions of Node.js and pnpm.
- Installs dependencies using `pnpm install`.
- Builds the static site using `pnpm run build`, which generates the `/out` directory.
- Deploys the static files from `/out` to a dedicated `gh-pages` branch using the `peaceiris/actions-gh-pages` action.
To allow the workflow to push to the `gh-pages` branch, it must be granted write permissions:

```yaml
permissions:
  contents: write
```
This block is critical: on many repositories the default `GITHUB_TOKEN` is read-only, so without it the workflow cannot push the built code to the `gh-pages` branch.
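Putting it together, a `deploy.yml` along these lines implements the sequence above. This is a sketch, not the exact workflow from the repository; the action versions and the Node/pnpm versions are illustrative.

```yaml
name: Deploy to GitHub Pages

on:
  push:
    branches: [main]

permissions:
  contents: write

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: pnpm/action-setup@v4
        with:
          version: 9

      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: pnpm

      - run: pnpm install

      - run: pnpm run build

      # Pushes the contents of /out to the gh-pages branch.
      - uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./out
```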
4. Configuring the GitHub Pages Environment
The final step is to tell GitHub Pages where to find the website.
- Solution: In the repository's Settings > Pages, under "Build and deployment", the source was configured to "Deploy from a branch". The branch was set to `gh-pages` with the `/ (root)` folder.
This tells GitHub Pages to skip any build process of its own and simply serve the pre-built static files that our GitHub Actions workflow has already placed in the `gh-pages` branch. This separation of concerns — building in the workflow, serving from the branch — is a robust pattern for static deployments.