Adding a Dynamic Sitemap to a Next.js 13 App

Eric David Smith
Software Engineer / Musician / Entrepreneur

Learn how to add a dynamic sitemap (including MDX files) to a Next.js 13 app in this step-by-step tutorial.

When it comes to SEO, having an XML sitemap is a must. A sitemap is a file that lists all the pages on your website, and it helps search engines like Google and Bing crawl your site more efficiently.

In this tutorial, we'll learn how to add a dynamic sitemap to a Next.js 13 app.

Next.js 13 introduced a new Metadata API that lets us generate SEO-related files and tags directly from the app directory. We'll use this API to create a dynamic sitemap.
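The same Metadata API also covers page-level metadata: you export a metadata object (or a generateMetadata function) from a layout or page in the app directory. Here's a minimal sketch, where the title and description are placeholders:

// app/layout.tsx — static metadata via the Metadata API (values are placeholders)
import type { Metadata } from "next";

export const metadata: Metadata = {
  title: "My Blog",
  description: "Posts about Next.js and more",
};

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang="en">
      <body>{children}</body>
    </html>
  );
}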

MDX Files

In my case, I have a blog with a lot of posts written in local MDX (Markdown) files. I don't want to add each post to the sitemap by hand; I want to generate the sitemap dynamically from the files in my blog directory.
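For reference, my blog directory is organized by category, with one MDX file per post, roughly like this (your layout may differ):

blog/
  about/
    uncovering-the-life-of-eric-david-smith.mdx
  artificial-intelligence/
    awesome-ai.mdx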

Prerequisites

To follow along, you'll need Node.js 16.8 or later and a Next.js 13 project that uses the app directory.

Getting Started

Let's start by creating a new Next.js app. Make sure the project uses TypeScript and the app directory, since that's what the rest of this tutorial relies on.

npx create-next-app nextjs-sitemap

Next, let's create a new file inside our app directory called app/sitemap.ts. This is where we'll build our dynamic sitemap.

cd nextjs-sitemap/app
touch sitemap.ts

Inside app/sitemap.ts, let's add the following code:

import { MetadataRoute } from "next";
import fs from "fs";
import path from "path";

export default function sitemap(): MetadataRoute.Sitemap {
  // Path to the directory containing your MDX files
  const blogDirectory = path.join(process.cwd(), "blog"); // your blog directory may be different

  // Retrieve all MDX file paths recursively
  const mdxFilePaths = getAllMdxFilePaths(blogDirectory);

  // Generate URLs and add them to the sitemap
  const sitemap = mdxFilePaths.map((filePath) => {
    const slug = path.basename(filePath, ".mdx"); // remove the .mdx extension from the file name to get the slug
    const category = path.basename(path.dirname(filePath));
    const url = `https://ericdavidsmith.com/blog/${category}/${slug}`;
    const lastModified = fs.statSync(filePath).mtime;
    return {
      url,
      lastModified,
    };
  });

  // Add other URLs to the sitemap
  sitemap.push(
    {
      url: "https://ericdavidsmith.com",
      lastModified: new Date(),
    }
    // Add other URLs here
  );

  return sitemap;
}

// Recursively retrieve all MDX file paths
function getAllMdxFilePaths(directory: string): string[] {
  const fileNames = fs.readdirSync(directory);
  const filePaths = fileNames.map((fileName) => {
    const filePath = path.join(directory, fileName);
    const stat = fs.statSync(filePath);
    if (stat.isDirectory()) {
      return getAllMdxFilePaths(filePath);
    } else {
      return filePath;
    }
  });

  // Flatten the nested arrays and keep only .mdx files
  return filePaths.flat().filter((filePath) => filePath.endsWith(".mdx"));
}

Let's break down what's happening here. getAllMdxFilePaths walks the blog directory recursively and returns the path of every .mdx file it finds. For each file, the slug comes from the file name and the category from its parent folder, and together they form the post's URL. The file's last modified time (mtime) becomes the lastModified value for that entry. Finally, we push any extra URLs (like the home page) onto the array and return it, and Next.js serves the result as XML.
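As a concrete example, here's how one post maps through that logic (using a path from my own blog; swap in your own):

import path from "path";

const filePath = "blog/artificial-intelligence/awesome-ai.mdx";
const slug = path.basename(filePath, ".mdx");            // "awesome-ai"
const category = path.basename(path.dirname(filePath));  // "artificial-intelligence"

console.log(`https://ericdavidsmith.com/blog/${category}/${slug}`);
// -> https://ericdavidsmith.com/blog/artificial-intelligence/awesome-ai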

Note that we don't need to touch next.config.js at all: because app/sitemap.ts follows the Metadata API file convention, Next.js picks it up automatically and serves the generated sitemap at /sitemap.xml.

Test it Out

Let's test out our sitemap by running the following command:

npm run dev

Then, open your browser and navigate to http://localhost:3000/sitemap.xml. You should see something like this:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://ericdavidsmith.com/blog/about/uncovering-the-life-of-eric-david-smith</loc>
    <lastmod>2023-07-10T14:22:10.081Z</lastmod>
  </url>
  <url>
    <loc>https://ericdavidsmith.com/blog/artificial-intelligence/awesome-ai</loc>
    <lastmod>2023-07-10T14:58:45.568Z</lastmod>
  </url>
</urlset>

Adding the Sitemap to the Robots.txt File

Now that we have our sitemap, let's add it to our robots.txt file.

In Next.js 13, we can generate a robots.txt file by adding a robots.ts file to our app directory. Let's create a new file called app/robots.ts and add the following code:

import { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*", // apply these rules to all crawlers
      allow: "/",     // allow crawling of the entire site
    },
    sitemap: "https://ericdavidsmith.com/sitemap.xml", // where crawlers can find the sitemap
  };
}

Let's break down what's happening here. The robots function returns a rules object that allows every user agent to crawl the entire site, along with the absolute URL of the sitemap we just created. Next.js turns this into a robots.txt file and serves it at /robots.txt.
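If you need more control later, the same file can also express disallow rules. A minimal sketch, assuming a hypothetical /private/ path you want to keep out of search results:

import { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
      disallow: "/private/", // hypothetical path to exclude from crawling
    },
    sitemap: "https://ericdavidsmith.com/sitemap.xml",
  };
}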

Test it Out

Let's test out our robots.txt file by running the following command:

npm run dev

Then, open your browser and navigate to http://localhost:3000/robots.txt. You should see something like this:

User-agent: *
Allow: /
Sitemap: https://ericdavidsmith.com/sitemap.xml

Now What?

Now that we have our sitemap and robots.txt file, we can submit the sitemap to Google Search Console and all of our SEO dreams will come true! 🤣

Supporting My Work

Please consider Buying Me A Coffee. I work hard to bring you my best content and any support would be greatly appreciated. Thank you for your support!
