
Learn how to add a dynamic sitemap (including MDX files) to a Next.js 13 app in this step-by-step tutorial.
When it comes to SEO, having an XML sitemap is a must. A sitemap is a file that lists all the pages on your website. It helps search engines like Google and Bing crawl your website more efficiently.
In this tutorial, we'll learn how to add a dynamic sitemap to a Next.js 13 app.
Next.js 13 introduced a new Metadata API that lets us generate SEO files like sitemaps and robots.txt straight from the app directory. We'll use this API to create a dynamic sitemap.
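For context, the simplest sitemap you can build with this API is a file that returns a hardcoded array of URLs. Here's a minimal sketch (the URLs are placeholders):
// app/sitemap.ts: a minimal, static version
import { MetadataRoute } from "next";

export default function sitemap(): MetadataRoute.Sitemap {
  return [
    { url: "https://example.com", lastModified: new Date() },
    { url: "https://example.com/about", lastModified: new Date() },
  ];
}
That works fine for a handful of static pages, but not for a blog that keeps growing.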
In my case, I have a blog with a lot of posts written in Markdown - mdx local files. I don't want to manually add each post to the sitemap. I want to generate the sitemap dynamically based on the files in my blog directory.
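For this tutorial, assume the MDX files are organized with one folder per category, something like this (the top-level folder name is illustrative; the category folder and the file name will become part of each URL):
blog/
  about/
    uncovering-the-life-of-eric-david-smith.mdx
  artificial-intelligence/
    awesome-ai.mdx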
Let's start by creating a new Next.js app.
npx create-next-app nextjs-sitemap
Next, let's create a new file called app/sitemap.ts inside the app directory. This is where we'll create our dynamic sitemap.
cd nextjs-sitemap/app
touch sitemap.ts
Inside app/sitemap.ts, let's add the following code:
import { MetadataRoute } from "next";
import fs from "fs";
import path from "path";
export default function sitemap(): MetadataRoute.Sitemap {
  // Path to the directory containing your MDX files
  // (adjust "blog" to wherever your MDX files live)
  const blogDirectory = path.join(process.cwd(), "blog");

  // Recursively collect every MDX file path in the blog directory
  const mdxFilePaths = getAllMdxFilePaths(blogDirectory);

  const sitemap = mdxFilePaths.map((filePath) => {
    const slug = path.basename(filePath, ".mdx"); // file name without the extension
    const category = path.basename(path.dirname(filePath)); // parent folder name
    const url = `https://ericdavidsmith.com/blog/${category}/${slug}`;
    const lastModified = fs.statSync(filePath).mtime;
    return {
      url,
      lastModified,
    };
  });

  // Add the home page to the sitemap as well
  sitemap.push({
    url: "https://ericdavidsmith.com",
    lastModified: new Date(),
  });

  return sitemap;
}

// Recursively retrieves all file paths under the given directory
function getAllMdxFilePaths(directory: string): string[] {
  const fileNames = fs.readdirSync(directory);
  const filePaths = fileNames.map((fileName) => {
    const filePath = path.join(directory, fileName);
    const stat = fs.statSync(filePath);
    if (stat.isDirectory()) {
      // Recurse into subdirectories (e.g. one folder per category)
      return getAllMdxFilePaths(filePath);
    } else {
      return filePath;
    }
  });
  // Flatten the nested arrays into a single array of file paths
  return ([] as string[]).concat(...filePaths);
}
Let's break down what's happening here.
First, we're importing the MetadataRoute type from Next.js. This type is used to define the shape of the sitemap.
Next, we're importing the fs and path modules from Node.js. We'll use these modules to read the files in our blog directory.
Then, we're exporting a function called sitemap that returns a MetadataRoute.Sitemap type.
Inside the sitemap function, we're defining a variable called blogDirectory that contains the path to our blog directory. We then call getAllMdxFilePaths to get an array of all the MDX file paths in our blog directory. Next, we map over the mdxFilePaths array and generate a URL for each file from its category folder and file name, along with a last modified date taken from the file's mtime. We also push the home page onto the sitemap array. Finally, getAllMdxFilePaths is a helper function that recursively retrieves all the MDX file paths in our blog directory.
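To make that concrete, with the example layout from earlier the sitemap function would return an array along these lines (the dates are illustrative; they come from each file's mtime):
[
  {
    url: "https://ericdavidsmith.com/blog/about/uncovering-the-life-of-eric-david-smith",
    lastModified: new Date("2023-07-10T14:22:10.081Z"), // the file's mtime
  },
  {
    url: "https://ericdavidsmith.com/blog/artificial-intelligence/awesome-ai",
    lastModified: new Date("2023-07-10T14:58:45.568Z"),
  },
  {
    url: "https://ericdavidsmith.com",
    lastModified: new Date(), // when the sitemap was generated
  },
]
Next.js takes care of serializing this array into the XML format search engines expect.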
Now that we have our sitemap function, Next.js will serve it automatically at /sitemap.xml; there's nothing to add to next.config.js.
Let's test out our sitemap by running the following command:
npm run dev
Then, open your browser and navigate to http://localhost:3000/sitemap.xml. You should see something like this:
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://ericdavidsmith.com/blog/about/uncovering-the-life-of-eric-david-smith</loc>
<lastmod>2023-07-10T14:22:10.081Z</lastmod>
</url>
<url>
<loc>https://ericdavidsmith.com/blog/artificial-intelligence/awesome-ai</loc>
<lastmod>2023-07-10T14:58:45.568Z</lastmod>
</url>
</urlset>
Now that we have our sitemap, let's add it to our robots.txt file.
In Next.js 13, we can generate a robots.txt file by adding a robots.ts file to our app directory. Let's create a new file called app/robots.ts and add the following code:
import { MetadataRoute } from "next";
export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
    },
    sitemap: "https://ericdavidsmith.com/sitemap.xml",
  };
}
Let's break down what's happening here.
First, we're importing the MetadataRoute type from Next.js. This type is used to define the shape of the robots.txt file.
Next, we're exporting a function called robots that returns a MetadataRoute.Robots type.
Inside the robots function, we're returning an object with two properties: rules and sitemap. The rules property contains an object with two properties: userAgent and allow. The userAgent property contains a string with the value of *. This tells search engines that the rules apply to all user agents. The allow property contains a string with the value of /. This tells search engines that all pages are allowed. The sitemap property contains a string with the value of https://ericdavidsmith.com/sitemap.xml. This tells search engines where to find the sitemap.
Let's test out our robots.txt file by running the following command:
npm run dev
Then, open your browser and navigate to http://localhost:3000/robots.txt. You should see something like this:
User-agent: *
Allow: /
Sitemap: https://ericdavidsmith.com/sitemap.xml
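If you have routes you'd rather keep out of search results, such as API routes or an admin area, the rules object also accepts a disallow field. Here's a sketch with placeholder paths:
import { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
      // Example paths only; adjust to your own routes
      disallow: ["/api/", "/admin/"],
    },
    sitemap: "https://ericdavidsmith.com/sitemap.xml",
  };
}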
Now that we have our sitemap and robots.txt file, we can submit them to Google Search Console and all of your SEO dreams will come true! 🤣
Please consider Buying Me A Coffee. I work hard to bring you my best content and any support would be greatly appreciated. Thank you for your support!