How to Create a Sitemap for Your Next.js Static Blog

If you have followed our previous articles on building a static blog with Next.js, you may be interested in improving your blog's SEO. One important step is creating two files: a sitemap (sitemap.xml) and robots.txt, the standard file that controls how robots access your website.

A sitemap is a file that lists all the pages on your website and helps search engines understand your website's structure.

By providing this file, you help search engines discover every page on your website, which can improve your site's visibility and ranking in search results.
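For reference, a minimal sitemap.xml looks something like the following; the URL and date here are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://your-website-url.com/</loc>
    <lastmod>2023-01-01</lastmod>
  </url>
</urlset>

Each page on the site gets its own url entry. You will not write this by hand; next-sitemap generates it for you.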

On the other hand, robots.txt, the standard file that limits robots' access to your website, tells search engines which pages or sections of your website should not be crawled or included in their search results.

This file is important for controlling how search engines access and index your website. By using this file, you can prevent sensitive pages such as login pages, admin pages, or pages containing personal information from being crawled.

You can also use this file to specify which crawlers are allowed to access your website and which paths they should skip, as the sketch below shows.
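As a rough illustration, a robots.txt that lets every crawler in but blocks two hypothetical private paths might look like this:

User-agent: *
Disallow: /admin/
Disallow: /login/

Sitemap: https://your-website-url.com/sitemap.xml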

In this article, we will use the next-sitemap plugin to create the sitemap.xml and robots.txt files for your Next.js static blog. So follow the steps below.

First, install the next-sitemap package using the command below.

pnpm add next-sitemap

Then create a next-sitemap.config.js file in the root directory of your Next.js project and add the following code to it.

/** @type {import('next-sitemap').IConfig} */
module.exports = {
  siteUrl: 'https://your-website-url.com',
  generateRobotsTxt: true,
}

Make sure to replace your-website-url.com with your website's actual address. The generateRobotsTxt option tells next-sitemap to generate the robots.txt file along with the sitemap.xml.
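If you need finer control, next-sitemap also accepts options such as exclude and robotsTxtOptions. The sketch below assumes a hypothetical /admin section you want to keep out of both files:

/** @type {import('next-sitemap').IConfig} */
module.exports = {
  siteUrl: 'https://your-website-url.com',
  generateRobotsTxt: true,
  // Leave these paths out of the sitemap (example path)
  exclude: ['/admin'],
  robotsTxtOptions: {
    policies: [
      { userAgent: '*', allow: '/' },
      // Tell crawlers to skip the example path as well
      { userAgent: '*', disallow: '/admin' },
    ],
  },
}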

Next, add the following code to your next.config.js file. The trailingSlash option makes Next.js append a trailing slash (/) to your page URLs, so the URLs in the sitemap stay consistent and /about and /about/ are not treated as two different pages.

module.exports = {
  trailingSlash: true,
}
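With this option enabled, next export writes each page as a directory containing an index.html, which is what makes the trailing-slash URLs resolve cleanly on static hosts. For a hypothetical about page:

out/
  index.html          # served at /
  about/
    index.html        # served at /about/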

Next, update your package.json file with the following scripts.

{
  "scripts": {
    "dev": "next dev",
    "build": "next build",
    "postbuild": "npm run export",
    "lint": "next lint",
    "preexport": "next-sitemap",
    "export": "next export"
  }
}

These scripts chain together: pnpm run build first runs next build; the postbuild hook then runs the export script, whose preexport hook invokes next-sitemap before next export writes the static pages to the out directory.
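In other words, when you run the build, the commands execute in this order:

next build
→ postbuild: pnpm run export
  → preexport: next-sitemap   (generates sitemap.xml and robots.txt in public/)
  → next export               (copies public/ into out/)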

Run the following command to build your Next.js project and generate the static files of your project:

pnpm run build

This creates the sitemap.xml and robots.txt files in the public directory of your project, and next export then copies them into the out directory along with the rest of your static files. Note that next-sitemap may split the sitemap into chunks (sitemap-0.xml and so on), with sitemap.xml acting as an index.
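With the configuration above, the generated robots.txt should look roughly like this:

# *
User-agent: *
Allow: /

# Host
Host: https://your-website-url.com

# Sitemaps
Sitemap: https://your-website-url.com/sitemap.xml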

Since these files are regenerated on every build, you can have Git ignore them by adding the following lines to your .gitignore file. The sitemap*.xml pattern also matches the chunked files such as sitemap-0.xml.

**/robots.txt
**/sitemap*.xml

And that's it. By following these simple steps, you have created robots.txt and sitemap.xml files for your Next.js static blog that will help search engines crawl and index your website more efficiently.