Robots and sitemap (Next.js-native)
This tutorial shows how to add robots.txt and sitemap.xml using the Next.js App Router special files robots.ts and sitemap.ts. No extra package is required; Next.js serves them at /robots.txt and /sitemap.xml with the correct headers, which helps search engines discover and crawl your pages.
Why robots.txt and sitemap matter for SEO
robots.txt tells crawlers which paths they can request and where your sitemap is. sitemap.xml lists URLs with optional last modified date, change frequency, and priority so crawlers can index your site more efficiently.
A) Create src/app/robots.ts
Export a default function that returns a MetadataRoute.Robots object. Use your site URL from an environment variable for the sitemap and host fields:
import type { MetadataRoute } from "next";
export default function robots(): MetadataRoute.Robots {
  const siteUrl = process.env.NEXT_PUBLIC_SITE_URL || "https://ck444.game";
  return {
    rules: [
      { userAgent: "*", allow: "/" },
    ],
    sitemap: `${siteUrl}/sitemap.xml`,
    host: siteUrl,
  };
}

This allows all user agents to crawl / and points them to your sitemap. You can add more rules to disallow specific paths if needed, as in the sketch below.
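For example, here is a minimal sketch of a robots.ts with an extra disallow rule, assuming hypothetical /api/ and /admin/ paths that you do not want crawled (adjust the paths and user agents to your own project):

import type { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  const siteUrl = process.env.NEXT_PUBLIC_SITE_URL || "https://ck444.game";
  return {
    rules: [
      // Allow everything by default, but keep crawlers out of the
      // hypothetical /api/ and /admin/ paths (assumptions for this sketch).
      { userAgent: "*", allow: "/", disallow: ["/api/", "/admin/"] },
    ],
    sitemap: `${siteUrl}/sitemap.xml`,
    host: siteUrl,
  };
}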
B) Create src/app/sitemap.ts
Export a default function that returns a MetadataRoute.Sitemap array. Each entry has a required url plus optional lastModified, changeFrequency, and priority:
import type { MetadataRoute } from "next";
export default function sitemap(): MetadataRoute.Sitemap {
  const siteUrl = process.env.NEXT_PUBLIC_SITE_URL || "https://ck444.game";
  const now = new Date();
  return [
    {
      url: siteUrl,
      lastModified: now,
      changeFrequency: "daily",
      priority: 1,
    },
    // Add more URLs (e.g. /tutorial, /tutorial/nextjs-setup, ...)
  ];
}

For a multi-page site, build the array from your routes or CMS (e.g. map over tutorial slugs), as in the sketch below. changeFrequency can be "always", "hourly", "daily", "weekly", "monthly", "yearly", or "never". priority is a number from 0 to 1 (the homepage is often 1).
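A minimal sketch, assuming a hypothetical tutorialSlugs array (in a real project you would read the slugs from your filesystem, CMS, or route config):

import type { MetadataRoute } from "next";

// Hypothetical list of tutorial slugs; replace with your own data source.
const tutorialSlugs = ["nextjs-setup", "robots-and-sitemap"];

export default function sitemap(): MetadataRoute.Sitemap {
  const siteUrl = process.env.NEXT_PUBLIC_SITE_URL || "https://ck444.game";
  const now = new Date();
  return [
    { url: siteUrl, lastModified: now, changeFrequency: "daily", priority: 1 },
    // One entry per tutorial page, built from the slug list.
    ...tutorialSlugs.map((slug) => ({
      url: `${siteUrl}/tutorial/${slug}`,
      lastModified: now,
      changeFrequency: "weekly" as const,
      priority: 0.7,
    })),
  ];
}

The "as const" keeps changeFrequency narrowed to the literal value that MetadataRoute.Sitemap expects, since the map callback is not contextually typed by the return type.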
Next steps
After robots and sitemap, add performance and security headers in next.config, and optionally a PWA manifest and a custom 404 page. Back to all tutorials.