A robots.txt file tells search engine bots which pages or directories they may crawl and which to avoid.
A misconfigured robots.txt can block crawlers from reaching important pages; those pages may then drop out of search results or never appear at all, costing you visibility and traffic. A typical file looks like this:
User-agent: *                              # the rules below apply to all crawlers
Disallow: /admin/                          # keep the admin area out of crawls
Disallow: /private/                        # keep private content out of crawls
Sitemap: https://example.com/sitemap.xml   # tell crawlers where the sitemap lives
Avoid blocking important URLs or your sitemap.xml file, use Disallow only when it is genuinely needed, and test every change with the robots.txt report in Google Search Console. You can also sanity-check a draft locally before publishing it, as in the sketch below.
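
As a minimal sketch of such a local check, Python's standard-library urllib.robotparser can parse a draft robots.txt and report which URLs it would allow or block. It assumes the example rules above; the example.com URLs being spot-checked are hypothetical and should be replaced with pages you actually care about.

from urllib.robotparser import RobotFileParser

# Draft rules to sanity-check before uploading (mirrors the example above).
draft_rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(draft_rules.splitlines())  # parse the draft locally, no network request needed

# Hypothetical URLs to spot-check against the draft rules.
for url in (
    "https://example.com/",
    "https://example.com/blog/robots-txt-guide",
    "https://example.com/admin/login",
    "https://example.com/private/report.pdf",
):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(f"{verdict:7}  {url}")

print("Sitemaps:", parser.site_maps())  # listing sitemaps requires Python 3.8+

A check like this only verifies your own Disallow logic; it does not replace testing in Google Search Console, which reflects how Googlebot actually interprets the live file.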