Learn how robots.txt can unintentionally block search engines.
Pages disallowed in robots.txt will not be crawled, even if they're listed in your sitemap. Because search engines can't read their content, these pages usually drop out of results or appear only as bare URLs with no description.
User-agent: *
Disallow: /private/
Update your robots.txt file so important pages can be crawled, and avoid disallowing an entire directory unless you really intend to block everything inside it.
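You can check the effect of a rule before deploying it. This sketch uses Python's standard-library robots.txt parser against the rules shown above; the example.com URLs are placeholders.

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules from the example above.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

# Anything under /private/ is blocked for all well-behaved crawlers...
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
# ...while the rest of the site remains crawlable.
print(parser.can_fetch("*", "https://example.com/about.html"))  # True
```

Running a check like this against every URL in your sitemap is a quick way to catch a Disallow rule that silently covers pages you want indexed.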