
robots.txt Basics

robots.txt is a plain-text file placed at the root of your site (e.g., https://example.com/robots.txt) that tells search engine bots which pages or folders they may crawl and which to avoid.

Impact on SEO

A misconfigured robots.txt can block search engines from crawling important pages, leaving them invisible in search results and costing you traffic.
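
For example, a single overly broad rule, often left over from a staging environment, shuts every compliant crawler out of the whole site (the snippet below is illustrative):

User-agent: *
Disallow: /

With this in place, bots stop fetching every page, and the site gradually disappears from search results.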

Example

# Apply the rules below to all crawlers
User-agent: *
# Keep bots out of the admin and private areas
Disallow: /admin/
Disallow: /private/

# Tell crawlers where to find the sitemap
Sitemap: https://example.com/sitemap.xml

How to Fix

Avoid blocking important URLs or your sitemap.xml file. Use Disallow only when absolutely necessary, and always test changes in Google Search Console.
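
Beyond Google Search Console, you can sanity-check a candidate file locally before deploying it. The sketch below uses Python's standard urllib.robotparser module; the rules and URLs are placeholders for illustration, not part of any TrackMySitemap tooling:

from urllib.robotparser import RobotFileParser

# Candidate robots.txt rules to verify before uploading (illustrative)
rules = """User-agent: *
Disallow: /admin/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# URLs that must stay crawlable, and paths that should be blocked
checks = [
    ("https://example.com/", True),
    ("https://example.com/pricing", True),
    ("https://example.com/admin/login", False),
    ("https://example.com/private/report.pdf", False),
]

for url, expected in checks:
    allowed = parser.can_fetch("*", url)
    verdict = "OK" if allowed == expected else "PROBLEM"
    print(f"{verdict}: {url} is {'crawlable' if allowed else 'blocked'}")

Each line prints OK when a must-crawl URL stays open and an intended block actually blocks, so a bad edit surfaces before it reaches production.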

Not Sure What’s Hurting Your SEO?

Scan your site now and find out in seconds.

Run Free SEO Scan