SitemapScan
Default Rule
Default-rule pages show the simplest robots.txt posture: a single wildcard line standing in for every crawler. This often means the site does not segment access policy by bot family. This subgroup page is tied to the current 7-day snapshot and should be read as a structured robots.txt signal page, not as raw crawler traffic logs.
Snapshot window: 7 days.
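For reference, a wildcard-only robots.txt looks like the hypothetical file below; the paths and sitemap URL are illustrative, not drawn from the archive.

```
# One default rule covers every crawler; no bot-specific blocks.
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Every crawler, from Googlebot to GPTBot, is governed by that single block, which is exactly the posture this subgroup tracks.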
What to study on this page
This subgroup page is useful when you want to understand how default rules appear in declared robots.txt policy, how that posture differs from nearby bot families, and how the pattern changes across archive windows.
Why the 7-day window matters
The 7-day window surfaces the freshest robot-family declarations visible in the public archive.
Related archive paths
- Default Rule 7 days — view the freshest short-window snapshot for this family.
- Default Rule 30 days — view the broader month-scale snapshot for this family.
- Default Rule all time — view the long-tail historical snapshot for this family.
What this crawler family means
Sites that declare only a wildcard default rule in robots.txt, with no crawler-specific user-agent blocks.
Related families
- Search Crawlers — Search-engine crawlers mentioned in robots.txt, including Googlebot and similar agents.
- AI Crawlers — AI crawlers such as GPTBot, Claude, and related model-facing agents.
- Other Agents — Agents that still fall outside the current robots.txt crawler taxonomy.
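By contrast, a site that segments policy by bot family might declare something like the hypothetical file below (the agent names are real tokens commonly seen in the wild; the rules themselves are invented for illustration):

```
# Named crawler groups take precedence over the wildcard default.
User-agent: Googlebot
Allow: /

User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /private/
```

Under standard robots.txt matching, a crawler obeys the most specific group that names it, and the wildcard applies only when no named group matches; default-rule sites skip the named groups entirely.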
FAQ
What does a default rule mean in robots.txt?
A default rule is a wildcard `User-agent: *` block that applies to any crawler not matched by a more specific group. In SitemapScan, this family groups recent public checks where only that wildcard declaration was present in robots.txt.
Why can the default rule matter for SEO or crawling policy?
Because a robots.txt declaration tells you which bot families site owners are thinking about. That can reveal how they manage discovery, syndication, AI access, monitoring, or platform integrations within the 7-day window.
Does this page show live traffic governed by the default rule?
No. It shows mentions of user-agent lines declared in robots.txt across recent public checks, not bot request logs or crawl volume from server access logs.
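To make that distinction concrete, here is a minimal sketch of how user-agent declarations could be read out of a robots.txt body and bucketed into families. The family table, the helper name declared_families, and the example URL are all assumptions for illustration, not SitemapScan's actual pipeline.

```python
import urllib.request

# Hypothetical family buckets; SitemapScan's real taxonomy may differ.
FAMILIES = {
    "googlebot": "Search Crawlers",
    "bingbot": "Search Crawlers",
    "gptbot": "AI Crawlers",
    "claudebot": "AI Crawlers",
    "*": "Default Rule",
}

def declared_families(robots_txt: str) -> set:
    """Classify each User-agent line declared in a robots.txt body.

    This inspects declarations only; it says nothing about live
    crawler traffic, mirroring the distinction drawn in this FAQ.
    """
    families = set()
    for line in robots_txt.splitlines():
        # Drop trailing comments, then look for user-agent lines.
        line = line.split("#", 1)[0].strip()
        if line.lower().startswith("user-agent:"):
            agent = line.split(":", 1)[1].strip().lower()
            families.add(FAMILIES.get(agent, "Other Agents"))
    return families

# A site declaring only the wildcard falls into Default Rule.
with urllib.request.urlopen("https://example.com/robots.txt") as resp:
    body = resp.read().decode("utf-8", "replace")
print(declared_families(body))  # e.g. {'Default Rule'}
```

A wildcard-only file yields just {'Default Rule'}, which is the posture this subgroup page aggregates.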