SitemapScan

Commerce and Marketplace Bots

Commerce pages show where site owners call out shopping and marketplace agents explicitly. These bots often indicate feed-like, catalog-like, or transactional discovery patterns outside pure web search. This subgroup page is tied to the current 7-day snapshot and is meant to be read as a structured robots.txt signal page, not as raw crawler traffic logs.

Snapshot window: 7 days.

What to study on this page

This subgroup page is useful when you want to understand how commerce and marketplace bots appear in declared robots.txt policy, how that differs from nearby bot families, and how the pattern changes across archive windows.

Why the 7-day window matters

The 7-day window is useful when you want the freshest visible robot-family declarations in the public archive.

What this crawler family means

This family covers commerce and marketplace-related bots, such as Amazon's crawler and other shopping agents.

FAQ

What does commerce and marketplace bots mean in robots.txt?

It refers to commerce and marketplace-related bots, such as Amazon's crawler and other shopping agents. In SitemapScan, this family groups recent public checks where those user-agent declarations were explicitly present in robots.txt.
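As an illustration, a robots.txt file that would fall into this family might name a commerce crawler explicitly. Amazonbot is Amazon's published crawler user-agent; the paths and rules below are hypothetical:

```text
User-agent: Amazonbot
Disallow: /cart/
Disallow: /checkout/

User-agent: *
Allow: /
```

A declaration like this counts toward the family regardless of whether the bot actually visits the site.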

Why can commerce and marketplace bots matter for SEO or crawling policy?

Because a robots.txt declaration tells you which bot families site owners are thinking about. That can reveal how they manage discovery, syndication, AI access, monitoring, or platform integrations within the 7-day window.

Does this page show live traffic from commerce and marketplace bots?

No. It shows mentions of user-agent lines declared in robots.txt across recent public checks, not bot request logs or crawl volume from server access logs.
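A "mention" in this sense is simply a User-agent line in the robots.txt body. A minimal sketch of how such declarations can be extracted (the function name and the sample file are illustrative, not SitemapScan's actual implementation):

```python
def declared_user_agents(robots_txt: str) -> list[str]:
    """Return user-agent tokens declared in a robots.txt body, in order."""
    agents = []
    for line in robots_txt.splitlines():
        # Strip inline comments, then surrounding whitespace.
        line = line.split("#", 1)[0].strip()
        if line.lower().startswith("user-agent:"):
            token = line.split(":", 1)[1].strip()
            if token and token not in agents:
                agents.append(token)
    return agents


# Hypothetical robots.txt declaring a commerce bot alongside a wildcard rule.
sample = """\
User-agent: Amazonbot
Disallow: /checkout/

User-agent: *
Allow: /
"""

print(declared_user_agents(sample))  # ['Amazonbot', '*']
```

Note that this inspects declared policy only; it says nothing about which bots actually requested pages, which is exactly the distinction this page makes.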
