Use crawl-delay in your robots.txt file to slow down robots
Posted: 14 April 2009 Filed under: system administration | Tags: search, system administration

You can use the "Crawl-delay" directive in your robots.txt file to slow down Web crawlers:
User-agent: *
Crawl-delay: 15
The delay is specified in seconds, so in this example a compliant crawler will wait 15 seconds between requests. Note that Crawl-delay is not part of the original robots.txt standard, and support varies by crawler.
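If you want to check how a given robots.txt will be interpreted, Python's standard library can parse it. A minimal sketch, using `urllib.robotparser` and the example rules above (the user-agent name "MyCrawler" is made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Feed the robots.txt body directly as a list of lines
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Crawl-delay: 15",
])

# crawl_delay() returns the delay in seconds for the given user agent;
# "MyCrawler" has no dedicated record, so the "*" entry applies
print(rp.crawl_delay("MyCrawler"))  # 15
```

A polite crawler would then sleep for that many seconds between requests to the site.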