Use crawl-delay in your robots.txt file to slow down robots

You can use the “Crawl-delay” directive in your robots.txt file to slow down web crawlers:

User-agent: *
Crawl-delay: 15

The delay is specified in seconds. Keep in mind that Crawl-delay is a non-standard directive: some crawlers (such as Bing’s) honor it, but Googlebot ignores it, so you may still need to throttle aggressive bots by other means.
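If you want to check programmatically what delay a robots.txt file requests, Python’s standard library can parse it. A minimal sketch, assuming the two-line robots.txt shown above (the bot name “MyBot” is just an example):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt body from the post above.
robots_txt = """\
User-agent: *
Crawl-delay: 15
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# crawl_delay() returns the requested delay in seconds for a given
# user agent, or None if no Crawl-delay rule applies to it.
delay = rp.crawl_delay("MyBot")
print(delay)  # → 15
```

A polite crawler would then sleep for that many seconds between requests to the same site, e.g. `time.sleep(delay)` after each fetch.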
