Adding a crawl delay to robots.txt – M3Server Support


Search engines and other bots regularly crawl your website's pages in order to index them.

If you have a lot of pages, this can consume server resources quickly.

We recommend adding a crawl delay to your robots.txt file to throttle those bots. The directives below apply a 30-second delay to several common crawlers:


User-agent: SeznamBot
User-agent: DotBot
User-agent: bingbot
User-agent: YandexBot
Crawl-Delay: 30
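As a quick sanity check, Python's standard-library `urllib.robotparser` can confirm how these rules are read: the single `Crawl-Delay` line applies to every bot listed in the grouped `User-agent` lines above it, while unlisted bots (with no `User-agent: *` record) get no delay. A minimal sketch using the exact rules above:

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: SeznamBot
User-agent: DotBot
User-agent: bingbot
User-agent: YandexBot
Crawl-Delay: 30
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# All four listed bots share the same 30-second delay.
print(rp.crawl_delay("bingbot"))    # 30
print(rp.crawl_delay("YandexBot"))  # 30

# An agent that is not listed (and no "User-agent: *" record exists)
# gets no delay at all.
print(rp.crawl_delay("Googlebot"))  # None
```

This also illustrates why you may want a separate `User-agent: *` record if you want a default delay for bots you did not name explicitly.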

Keywords: bots, crawling, indexing

See our separate guide on how to create a robots.txt file.
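If you do not yet have a robots.txt, one quick way to create it from a shell is sketched below. This assumes you run it from your site's document root (the path varies by host), and note that `>` overwrites any existing robots.txt, so append with `>>` instead if you already have rules in place:

```shell
# Create robots.txt in the current directory (run from your document root).
# Overwrites an existing file -- use >> to append to existing rules instead.
cat > robots.txt <<'EOF'
User-agent: SeznamBot
User-agent: DotBot
User-agent: bingbot
User-agent: YandexBot
Crawl-Delay: 30
EOF
```

The file must live at the root of the site (e.g. served at `/robots.txt`) for crawlers to find it.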
