Dumb SEO Questions

(This question was posted by Chase Reiner in the Dumb SEO Questions community on Facebook, 05/25/2017.)

What is the point of crawl delay in robots.txt?

This question begins at 00:38:19 into the clip. Watch this question on YouTube, commencing at 00:38:19.

YOUR ANSWERS

Selected answers from the Dumb SEO Questions Facebook & G+ community.

  • Gregory Dantschotter: If you're facing server problems or have too much traffic, extra crawlers (bots, Screaming Frog, ...) can add load. By setting a crawl delay you limit the load from bots.
  • Casey Markee: Google does not support the "crawl-delay" directive in the robots.txt file and ignores it completely. This is a well-known fact. Here's one of dozens of threads that cover this issue in detail (all readily available in Google): https://www.drupal.org/node/2492191 It's a total waste of time. Just remove it.
  • David Kutcher: https://lmgtfy.com/
  • Tim Capper: NONE, because Googlebot does not follow it. Google has its own built-in crawl delay based on what it feels your site can handle.
  • Dan Thies: And you get a wee bit of control in GSC, but crawl delay is kind of an anachronism at this point. If your server can't handle the speed of the bots, it's definitely not doing it for your visitors.
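To illustrate the answers above: Crawl-delay is a non-standard robots.txt directive that Google ignores, but some other crawlers (Bing and Yandex, historically) have honored it, and Python's standard-library parser exposes it. A minimal sketch, using a hypothetical robots.txt for example.com:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only
robots_txt = """
User-agent: *
Crawl-delay: 10
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# Crawlers that honor the directive would wait this many seconds between requests
print(rp.crawl_delay("*"))  # → 10

# Disallow rules, unlike Crawl-delay, are respected by Google
print(rp.can_fetch("*", "https://example.com/private/page"))  # → False
print(rp.can_fetch("*", "https://example.com/public/page"))   # → True
```

Note that `crawl_delay()` only reports what the file requests; whether a bot actually throttles itself is entirely up to that bot, which is why the directive is useless against Googlebot.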

View the original question in the Dumb SEO Questions community on Facebook (05/25/2017).