Selected answers from the Dumb SEO Questions Facebook & G+ community.
Gregory Dantschotter: If you're facing server problems or heavy traffic, extra crawlers such as bots, Screaming Frog, and so on can add load. By setting a crawl delay you limit the load those bots put on your server.
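For reference, a crawl delay is set with a "Crawl-delay" line in robots.txt. This is just an illustrative sketch: the value (seconds between requests) is an arbitrary example, and while crawlers such as Bingbot have honored the directive, Googlebot ignores it, as the answers below point out.

User-agent: *
Crawl-delay: 10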
Casey Markee: Google does not support the "crawl-delay" directive in robots.txt and ignores it completely.
This is a well-known fact. Here's one of dozens of threads that cover this issue in detail (all readily available in Google):
https://www.drupal.org/node/2492191
Total waste of time. Just remove it.
Tim Capper: None, because Googlebot does not follow it. Google has its own built-in crawl rate based on what it determines your site can handle.
Dan Thies: And you get a wee bit of control in GSC, but crawl delay is kind of an anachronism at this point.
If your server can't handle the speed of the bots, it's definitely not serving your visitors well either.