Selected answers from the Dumb SEO Questions G+ community.
Lyndon NA: Are you asking about 1) "time between requests" or 2) "time after the request and before crawling the content of that request"? If you'd Googled the question (or just "robots txt crawl delay"), you would have seen explanations that it's in seconds, and that it is the time between crawl requests. If you're after a load delayer, there isn't one (you cannot tell bots to request, then wait while the page loads up data etc.).
Arun Jamwal: Thanks for your reply. Actually I got confused between seconds and milliseconds after seeing this article: https://moz.com/learn/seo/robotstxt
Lyndon NA: If I remember correctly, G never used to pay attention to that directive/request. You had to go through GWMT to set any crawl delays (and they act more like strong requests than commands). As for seconds vs milliseconds ... it's Moz, I wouldn't hold my breath on its accuracy. I'm pretty sure it was seconds when the Crawl-delay request was introduced. But ... I get the feeling this is a symptom issue, and not a causal one. So ... let's dig a little deeper. Why do you want to delay crawls?
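For what it's worth, the seconds reading matches how Python's standard-library robots.txt parser handles the directive: Crawl-delay is exposed as a bare number, conventionally interpreted as seconds between requests. A minimal sketch (the robots.txt content below is a made-up example):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt asking crawlers to wait 10 seconds
# between successive requests.
robots_txt = """\
User-agent: *
Crawl-delay: 10
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The directive is parsed as a bare integer; polite crawlers
# that honour it treat the value as seconds.
print(parser.crawl_delay("*"))  # -> 10
```

Bear in mind this directive is only advisory, and (as noted above) Google does not obey it at all.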
Arun Jamwal: Actually our site's load time is a little high. Around 8 seconds. That's why I thought to add a delay for the crawler.
Lyndon NA: Right. So it's a slow-site/resource-consumption issue. Rather than trying to patch a plaster over the top, you need to resolve the root problem. So - do you know what contributes to the load times?
1) Poor connection times
2) Slow server responses
3) Slow-downs due to DB/server-side scripts
4) Slow-downs due to additional HTML resources (CSS/JS/media etc.)
5) Rendering issues (lack of dimensions, numerous DOM changes etc.)
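The first three causes in the list above can be told apart by timing the phases of a single request: a slow connect points at the network, while a long gap between connect and the first response byte points at server-side work (scripts, DB queries). A rough sketch, assuming plain HTTP with no TLS; `time_request` is a made-up helper name, not a library function:

```python
import socket
import time

def time_request(host, port=80, path="/"):
    """Measure TCP connect time and time-to-first-byte for one GET.

    A large `connect` suggests network/connection problems;
    a large gap between `connect` and `ttfb` suggests slow
    server-side processing (scripts, database queries, etc.).
    """
    t0 = time.perf_counter()
    sock = socket.create_connection((host, port), timeout=10)
    t_connect = time.perf_counter() - t0

    request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode())
    sock.recv(1)  # block until the very first response byte arrives
    t_ttfb = time.perf_counter() - t0
    sock.close()

    return {"connect": t_connect, "ttfb": t_ttfb}
```

Causes 4 and 5 (extra resources, rendering) need a browser-level tool such as the network and performance panels in the browser's dev tools, since they happen after the HTML has been delivered.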