Hi, I have received the same message from Google Webmaster Tools twice in the last two months for my company's site:
Over the last 24 hours, Googlebot encountered 18 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.
The last time I got it, I spoke to the site's web developer, and he says he never even created a robots.txt file for the site in the first place.
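In case it helps, I was planning to check for myself what the server actually returns when robots.txt is requested, since I've read that a plain 404 is handled differently from a timeout or server error. Here's the rough script I had in mind (example.com is just a placeholder for my real domain):

    # Quick check of what the server returns for /robots.txt
    # (example.com is a placeholder for the actual domain).
    import urllib.request
    import urllib.error

    url = "http://example.com/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            # Print the status code, content type, and the start of the body.
            print(resp.status, resp.headers.get("Content-Type"))
            print(resp.read(500).decode("utf-8", errors="replace"))
    except urllib.error.HTTPError as e:
        # A 404 here would just mean the file doesn't exist,
        # which is not the same as a timeout or 5xx error.
        print("HTTP error:", e.code)
    except urllib.error.URLError as e:
        print("Connection problem:", e.reason)

Does that kind of check even tell me anything useful here, or is the problem likely somewhere else?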
Can anyone help me with this? Thanks. :)