Christopher Shin: Yes. I feel the same way, but my client's web developer feels otherwise. Their robots.txt reads Disallow: /css, /js, etc. I usually minify the CSS and JS files, but blocking them is something I have yet to encounter, so I really wanted to ask the other pros in this group. Thanks!
Rob Woods: Nope. In fact, a while back Google recommended overtly allowing them to be crawled. http://www.thesempost.com/how-to-unblock-all-css.../
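A minimal sketch of the kind of explicit allow directives being described, assuming CSS and JS files keep their standard extensions (the wildcard patterns are illustrative, not taken from the thread):

```
# Explicitly allow Googlebot to fetch stylesheets and scripts
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

The `$` anchors the match to the end of the URL, so only files ending in .css or .js are covered.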
Vishal Thakkar: People may have different reasons to disallow this crawl. Why not set up your images/JS/CSS in two folders: 1) one you allow bots to crawl, and 2) one you don't want them to crawl. P.S. As others have said, Google does prefer that it is allowed to crawl.
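That two-folder split could be sketched in robots.txt like this (the folder names are hypothetical, for illustration only):

```
User-agent: *
# 1) assets bots may fetch (hypothetical path)
Allow: /assets/public/
# 2) assets kept out of crawls (hypothetical path)
Disallow: /assets/private/
```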
Tony McCreath: Trey Collier You don`t want Googlebot or rendering crawlers going around triggering hits. Google Analytics disallows its code.
Trey Collier: So you want to block the tracking code for GA to any Google bot that visits? Aren't the G bots and GA smart enough to know without manually trying to block that code?
Tony McCreath: Trey Collier, you can't block resources on domains you don't control, so they do it themselves. I'm sure the crawling team told the GA team to do it.
Michael Stricker: Better to filter spam bots in GA using Google’s own list. Resources, however, must be available to Gbot so that it can use JS and render the page and all of its functionality, as part of its quality/UX determination. To disallow this is to cede rank to pages of domains that will comply.
Tony McCreath: Michael Stricker, that's a different issue. GA disallowing all crawlers is a request, and it's mainly aimed at Google's own bot. Spambots will not honour robots.txt, and in most cases they call the API directly and never visit the website for real. The GA robots.txt is there to stop nice bots that render pages from causing fake hits. Unfortunately, Google's reporting system often raises false issues that cause panic. If what is disallowed does not cause rendering issues, there is no problem.
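For context, the blanket disallow pattern described here looks like this when served from a third-party script host (a sketch of the pattern, not a verbatim copy of Google's file):

```
# Served from the analytics domain itself, asking
# well-behaved rendering crawlers not to fetch anything
User-agent: *
Disallow: /
```

Because this lives on the script host's own domain, a site owner can't override it, which is Tony's point about not being able to block (or unblock) resources on domains you don't control.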
Michael Stricker: Tony McCreath I get that it's different, but the point is, it's easier to filter out the bot page loads after the fact. robots.txt is but a suggestion, even to Gbot. Better to err on the side of allow, and rely on password-protected directories for the truly sensitive stuff.
Ryan Jones: I try not to block anything whenever possible. The engines are usually smart enough to figure it out.