Dumb SEO Questions

(This entry was posted by Christopher Shin in the Dumb SEO Questions community on Facebook, 05/10/2018.)

How many SEO pros block crawlers from crawling CSS and JS via the robots.txt file?

How many SEO pros block crawlers from crawling CSS and JS via the robots.txt file? Reasons?
This question begins at 00:12:44 into the clip. You can watch it on YouTube starting at that timestamp.
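
For context, the pattern being asked about looks something like this in a site's robots.txt (a hypothetical sketch; the directory names are illustrative, not taken from the original post):

    User-agent: *
    Disallow: /css/
    Disallow: /js/

The consensus in the answers below is the opposite: leave those paths crawlable, or allow them explicitly. Googlebot supports wildcard patterns, so the explicit form is often written along these lines (again a sketch, not a prescribed rule set):

    User-agent: Googlebot
    Allow: /*.css$
    Allow: /*.js$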

YOUR ANSWERS

Selected answers from the Dumb SEO Questions Facebook & G+ community.

  • Becky Westmoreland: None. Google wants to crawl those.
  • Christopher Shin: Yes, I feel the same way, but my client's web developer feels otherwise. Their robots.txt reads Disallow: /css, Disallow: /js, etc. I usually minify the CSS and JS files, but blocking them is something I have yet to encounter, so I really wanted to ask other pros in this group. Thanks!
  • Becky Westmoreland: The developer is misinformed; time to educate them.
  • Micah Fisher-Kirshner: You can probably go into GSC and run a fetch and render to show Google's complaint about hiding those JS/CSS files. (A scriptable version of this check is sketched after this list.)
  • Dave Elliott: Yeah, Google needs those files these days; your dev is wrong.
  • Yuliana Kronrod: https://developers.google.com/.../mobi.../common-mistakes...
  • Rob Woods: Nope; in fact, a while back Google recommended explicitly allowing them to be crawled. http://www.thesempost.com/how-to-unblock-all-css.../
  • Vishal Thakkar: People may have different reasons to disallow this crawl. Why not set up your images/JS/CSS in two folders: 1) one you allow bots to crawl, and 2) one you don't want them to crawl (see the sketch after this list). P.S. As others have said, Google does prefer that it is allowed to crawl.
  • Tony McCreath: It makes sense to block tracking code.
  • Tony McCreath: Trey Collier, you don't want Googlebot or rendering crawlers going around triggering hits. Google Analytics disallows its code.
  • Trey Collier: So you want to block the tracking code for GA from any Google bot that visits? Aren't the G bots and GA smart enough to know without manually trying to block that code?
  • Tony McCreath: Trey Collier, you can't block resources on domains you don't control. They do that. I'm sure the crawling team told the GA team to do it.
  • Michael Stricker: Better to filter spam bots in GA using Google’s own list. Resources, however, must be available to Gbot so that it can use JS and render the page and all of its functionality, as part of its quality/UX determination. To disallow this is to cede rank to pages of domains that will comply.
  • Tony McCreath: Michael Stricker, that's a different issue. GA disallowing all crawlers is a request, mainly aimed at Google's own bot. Spam bots will not honour robots.txt and in most cases call the API directly, never visiting the website for real. The GA robots.txt is there to stop nice bots that render pages from causing fake hits. Unfortunately, Google's reporting system often raises false issues that cause panic. If what is disallowed does not cause rendering issues, there is no problem.
  • Michael Stricker: Tony McCreath I get that it's different, but the point is, it's easier to filter out the bot page loads after the fact. robots.txt is but a suggestion, even to Gbot. Better to err on the side of allow, and rely on password-protected directories for the truly sensitive stuff.
  • Ryan Jones: I try not to block anything whenever possible. The engines are usually smart enough to figure it out.
  • Ryan Jones: But to answer your question: 42
  • Dawn Anderson: I don't. It's needed for rendering.
  • Dawn Anderson: When you analyse server log files, you'll often see that the CSS and JS files are pulled in immediately after their dependent files/URLs.
  • Rob Woods: https://yoast.com/dont-block-css-and-js-files/
  • Marcus Pentzek: To answer your question: SEO experts don't ... SEO noobs might 🤔
  • Benj Arriola: Years ago, yes. But since Google launched their headless browser... you now need those files unblocked.
  • Adam John Humphreys: I've always allowed it to be crawled. If you're talking about sliders indexing etc., that's handled differently.
  • Peter Mead: If you block js and css then the crawlers may not be able to see parts of your site.
  • Benjamin James Barker: Always allow, never block
  • Maria Patterson: https://moz.com/blog/why-all-seos-should-unblock-js-css
  • Bonnie Burns: Never block them. But I do make sure they aren't choking or slowing down the site, and I often have them housed in separate files to keep things clean.
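
A minimal sketch of the two-folder approach Vishal Thakkar describes above (the folder names are hypothetical; the point is simply that crawlable and non-crawlable assets live under different paths):

    User-agent: *
    Allow: /assets/public/
    Disallow: /assets/private/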
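
And as a scriptable complement to the fetch-and-render check Micah Fisher-Kirshner mentions, Python's standard-library robotparser can test whether specific CSS/JS URLs are fetchable under a live robots.txt. This is a minimal sketch; the domain and resource paths are placeholders, and note that this parser does not implement Googlebot's wildcard matching, so results can differ where wildcard rules are involved:

    # check_robots.py - report which resource URLs robots.txt blocks
    from urllib.robotparser import RobotFileParser

    SITE = "https://www.example.com"               # placeholder domain
    RESOURCES = ["/css/main.css", "/js/app.js"]    # hypothetical paths to test

    parser = RobotFileParser()
    parser.set_url(SITE + "/robots.txt")
    parser.read()                                  # fetch and parse the live file

    for path in RESOURCES:
        url = SITE + path
        verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
        print(f"{url}: {verdict}")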

View the original question in the Dumb SEO Questions community on Facebook, 05/10/2018.