Amir Latif: Nathan Nikolay Gaidy if it's GSC then you should check two things first, submitted sitemaps + robots.txt, and most likely you will find your answer
Once that is done, you should go into Search Console and remove the URLs.
Nathan Nikolay Gaidy: Amir Latif So this actually affects SEO, and I should only let it crawl the front-end content?
Amir Latif: Nathan Nikolay Gaidy Crawlers shouldn't crawl anything which you don't want to get indexed: resource files, themes, plugins, scripts.
robots.txt helps you to achieve that.
To answer your specific question, does it hurt SEO? Yes, it does, since you are giving crawlers wrong and irrelevant data when you should be giving them a clean URL structure rather than a bunch of CSS and JavaScript files. Make sense?
Nathan Nikolay Gaidy: Amir Latif But how can JavaScript code be useful to SEO?
You mean if it renders the website itself, like React.js?
So the developer has to build the JavaScript to be SEO-friendly in the first place and separate the files, or can JavaScript tell bots not to crawl specific parts of the code?
Nathan Nikolay Gaidy: Brenda Michelin So I should disallow all that for SEO purposes
Can I disallow an entire directory
Or file type?
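(Editor's note: yes, robots.txt can disallow both whole directories and file types. Google's crawler supports the `*` wildcard and the `$` end-of-URL anchor for this; the paths below are placeholders, not a recommendation for any particular site.)

```
User-agent: *
# Block an entire directory (and everything under it)
Disallow: /private-directory/
# Block a file type anywhere on the site (Google honors * and $)
Disallow: /*.pdf$
```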
Ammon Johns: Nathan Nikolay Gaidy Google crawls CSS to better understand how a page will actually render, so it can properly weight links that are prominently visible on a page and discount any kind of content that is hard to find or otherwise hidden. The same is true of JavaScript, since scripts can do immediate refreshes or redirects, add or change the links in the HTML, and, of course, without them information in tabs or other interactive elements may not be counted.
In particular, Google is looking at how a page will render on a mobile device, now that they have 'mobile-first' indexing, which in turn is because, across the world as a whole, more people use their phones for internet activity than a PC.
Ammon Johns: Nathan Nikolay Gaidy whatever you do, do NOT block access to your CSS and JavaScript, or the pages cannot be properly rendered and thus will NOT be included in the index.
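(Editor's note: to reconcile this with the earlier advice, you can keep a directory blocked while still letting crawlers fetch the render-critical resources inside it. Google generally applies the most specific matching rule, so a longer Allow can override a shorter Disallow. The WordPress paths below are just an example.)

```
User-agent: *
Disallow: /wp-admin/
# The longer, more specific Allow rules take precedence over the
# shorter Disallow, so CSS and JS in the blocked area stay crawlable
Allow: /wp-admin/*.css$
Allow: /wp-admin/*.js$
```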
Nathan Nikolay Gaidy: Ammon Johns So they are not for content reading but more for behavior, quality, and user experience?
Ammon Johns: Nathan Nikolay Gaidy yes, 'ish', in that the content reading is part of the assessment, as JavaScript in particular can either add or remove content from the page, and CSS can very easily be used to make content more or less visible.
Michael Martinez: They may recrawl the same page several times per day. Your raw server log files will show you what the bots are requesting.
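(Editor's note: a quick sketch of what checking those raw logs can look like. This assumes the common "combined" access log format and filters by the Googlebot user-agent string; the log file and its contents here are a made-up sample, so point the commands at your real server log instead.)

```shell
#!/bin/sh
# Write a tiny sample access log in combined log format (placeholder data).
LOG=sample_access.log
cat > "$LOG" <<'EOF'
66.249.66.1 - - [10/May/2024:10:00:01 +0000] "GET /page-a HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/May/2024:14:30:12 +0000] "GET /page-a HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/May/2024:16:05:44 +0000] "GET /page-b HTTP/1.1" 200 5678 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [10/May/2024:16:06:00 +0000] "GET /page-b HTTP/1.1" 200 5678 "-" "Mozilla/5.0"
EOF

# Keep only Googlebot requests, pull the requested path (field 7 in this
# format), and count fetches per URL; most-recrawled URLs appear first.
grep 'Googlebot' "$LOG" | awk '{print $7}' | sort | uniq -c | sort -rn
```

On the sample above this shows /page-a was fetched twice in one day and /page-b once, which is exactly the recrawl pattern being described.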
Brenda Michelin: Ahh, thank you for that information, I did not know that.