Selected answers from the Dumb SEO Questions G+ community.
Michael Martinez: I believe in keeping as much CSS code in external files as possible. So as close to 1% as possible. Browsers will cache external files. If you load all the CSS into the page then visitors have to download the same code over and over again. Even if you minify the page that is still a lot of unnecessary repetition, especially for mobile users.
Gareth Daine: There’s no set percentage, though everything that can be placed in external resources should be.
Just make sure it’s all concatenated and minified.
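The concatenate-and-minify step Gareth mentions can be sketched in a few lines of Python. This is a naive illustration only — the regex-based minifier below handles simple stylesheets but not edge cases like CSS strings containing braces; real builds should use a dedicated tool such as csso or cssnano. The file names are hypothetical.

```python
import re
from pathlib import Path

def minify_css(css: str) -> str:
    """Naively minify CSS: strip comments, collapse whitespace,
    and tighten spacing around punctuation."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # drop /* comments */
    css = re.sub(r"\s+", " ", css)                        # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)          # tighten punctuation
    return css.strip()

def bundle(files, out="bundle.min.css"):
    """Concatenate several stylesheets into one minified file, so the
    browser makes a single cacheable request instead of many."""
    combined = "\n".join(Path(f).read_text() for f in files)
    Path(out).write_text(minify_css(combined))
```

The payoff is one HTTP request for the whole site's styles, which the browser then caches across page views.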
Alan Bleiweiss: The balance between separation of the presentation layer and content needs to be weighed against speed. Too many sites rely too heavily on CSS and JS files as it is. Where it gets even worse, though, is the inefficiency of retrieving off-page scripts, whether from your own internal servers or off-server at a 3rd-party network. Sometimes the easiest way to speed up a site is to pull one or more scripts in from elsewhere and place them on your own server. Sometimes it's even better to integrate the code in-line. No one-size-fits-all solution exists given the complexity of the web ecosystem. And while files can be cached, first-time visitors still need to have them retrieved.
David Kutcher: I like what Alan Bleiweiss said, but I'd go a bit further. It KILLS me when I see a homepage generated by WordPress that is supposedly optimized, yet all the CSS and JS adds up to about 100 extra file requests. Wtf, people. You just defeated the damned purpose.
Alan Bleiweiss: Gareth, the weakness in a single request comes when the single, concatenated file is itself a megabyte in size and it's on a crappy server to begin with, where the combination of the two means it takes several seconds just to retrieve. Google often abandons such situations in the crawl/indexation sequence. Which is NOT the fault of Google OR the single-request scenario, but instead, as always, of bad coding and server-admin practices.
Patty Mantaloons: This tool has been effective for me, but it looks like they're having problems with their SSL cert: https://www.sitelocity.com/critical-path-css-generator
Ammon Johns: The simplest answer is "as little as possible" because, while there actually isn't any such ranking factor as a "code to content ratio", we know that the speed of a page is a factor (so leaner is better), and that content can boost your rankings but code cannot. In other words, adding more bytes of content to a page might slightly lower its speed factor but should increase its relevancy factor, while adding bytes of code (be it HTML, CSS or JS) can only impact speed, and thus be a slight negative.
Where it gets more complex is when you decide whether essential code is better embedded inline to save outside calls, or put in external files, keeping the specific pages leaner and allowing the CSS and JS to be cached, making the site faster for anyone viewing more than a single page. Usability-wise, it's the external files every time, as this saves any user agents that can't use the CSS or JS from having to download them at all.
Finally, you have to consider conversion factors. It is all very well to know that in general faster pages are better, but we do know that fancier, more impressive pages often convert better, get more links, and generally offset whatever that slight speed loss is with better business performance. Ultimately, follow whichever gives the most results, and don't be afraid to split-test if need be.
Doc Sheldon: As Ammon says, such ratios aren't important. Focus on page speed and you'll be good, Neil. Choose wisely between in-line and stylesheet CSS, defer loading on JS where practical... and don't forget to sacrifice the occasional small animal to Cthulhu. ;)