If I have a text-to-code ratio of 9.78% for a category page, a recommendation is that I still try to get it to at least 10%. What are the group's thoughts on a benchmark text-to-code ratio? It is an eCommerce category page with 24 products per page (paginated too) and it has a paragraph of content relevant to the category of about 150 words. Should I add a bit more content to the paragraph and/or drop the number of products per page? Or take a chill pill?
Selected answers from the Dumb SEO Questions G+ community.
Gerry White: Chill pill ..
It sounds like you’ve got a lot of code, which is a good idea to optimise if you can, but don’t worry about content/text ratios ... it’s a metric that can be used to spot issues, not a ranking factor/KPI.
Dan Thies: Ignore that number. Take the person who told you it’s relevant to SEO and add them to your Quacks list.
David Harry: Uh yea... while code bloat can lead to some UX issues, it has NOTHING to do with SEO/Google etc.
Roger Montti: There was a spam fighting metric many years ago that used text to code ratio to identify spammers. Spammers tended to have a high text to code ratio, i.e. their HTML was super lean and optimized. Think of those one page squeeze page type sites.
This is something from around 2003-ish.
So this text to code ratio *might* have its origins from that very real way of catching spammers from a long time ago. I can't recall seeing this ratio in any other context.
Then, because of the online version of the "telephone game", the idea morphed into what you're talking about now.
With modern CMSs being what they are, I don't think a code-to-text ratio is anything to worry about.
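For anyone curious what the metric actually measures, here is a minimal sketch of how audit tools typically compute it: visible text length divided by total HTML length. This formula is an assumption for illustration; individual tools may differ in what they count as "text" (e.g. whitespace, alt attributes).

```python
# Illustrative sketch only: text-to-code ratio as visible-text bytes
# over total HTML bytes. Real SEO tools may count things differently.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # depth inside script/style elements

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def text_to_code_ratio(html: str) -> float:
    """Return visible-text length as a fraction of total HTML length."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.parts).strip()
    return len(text) / len(html) if html else 0.0

# Tiny hypothetical page, just to show the calculation:
page = "<html><head><style>body{}</style></head><body><p>24 products</p></body></html>"
print(f"{text_to_code_ratio(page):.2%}")  # prints 14.10%
```

The takeaway from the thread stands either way: this number is a diagnostic for code bloat at best, not something to optimise toward a benchmark.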
Maria Patterson: Is this true? If your web page contains too much HTML code, that could prevent a search engine crawler from crawling the page in its entirety. If the page contains too much code compared to the amount of text, the search engine crawlers may only capture a portion of the page, leaving out possible vital and relevant content.
Alan Bleiweiss: Maria just so you are aware, Google DOES have a limit on per-page crawl. It's based on file weight. That is now supposedly around a couple hundred megabytes, though I don't see how they would allow that much. It used to be 10 megabytes. And I assure you, unless you're on world class dedicated servers, if your page's weight is more than a couple megabytes, you're likely to have crisis level speed problems.
Adam John Humphreys: I have seen 14-second dealership pages in massive cities ranking well. That said, a 2-3 second load time is fine. I would aim to ship only the code that actually gets used; as someone on Twitter illustrated last night, many common sites have upwards of 50% code bloat. These mega themes have cost me a lot of time lately: while they solve many issues for everyday people, they create more issues for people like myself trying to implement a full solution, because they only go two-thirds of the way in almost every aspect.
Once you have your code sorted, implement a CDN and ensure the SSL is pointed at the appropriate version of your site to avoid redirect chains. Screaming Frog shows how many items face redirects, which can slow down a site quite a lot when you have done everything else right. I see this in particular on older WordPress sites with lots of redirect chains.
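The redirect-chain problem above can be sketched in a few lines. This assumes you have exported your site's redirects as a source-to-target mapping (e.g. from a Screaming Frog crawl); the URLs below are hypothetical placeholders.

```python
# Minimal sketch: resolve a redirect chain from a source -> target map
# (a hypothetical export of a site's redirect rules). Any URL taking
# more than one hop is a chain worth collapsing.
def resolve_chain(url, redirects, max_hops=10):
    """Return every URL visited when following redirects from `url`."""
    chain = [url]
    while chain[-1] in redirects and len(chain) <= max_hops:
        chain.append(redirects[chain[-1]])
    return chain

# Hypothetical old WordPress site with stacked redirects:
redirects = {
    "http://example.com/post":  "https://example.com/post",      # SSL redirect
    "https://example.com/post": "https://www.example.com/post",  # www redirect
}

chain = resolve_chain("http://example.com/post", redirects)
print(" -> ".join(chain))  # three URLs: two hops before the final page
```

Each extra hop costs the visitor a round trip, so the fix is to update every old rule to point straight at the final destination.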
Focus on delivering great product value to customers, within an architecture where a visitor can reach a product in a few clicks where possible. Faceted search is amazing for eCommerce, but SEO-wise it really needs to be implemented properly, as it can cause real messes if not.