Selected answers from the Dumb SEO Questions G+ community.
Francois-Pierre Marcil: It's more likely to be important for larger websites, where problems are potentially not easily tracked by front-end analytics tools.
Michael Martinez: It's important for any Website if you want to understand what is actually happening on that site. 3rd-party analytics tools routinely miss a lot of data, some of which is driven by rogue crawlers that don't trigger the analytics scripts. Rogue crawlers can directly impact Website performance, not just by slowing it down but by tying up limited TCP connections (thus preventing legitimate users and crawlers from accessing the site).
Roger Montti: I agree with Michael. Google Analytics has made people complacent about analytics. Don't even get me started about the poor UI. But, taking away the keywords, wow, some people new to this business have little idea of what they're missing. :P
Dave Elliott: Big fan of log file analysis. So many interesting issues to find, and it's great for justifying site architecture changes. Still don't trust any of the programmes to do it properly. Excel and lots of pivot tables is the way.
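The pivot-table aggregation Dave describes can be sketched in a few lines of Python. This is a minimal illustration, not a recommended tool: it assumes the common Apache/Nginx "combined" log format, and the regex and sample lines are illustrative, so adjust them to your server's actual format.

```python
import re
from collections import Counter

# Assumed Apache/Nginx "combined" log format:
# IP ident user [time] "METHOD path protocol" status bytes "referrer" "user-agent"
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def crawler_hits(lines):
    """Tally (user agent, status code) pairs, pivot-table style."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m:
            counts[(m.group("agent"), m.group("status"))] += 1
    return counts

# Hypothetical sample log lines for demonstration only.
sample = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /page HTTP/1.1" '
    '200 512 "-" "Googlebot/2.1"',
    '203.0.113.9 - - [10/Oct/2023:13:55:37 +0000] "GET /missing HTTP/1.1" '
    '404 0 "-" "BadBot/1.0"',
]
print(crawler_hits(sample))
```

Grouping by user agent and status code like this surfaces exactly the kind of crawler activity (including the rogue crawlers Michael mentions) that front-end analytics scripts never see.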
Micah Fisher-Kirshner: Generally, the larger the site, the more important it is and the more detailed data you need. Rogue crawlers (or even international ones) can impact site loads at critical junctures, but often don't require detailed log data. But as the site grows in size it becomes critical to know what Googlebot is doing across the site.
Dan Thies: If you want to know how your website is interacting with search engines’ spiders, you can’t get the whole picture without logs, unless you’re logging it separately into a dedicated application.