John Mueller originally shared:
An update (March 2016) on the current state & recommendations for JavaScript sites / Progressive Web Apps 1 in Google Search. We occasionally see questions about what JS-based sites can do and still be visible in search, so here's a brief summary of today's state:
# Don't cloak to Googlebot. Use "feature detection" & "progressive enhancement" 2 techniques to make your content available to all users. Avoid redirecting to an "unsupported browser" page. Consider using a polyfill or other safe fallback where needed. The features Googlebot currently doesn't support include Service Workers, the Fetch API, Promises, and requestAnimationFrame (see the feature-detection sketch after this list).
# Use rel=canonical 3 when serving content from multiple URLs is required.
# Avoid the AJAX-Crawling scheme on new sites. Consider migrating old sites that use this scheme soon. Remember to remove "meta fragment" tags when migrating. Don't use a "meta fragment" tag if the "escaped fragment" URL doesn't serve fully rendered content. 4
# Avoid using "#" in URLs (outside of "#!"). Googlebot rarely indexes URLs with "#" in them. Use "normal" URLs with path/filename/query-parameters instead, consider using the History API for navigation.
# Use Search Console's Fetch and Render tool 5 to test how Googlebot sees your pages. Note that this tool doesn't support "#!" or "#" URLs.
# Ensure that all required resources (including JavaScript files / frameworks, server responses, 3rd-party APIs, etc.) aren't blocked by robots.txt. The Fetch and Render tool will list any blocked resources it discovers. If resources are blocked by a robots.txt you can't control (e.g., 3rd-party APIs) or are otherwise temporarily unavailable, ensure that your client-side code fails gracefully (see the graceful-fallback sketch after this list).
# Limit the number of embedded resources, in particular the number of JavaScript files and server responses required to render your page. A high number of required URLs can result in timeouts, and the page may be rendered without those resources being available (e.g., some JavaScript files might not be loaded). Use reasonable HTTP caching directives (see the caching sketch after this list).
# Google supports the use of JavaScript to provide titles, meta descriptions, robots meta tags, structured data, and other metadata (see the metadata sketch after this list). When using AMP, the AMP HTML page must be static as required by the spec, but the associated web page can be built using JS/PWA techniques. Remember to use a sitemap file with correct "lastmod" dates for signaling changes on your website.
# Finally, keep in mind that other search engines and web services accessing your content might not support JavaScript at all, or might support a different subset.
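A few rough sketches to illustrate some of the points above. First, feature detection: here's a minimal sketch (the URLs and setup are placeholders, not from the original post) that keeps working when Service Workers, the Fetch API, Promises, or requestAnimationFrame are missing:

```js
// Sketch: detect features instead of sniffing user agents, and fall back safely.

// Service Worker: register only if supported; the page must work without it.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js').catch(function (err) {
    console.warn('Service worker registration failed; continuing without it.', err);
  });
}

// Fetch API + Promises: fall back to XMLHttpRequest when they're unavailable.
function getJSON(url, callback) {
  if (window.fetch && window.Promise) {
    fetch(url).then(function (res) { return res.json(); }).then(callback);
  } else {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url);
    xhr.onload = function () { callback(JSON.parse(xhr.responseText)); };
    xhr.send();
  }
}

// requestAnimationFrame: fall back to a timer.
var raf = window.requestAnimationFrame ||
          function (cb) { return setTimeout(cb, 1000 / 60); };
```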
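For the History API point, a minimal sketch of "#"-free navigation. The marker class and renderRoute function are hypothetical, and this assumes the server can also serve those paths directly:

```js
// Hypothetical render function; a real app would fetch & render content for the path.
function renderRoute(pathname) {
  document.getElementById('main').textContent = 'Rendering ' + pathname;
}

// Intercept internal link clicks and use pushState so URLs stay "normal" (no "#").
var links = document.querySelectorAll('a.js-nav'); // hypothetical marker class
Array.prototype.forEach.call(links, function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();
    history.pushState(null, '', link.getAttribute('href')); // URL changes, no "#"
    renderRoute(location.pathname);
  });
});

// Handle back/forward buttons.
window.addEventListener('popstate', function () {
  renderRoute(location.pathname);
});
```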
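For the point about blocked or unavailable resources, a graceful-fallback sketch; the widget, URL, and timeout are made-up values:

```js
// Sketch: an optional third-party widget should never break the primary content,
// even if its API is unreachable for the crawler (e.g., blocked by a robots.txt
// you don't control) or simply times out.
function loadOptionalWidget(container, url) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url);
  xhr.timeout = 3000; // give up quickly if the API doesn't respond
  xhr.onload = function () {
    try {
      container.textContent = JSON.parse(xhr.responseText).message;
    } catch (e) {
      container.style.display = 'none'; // malformed response: hide widget, keep page
    }
  };
  xhr.onerror = xhr.ontimeout = function () {
    container.style.display = 'none'; // request failed: hide widget, keep page
  };
  xhr.send();
}

loadOptionalWidget(document.getElementById('recommendations'),
                   'https://api.example.com/recommendations'); // placeholder URL
```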
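For the caching point, one possible way to set HTTP caching directives. This uses Node.js/Express purely as an example; the post doesn't prescribe any server stack:

```js
// Sketch: long-lived caching for static JS/CSS, revalidation for HTML.
var express = require('express');
var path = require('path');
var app = express();

// Bundled assets can be cached for a week (ideally with versioned filenames).
app.use('/static', express.static(path.join(__dirname, 'public'), { maxAge: '7d' }));

// HTML should be revalidated so content changes are picked up.
app.get('/', function (req, res) {
  res.set('Cache-Control', 'no-cache');
  res.sendFile(path.join(__dirname, 'views', 'index.html'));
});

app.listen(3000);
```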
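And for the JS-provided metadata point, a sketch that sets the title, meta description, rel=canonical, and JSON-LD structured data from the client; the page object and the Article type are placeholders:

```js
// Sketch: update head metadata after a client-side route has rendered.
function setPageMetadata(page) {
  document.title = page.title;

  // Meta description (create it if the static HTML didn't include one).
  var description = document.querySelector('meta[name="description"]');
  if (!description) {
    description = document.createElement('meta');
    description.setAttribute('name', 'description');
    document.head.appendChild(description);
  }
  description.setAttribute('content', page.description);

  // rel=canonical pointing at the preferred URL for this content.
  var canonical = document.querySelector('link[rel="canonical"]');
  if (!canonical) {
    canonical = document.createElement('link');
    canonical.setAttribute('rel', 'canonical');
    document.head.appendChild(canonical);
  }
  canonical.setAttribute('href', page.canonicalUrl);

  // JSON-LD structured data.
  var jsonLd = document.createElement('script');
  jsonLd.type = 'application/ld+json';
  jsonLd.textContent = JSON.stringify({
    '@context': 'http://schema.org',
    '@type': 'Article', // placeholder type
    'headline': page.title
  });
  document.head.appendChild(jsonLd);
}

setPageMetadata({
  title: 'Example article',
  description: 'Placeholder description.',
  canonicalUrl: 'https://www.example.com/example-article'
});
```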
Looking at this list, none of these recommendations are completely new & limited to today -- and they'll continue to be valid for the foreseeable future. Working with modern JavaScript frameworks for search can be a bit intimidating at first, but they open up some really neat possibilities to make fast & awesome sites!
I hope this was useful! Let me know if I missed anything, or if you need clarifications for any part.
Links:
1 PWA: https://developers.google.com/web/progressive-web-apps
2 Progressive enhancement: https://en.wikipedia.org/wiki/Progressive_enhancement
3 rel=canonical: https://support.google.com/webmasters/answer/139066
4 AJAX Crawling scheme: https://developers.google.com/webmasters/ajax-crawling/docs/specification
5 https://support.google.com/webmasters/answer/6066468