Dumb SEO Questions

(Posted by seo palmere in the Dumb SEO Questions community on Facebook, 04/26/2017.)

Google bots and seo indexing


Two parts:
A - By specifying links in my sitemap.xml, does Google (or any other search engine) try to index any other pages, or do they just follow the XML?

B - How do Google bots and other indexing bots handle protected folders? Are they just ignored, or what?

Your videos are quite helpful. Thank you for your dedication to the group.

Doc
This question begins at 00:50:42 into the clip. Watch this question on YouTube commencing at 00:50:42.

YOUR ANSWERS

Selected answers from the Dumb SEO Questions Facebook & G+ community.

  • Federico Sasso: A (hope I got the sense of the question right): having an XML sitemap doesn't prevent a search engine from discovering, crawling links to, and indexing other content within the same site. B: they don't really care about the concept of a "folder", just the individual URLs. If a URL is protected with HTTP authorization, search engines will see an HTTP 401 status code (Unauthorized) and know they can't crawl that single URL. Ditto for an HTTP 403 status code (Forbidden). The case of form-based protection is normally handled with a temporary redirect (307/302) to a login page and - hopefully - a noindex in the login page's robots meta tag; in any case, they will be unable to crawl the destination URL. There are instances where protected resources can actually be indexed even if not crawled: as in the case of robots.txt protection, where the search engine might find links to a protected URL often enough to consider it a resource worth showing in the SERPs (normally visible with an additional click).
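The status-code handling Federico describes can be sketched as a small decision function. This is a hypothetical illustration, not any search engine's actual logic; the function name and the returned labels are assumptions made for the example:

```python
# Sketch: how a crawler might classify a fetched URL based on the HTTP
# responses discussed above (401/403 protection, 302/307 login redirects,
# and a robots meta noindex on the login page). Hypothetical helper.

def crawl_decision(status: int, has_noindex: bool = False) -> str:
    """Map an HTTP status (plus a robots-meta noindex flag) to a crawl action."""
    if status in (401, 403):
        # HTTP auth or forbidden: the single URL can't be crawled.
        return "uncrawlable"
    if status in (302, 307):
        # Form-based protection: temporary redirect, typically to a login page.
        return "follow-redirect"
    if status == 200 and has_noindex:
        # e.g. the login page itself, carrying <meta name="robots" content="noindex">.
        return "crawl-but-noindex"
    return "crawl-and-index"

print(crawl_decision(401))                    # uncrawlable
print(crawl_decision(307))                    # follow-redirect
print(crawl_decision(200, has_noindex=True))  # crawl-but-noindex
```

Note that, as the answer points out, "uncrawlable" does not always mean "unindexed": a robots.txt-blocked URL with many inbound links may still appear in the SERPs without its content being crawled.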

View the original question in the Dumb SEO Questions community on Facebook, 04/26/2017.