Dumb SEO Questions

(This question was posted by Kenneth Villegas in the Dumb SEO Questions community on Facebook, 07/18/2013.)

I have a question regarding the robots.txt file.

Hi guys! I have a question regarding my robots.txt file. Google Webmaster Tools reports "Sitemap contains urls which are blocked by robots.txt." when I try to add a new sitemap.xml. We actually had a sitemap before, but I deleted the old URL (the non-www version) and replaced it yesterday with the www version. Here's our robots.txt and sitemap: http://bit.ly/11Z43AC http://bit.ly/16KTdKI
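
A quick way to see which sitemap entries are actually blocked is to run the live robots.txt through Python's standard urllib.robotparser. This is a minimal sketch only: the www.example.com host and the URL list are placeholders for the real site's robots.txt address and sitemap entries.

    import urllib.robotparser

    # Placeholder host: substitute the real www version of the site.
    parser = urllib.robotparser.RobotFileParser()
    parser.set_url("http://www.example.com/robots.txt")
    parser.read()  # fetches and parses the live robots.txt

    # Placeholder URLs: substitute the entries listed in sitemap.xml.
    sitemap_urls = [
        "http://www.example.com/",
        "http://www.example.com/about-us/",
        "http://www.example.com/blog/",
    ]

    # Googlebot is the user agent Webmaster Tools reports against.
    for url in sitemap_urls:
        if not parser.can_fetch("Googlebot", url):
            print("Blocked by robots.txt:", url)

Any URL this prints is one Webmaster Tools will flag with the same error.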
This question begins at 03:33:43 into the clip. Watch this question on YouTube commencing at 03:33:43.

YOUR ANSWERS

Selected answers from the Dumb SEO Questions Facebook & G+ community.

  • Kenneth Villegas: It actually blocked the most important pages: the homepage, the about-us page, and the blog page.
  • Dan Gramadă: I am not seeing any problems in your robots.txt file. You should wait until Google checks your robots.txt again...

    You can find more details about how to build a correct robots.txt file for the WordPress CMS here: http://www.askapache.com/seo/updated-robotstxt-wordpress.html
  • Dan Gramadă: Also try adding "Allow: /" (see the robots.txt sketch after this list).
  • Collin Davis: Wait for a day or two and then check again. Robots.txt files usually take 24 hours to be updated in Google Webmaster Tools.

    I believe in a day or two you won't have this issue.
  • Jim Munro: Add "Disallow:" (blank) as the second line, but consider that the error might be in the rules you are using to select URLs for your sitemap. If you have a directory blocked in robots.txt, you should not include that directory in your sitemap. (See the robots.txt sketch after this list.)
  • W.E. Jonk: I noticed that you use Cache-Control, set to two days, i.e. 172800 seconds (60*60*24*2). Google does respect Cache-Control with regard to robots.txt:

    https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt (under Handling HTTP result codes --> Caching)

    Maybe it will be solved after two days. Or set the Cache-Control to a lower value. (A quick header check is sketched after this list.)
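
Dan's "Allow: /" and Jim's blank "Disallow:" amount to the same thing: an explicit rule that nothing is blocked. A minimal robots.txt along those lines might look like this (the Sitemap URL is a placeholder for the real www address):

    User-agent: *
    Disallow:

    # Some crawlers, including Googlebot, also accept the equivalent:
    # Allow: /

    Sitemap: http://www.example.com/sitemap.xml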
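
To check W.E. Jonk's point about caching, you can inspect the Cache-Control header the server sends with robots.txt. A minimal sketch using Python's standard urllib.request (the host is again a placeholder):

    import urllib.request

    # Placeholder host: substitute the real site.
    req = urllib.request.Request("http://www.example.com/robots.txt", method="HEAD")
    with urllib.request.urlopen(req) as resp:
        # A two-day cache would show up as max-age=172800.
        print(resp.headers.get("Cache-Control"))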

View the original question in the Dumb SEO Questions community on Facebook (07/18/2013).
