Dumb SEO Questions

(This entry was posted by W.E. Jonk in the Dumb SEO Questions community on Facebook, 12/31/2013.)

A Penguin recovery

They did a lot. What I found surprising, though, is that they block visits from "spammy" sites and serve a 404 as a way to clean up. Personally I am not sure this would work, and since they also removed and disavowed links, I am not sure one can credit the 404s alone. What are your thoughts on this?

http://www.boxaid.com/word/simple-alternative-google-disavow-tool/

George Dover originally shared to SEO Questions (Ask an SEO Question.): Wondering if anyone has had luck with using 404s instead of the Google #disavowlinkstool. We did a little experiment on our site and it seems to be working well to get rid of spammy directories pointing to our site.
This question begins at 02:04:37 into the clip. Watch this question on YouTube, commencing at 02:04:37.

YOUR ANSWERS

Selected answers from the Dumb SEO Questions Facebook & G+ community.

  • W.E. Jonk: A Penguin recovery

    They did a lot. What I found surprising, though, is that they block visits from "spammy" sites and serve a 404 as a way to clean up. Personally I am not sure this would work, and since they also removed and disavowed links, I am not sure one can credit the 404s alone. What are your thoughts on this?

    http://www.boxaid.com/word/simple-alternative-google-disavow-tool/
  • George Dover: Wondering if anyone has had luck with using 404s instead of the Google #disavowlinkstool. We did a little experiment on our site and it seems to be working well to get rid of spammy directories pointing to our site.

    boxaid(dot)com/word/simple-alternative-google-disavow-tool/
  • Tony McCreath: I think it's quite clever, especially because they identified the dodgy directories and only served 404s to them.
  • W.E. Jonk: But on the other hand, if Googlebot crawls through the link it would get a 404, whereas if it crawls through the sitemap it would get a 200. Wouldn't that send mixed signals?
  • Matt Raynes: Would the mixed-signal issue fall at the feet of the directory, as opposed to the site the link points to?
  • Tony McCreath: I think they were detecting the source IP and only serving a 404 if the request came from a server they have associated with the directories they want to block. It's a kind of cloaking, but one targeting the directories and not Googlebot, so the bots would still see the 200 (a sketch of this idea follows the thread).

    A simpler way would be to just make all pages return the true content but with a 404 status for a period, maybe doing it in sections. Once the directories have removed the link, you could revert to a 200 and let the bots come back.
  • W.E. Jonk: +Matt Raynes I think I misread. Only the directory's crawler gets a 404 (a quick way to check what that crawl sees is sketched below):

    So when the directory software does its header crawl it will most likely use this IP address.
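
To make Tony McCreath's description concrete, here is a minimal sketch of a server that returns a 404 only to requests coming from IPs tied to the spammy directories, while everyone else, including Googlebot, still gets the 200. This is not the implementation from the boxaid.com post; the blocklisted addresses are documentation placeholders, and the whole approach assumes the directories' crawl servers can be reliably identified by IP.

    # Sketch only: selective 404s by source IP, as described above.
    # The IPs are RFC 5737 documentation addresses, not real directory IPs.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    BLOCKED_IPS = {"203.0.113.10", "198.51.100.7"}  # hypothetical crawler IPs

    class SelectiveHandler(BaseHTTPRequestHandler):
        def _status(self) -> int:
            # 404 only when the request comes from a blocklisted server.
            return 404 if self.client_address[0] in BLOCKED_IPS else 200

        def do_HEAD(self):
            # Directory link checkers often use HEAD (the "header crawl").
            self.send_response(self._status())
            self.send_header("Content-Type", "text/html")
            self.end_headers()

        def do_GET(self):
            status = self._status()
            self.send_response(status)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            if status == 200:
                self.wfile.write(b"<html><body>Real content</body></html>")
            else:
                self.wfile.write(b"<html><body>Not Found</body></html>")

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), SelectiveHandler).serve_forever()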
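
Since the thread turns on which status code each visitor actually receives, here is a small companion check: a HEAD request, roughly what a directory's header crawl does, that prints the status code a given machine sees. Run it from an ordinary machine and, if possible, from a blocklisted address, and compare the results. The URL is a placeholder.

    # Sketch only: check what a HEAD ("header") crawl of a URL returns.
    import urllib.error
    import urllib.request

    def head_status(url: str) -> int:
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.status
        except urllib.error.HTTPError as err:
            return err.code  # urllib raises on 4xx/5xx; the code rides on the error

    print(head_status("http://example.com/some-page/"))  # placeholder URL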

View the original question in the Dumb SEO Questions community on Facebook, 12/31/2013.

All Questions in this Hangout