Dumb SEO Questions

(This entry was posted by Casey Markee in the Dumb SEO Questions community on Facebook on 05/10/2017.)

Hide Ads from Bot Traffic

I've got a cloaking question. My thanks in advance: I have a corporate client who wants to kill the ad server when a spider user agent hits the site. Officially it will be because their advertisers said they didn't want to pay for bot traffic, so in turn they want to eliminate all bot traffic. A secondary benefit is that the spiders won't see the ads. Has anyone seen anyone try this and fail... specifically, get de-indexed? The client runs their own internal ad network, and it's clear to me that they got SLAMMED recently by either Fred or the Core Update in Feb/March. To them, this is a way of keeping the ads while still trying to recover to previous levels. cc: Alan, Jenny, Arsen, Roger, Doc.
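
For context, the setup being asked about is a server-side user-agent check that skips the ad-server tag for crawler traffic. A minimal sketch of that idea, assuming a Node/Express-style server (the middleware, the bot regex, and the suppressAds flag are illustrative placeholders, not the client's actual code), might look like the following; as the answers below note, serving crawlers different content than users is, technically, cloaking:

    import express from "express";

    const app = express();

    // Very rough crawler detection by user agent. In practice the pattern
    // would need to cover all bots, not just Googlebot, so that behavior
    // stays consistent across crawlers.
    const BOT_UA = /bot|crawler|spider|slurp|bingpreview/i;

    // Illustrative middleware: flag the request so the template layer can
    // skip injecting the ad-server tag for crawler traffic.
    app.use((req, res, next) => {
      res.locals.suppressAds = BOT_UA.test(req.get("user-agent") ?? "");
      next();
    });

    app.get("/", (req, res) => {
      const adTag = res.locals.suppressAds
        ? "" // no ad-server call for crawlers
        : '<script src="https://ads.example.com/tag.js"></script>';
      res.send(`<html><body><h1>Page content</h1>${adTag}</body></html>`);
    });

    app.listen(3000);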
This question begins at 00:10:08 into the clip. Watch this question on YouTube, commencing at 00:10:08.

YOUR ANSWERS

Selected answers from the Dumb SEO Questions Facebook & G+ community.

  • Loren Baker: Happy Birthday!
  • Arsen Rabinovich: Happy Day of Birthing, Casey! Why do you think they got slammed by an algorithm? Too many ads / ad placement?
  • Roger Montti: And btw, I've been blocking ads from bots for years and it hasn't affected my rankings. Not saying that I'm recommending that, but my site gets close to a million visitors per month and ranks fine.
  • Arsen Rabinovich: Also, I didn't know crawlers/bots follow "click through" ads. How are the ads coded?
  • Adam John Humphreys: You should ask Rob Adler about this. He would more than likely know.
  • Roger Montti: The slam *might* be related to content clarity and not to the ads. Check their SERPs: do you see ads on the winners?
  • Simon Heseltine: We use Angular; the prerendered pages don't have ads on them.
  • Doc Sheldon: I've never seen it tried, Casey, but I'm inclined to agree with Roger that the SEs will probably still see them. Given your client's present situation, tempting fate might not be in their best interests.
  • Jenny Halasz: So Google regularly crawls with user agents other than gbot just to see this sort of behavior. Technically, what you are talking about is cloaking. However, if you block ALL bots and not just gbot, you will probably be fine. The main thing is that the behavior is the same for gbot as for everyone else. But this isn't going to help you if you're being dinged for an ad-heavy site, because Google will still see it somehow.
  • Alan Bleiweiss: FUN FACT! Even though Google CLAIMS that you can block anything with the robots.txt file or other methods, they STILL "see" anything you've blocked in that file, and are known to consider what they find as part of the algorithmic process. This is a stated fact, direct from a Google manager on a call I had with said person a while back. Also of note: if you have done extensive audit work, there are cases where they even ignore the robots.txt file entirely for indexing purposes if "other signals" indicate formulaically that something "should" be indexed. It's pretty insane that they do this, but they do it nonetheless. As for ads and scripts to "hide" them, many of my clients do this (mostly in the robots.txt file; a sketch of that kind of block follows after this list) and I have yet to see it cause a problem. That doesn't mean it WON'T cause a problem, only that I have not seen evidence that it has caused one, so far.
  • Patty Mantaloons: I thought they said robots.txt blocking was only to prevent the crawl from that starting point, but if you have links etc. pointing to the page in question, they'd still find it, no?
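
For reference, the robots.txt approach Alan mentions, blocking crawlers from the ad-serving paths rather than varying the page by user agent, might look something like the sketch below. The paths and hostname are placeholders, not taken from the thread, and the rules only apply to the domain that serves this robots.txt file, so scripts loaded from a separate ad-server domain would need a robots.txt of their own there:

    User-agent: *
    Disallow: /ads/
    Disallow: /scripts/ad-loader.js

Note that robots.txt only blocks fetching; as Patty points out, a blocked URL can still be discovered (and sometimes indexed) if it is linked elsewhere, and Google may still infer that ads are present from other signals.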

View the original question in the Dumb SEO Questions community on Facebook (05/10/2017).