Dumb SEO Questions

(Entry posted by Neil Cheesman in the Dumb SEO Questions community on Facebook, 05/31/2018.)

Should Yandex bots be disallowed via Robots.txt?

This question begins at 00:39:31 into the video clip. You can watch it on YouTube starting at that timestamp.

YOUR ANSWERS

Selected answers from the Dumb SEO Questions Facebook & G+ community.

  • Alan Bleiweiss: Yes, but only if you don't want to be indexed in Yandex.
  • Ammon Johns: If you cannot serve users in regions where Yandex is a major search engine, and are not interested in links or recommendations from them, feel free to block their spider.
  • Neil Cheesman: Their bots seem to be hitting quite heavily, yet only 0.14% of our real users came from Russia over the past year...
  • Ammon Johns: They come by often to see if you blocked them yet. :D
  • Neil Cheesman: perhaps better to slow them down... that way they won't come back so often :)
  • Neil Cheesman: Would this work? (See the first robots.txt sketch after this thread.)
      User-agent: Yandex
      Crawl-delay: 2 # specifies a 2-second timeout
  • Ammon Johns: Neil Cheesman In Russia, websites do not slow bots. Bots slow websites, Comrade. :D
  • Neil Cheesman: I keep getting GDPR emails from the Eastern Bloc, and I keep saying okay, but they keep asking... same thing, I guess...
  • Ammon Johns: They gave us Vodka - it's a fair trade
  • Travis Bailey: Vodka: The cause of, and solution to, all problems.
  • Jim Munro: If Yandex loves you, surely that's a good thing, Neil? If bot requests concern you, consider that this probably means you are also not ready to handle any increase in readership traffic that might come your way. Upgrading to a suitable hosting level is usually the better response.
  • Neil Cheesman: I am with SiteGround - 3 cores - and yet started to get some 503 errors... (SiteGround highlighted Yandex as a likely cause.) I have not blocked Yandex, but added a crawl delay.
  • Neil Cheesman: Is it the CPU usage that causes 503 errors?
  • Jim Munro: The server is saying, "I should be able to handle this but my queues are full, please come back later". CPU, memory, disk, quota-limiting - all could be a factor, but the point is that your current plan's settings have run you out of resources. I think you should be considering a new plan or a new host.
  • Neil Cheesman: Yes, I realise that - but not quite sure why/how with (CPU: 3 cores, Storage: 50 GB, RAM: 6 GB)
  • Neil Cheesman: ps - am still looking into possible causes...
  • Jim Munro: Your site is fast. There's no indication that it's under load.
  • Jim Munro: RE: (CPU: 3 cores, Storage: 50 GB, RAM: 6 GB) Thing is, you are on shared hosting, and if the other sites that share your server peak at the same time as you do, something's got to give. I'd consider spending the extra £26.00/month to go to the next level up for a while and see what happens.
  • George G.: SG is known for not handling spikes in traffic that well. Even their VPS plans are not true VPS, but a form of shared hosting.
  • George G.: If you insist on staying with SG, I would check Kyup (https://kyup.com/) - it's their sister company that deals with VPS.
  • Neil Cheesman: George G. Generally, I would say we don't get more than 30 or so visitors at a time...
  • Neil Cheesman: currently 10 - though obviously it depends on how long they stay, etc...
  • George G.: One of my sites has similar stats. A $20 2-core VPS from Vultr, set up with ServerPilot, handles that well enough.
  • Neil Cheesman: "Cloud hosting, not a shared hosting plan. This means that the resources of your hosting account are dedicated to your websites only and not shared with other users"
  • Jim Munro: I think that's just semantics. Unless it's a dedicated server - and sometimes not even then - your site will be sharing a server with other sites.
  • Travis Bailey: Howdy Cheese Man! (Please don't take it personally, it's just my goofy sense of humor.) Depends on the nature of the content. I generally don't block much of anything, as it turns into a not-so-fun game of whack-a-mole. But if the traffic is good, measured by any metric you've set out by now, then it's good traffic. If you're getting solid returns from Yandex, I'm not really sure why you would block it, unless you had some personal reason to do so.
  • Dal Tavernor: Add Wordfence and set it up to limit requests per minute per IP. That should help slow it down.
  • Travis Bailey: Are people getting some really crazy traffic from Yandex? All one needs to do is set up a segment by hostname to suss out their traffic.
  • Jim Munro: Not sure, but I reckon blocking Googlebot even partially might be counter-productive, Dal Tavernor.
  • Dal Tavernor: Unlimited for Googlebot.
  • Micah Fisher-Kirshner: I've done this for both Yandex and Baidu in the past - sometimes you have to weigh the value (or lack thereof) to your business of ranking in those engines/countries and determine whether their bots hammering your servers is an issue warranting blocking them... often it's a yes. (See the second robots.txt sketch after this thread.)
  • Eric Wu: I'm with Micah. Our business is primarily US, and while we get orders internationally, the cost of serving other countries and their respective bots usually results in a negative ROI. We also find that many international consumers are using Google when discovering us. So for international traffic we throttle the servers to be fast enough for users, but push back or ban bots.
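
For reference, here is Neil's throttling idea written out as a complete robots.txt rule - a minimal sketch only. Yandex has historically honored the Crawl-delay directive, while Googlebot ignores it (Google's crawl rate is managed through Search Console instead), so a rule scoped to the Yandex user-agent should leave Google's crawling unaffected.

    # robots.txt - throttle only Yandex; other crawlers keep their defaults
    User-agent: Yandex
    Crawl-delay: 2   # ask Yandex to wait about 2 seconds between requests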
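
And if, like Micah and Eric, you decide the traffic is not worth the server load, the outright block Alan describes is a one-line disallow. Note that this only deters well-behaved crawlers that honor robots.txt; a bot that ignores the file has to be blocked at the server or firewall level instead.

    # robots.txt - tell Yandex's crawler not to fetch anything
    User-agent: Yandex
    Disallow: /   # blocks the whole site for this user-agent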

View the original question in the Dumb SEO Questions community on Facebook (05/31/2018).