Travis Bailey: Vodka: The cause of, and solution to, all problems.
Jim Munro: If Yandex loves you, surely that's a good thing, Neil? If bot requests concern you, then consider that this probably also means that you might not be ready to handle any increase in readership traffic which might come your way. Upgrading to a suitable hosting level is usually the better response.
Neil Cheesman: I am with SiteGround - 3 cores - and yet have started to get some 503 errors... (Yandex was highlighted by SiteGround as a likely cause.) I have not blocked it, but have added a crawl delay for Yandex.
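For anyone wanting to try the same thing: a crawl delay is usually set in robots.txt, and Yandex historically honored the Crawl-delay directive. A minimal sketch (the 10-second value is an arbitrary example, not a recommendation):

```
# robots.txt — ask Yandex to wait between requests; value is illustrative
User-agent: Yandex
Crawl-delay: 10
```

Note that Google ignores Crawl-delay in robots.txt, so this only throttles crawlers that choose to respect it.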
Jim Munro: The server is saying, "I should be able to handle this but my queues are full, please come back later". CPU, memory, disk, quota-limiting, all could be a factor but the point is that your current plan settings have run you out of resources. I think you should be considering a new plan or a new host.
Neil Cheesman: Yes, I realise that - but not quite sure why/how with (CPU: 3 cores, Storage: 50GB, RAM: 6GB)
Jim Munro: Your site is fast. There's no indication that it's under load.
Jim Munro: RE: (CPU: 3 cores, Storage: 50GB, RAM: 6GB) Thing is, you are on shared hosting, and if the other sites that share your server have peaks at the same time as you do, something's got to give. I'd consider spending the extra £26.00/month to go to the next level up for a while and see what happens.
George G.: SG is known for not handling spikes in traffic well. Even their VPS plans are not true VPS but a form of shared hosting.
George G.: If you insist on staying with SG, I would check Kyup (https://kyup.com/) - it's their sister company that deals with VPS.
Neil Cheesman: George G. Generally, I would say we don't get more than 30+ visitors at a time...
Neil Cheesman: currently 10 - obviously depends how long they stay etc...
George G.: One of my sites has similar stats. A $20 2-core VPS from Vultr set up with ServerPilot handles that well enough.
Neil Cheesman: "Cloud hosting, not a shared hosting plan. This means that the resources of your hosting account are dedicated to your websites only and not shared with other users"
Jim Munro: I think that's just semantics. Unless it's a dedicated server, and sometimes not even then, your site will be sharing a server with other sites.
Travis Bailey: Howdy Cheese Man! (please don't take it personally, it's just my goofy sense of humor) Depends on the nature of the content. I generally don't block much of anything, as it turns into a not-so-fun game of whack-a-mole. But if the traffic is good, measured by any metric you've set out by now - then it's good traffic. If you're getting solid returns from Yandex, I'm not really sure where you would stop. Unless you had some personal reasons not to do so.
Dal Tavernor: Add Wordfence and set it up to limit requests per minute per IP. That should help slow it down.
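The same per-IP throttling idea can also be applied at the server level instead of in a plugin. A minimal nginx sketch, assuming you can edit the server config (the zone name, rate, and burst values are all arbitrary examples):

```
# nginx — limit each client IP to ~30 requests/minute; values are illustrative
limit_req_zone $binary_remote_addr zone=perip:10m rate=30r/m;

server {
    location / {
        # allow short bursts of 10 requests, then return 503 to the excess
        limit_req zone=perip burst=10 nodelay;
    }
}
```

One caveat, in line with Jim's warning below about Googlebot: a blanket per-IP limit throttles well-behaved search crawlers too, so the rate needs to be generous enough not to punish bots you want.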
Travis Bailey: Are people getting some really crazy traffic from Yandex? All one needs to do is set up a segment by hostname, to suss out their traffic.
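If an analytics segment isn't handy, another way to "suss out" bot traffic is to count crawler hits directly in the server access log. A minimal Python sketch, assuming combined-log-format lines; the `bot_counts` helper, the bot list, and the sample lines are all hypothetical illustrations:

```python
from collections import Counter

# Substrings that identify common crawler user-agents (illustrative list).
BOTS = ["YandexBot", "Baiduspider", "Googlebot", "bingbot"]

def bot_counts(lines):
    """Count requests per known bot, matching on the user-agent field."""
    counts = Counter()
    for line in lines:
        for bot in BOTS:
            if bot in line:
                counts[bot] += 1
                break
    return counts

# Two fabricated combined-log-format lines for demonstration.
sample = [
    '1.2.3.4 - - [10/Oct/2016:13:55:36 +0000] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)"',
    '5.6.7.8 - - [10/Oct/2016:13:55:40 +0000] "GET /p HTTP/1.1" 200 1024 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]
print(dict(bot_counts(sample)))  # → {'YandexBot': 1, 'Googlebot': 1}
```

In practice you would feed it the real log file (`open("access.log")` iterates line by line) and compare bot hits against human traffic before deciding whether throttling is worth it.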
Jim Munro: Not sure, but I reckon blocking Googlebot even partially might be counter-productive, Dal Tavernor.
Micah Fisher-Kirshner: I've done this for both Yandex and Baidu in the past - sometimes you have to weigh the value (or lack thereof) to your business of ranking in those engines/countries and determine whether their bots hammering your servers is an issue warranting blocking them... often it's a yes.
Eric Wu: I’m with Micah. Our business is primarily US, and while we get orders internationally, the cost of serving other countries and their respective bots usually results in a negative ROI. We also find that many international consumers are using Google when discovering us. So for international traffic we throttle the servers to be fast enough for users but push back or ban bots.
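For crawlers that ignore robots.txt, the "ban" half of this approach is usually done by matching the user-agent at the web server. A minimal nginx sketch of that idea; the bot list is an illustrative assumption, not a recommendation of what to block:

```
# nginx — return 403 to selected crawler user-agents; list is illustrative
map $http_user_agent $blocked_bot {
    default        0;
    ~*YandexBot    1;
    ~*Baiduspider  1;
}

server {
    location / {
        if ($blocked_bot) { return 403; }
    }
}
```

As the thread notes, this is a business decision: blocking a crawler also forfeits any rankings and referral traffic from that engine.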