Last two days downtime
#2
Why exactly does the host allow this kind of request from bots? Couldn't they just ignore them selectively instead of blocking everyone?

I mean, this isn't really a good solution on their part - and surely the same thing could happen to any account they host.

edit: Also, I don't think problems this frequent and devastating could be caused by normal crawlers.
They are more likely fake:
http://stopmalvertising.com/security/fak...-bots.html
If you've only set robots.txt to disallow them, then the fake ones (which are probably the ones causing the trouble) will surely ignore it.
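For illustration, here's roughly what the difference looks like. This is a minimal sketch, assuming the host runs nginx and that the offending bots identify themselves with a known user-agent string - both assumptions, since the thread doesn't say what server software the host uses or what the bots send. The names "BadBot" and "FakeCrawler" are hypothetical placeholders:

```nginx
# robots.txt is purely advisory -- only well-behaved crawlers honor it,
# so a Disallow rule does nothing against fake bots.

# Server-level filtering, by contrast, refuses the requests outright.
# Inside a server {} block, hypothetical user-agent patterns:
if ($http_user_agent ~* "BadBot|FakeCrawler") {
    return 403;  # reject only the matching bots; everyone else is unaffected
}
```

User-agent strings can be spoofed, of course, so per-IP rate limiting (e.g. nginx's limit_req) is another selective option that doesn't depend on the bot telling the truth about itself.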


Messages In This Thread
Last two days downtime - by MH-Razen - 04-24-2013, 07:02 PM
RE: Last two days downtime - by YinYin - 04-24-2013, 07:06 PM
RE: Last two days downtime - by Shane - 04-24-2013, 07:43 PM
RE: Last two days downtime - by YinYin - 04-24-2013, 08:32 PM


