Why exactly does the host allow this kind of request from bots? Couldn't they just ignore the offenders selectively instead of blocking everyone?
I mean, this really isn't a good solution on their part - and surely it could happen to any account they host.
edit: I also doubt that problems this frequent and devastating are caused by normal crawlers.
They are more likely fakes:
http://stopmalvertising.com/security/fak...-bots.html
If you've only used robots.txt to disallow them, then the fake ones (which are probably causing the trouble) will simply ignore it.
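Since robots.txt is purely advisory, the only reliable way to tell a fake crawler from a real one is to verify its IP. A minimal sketch of the reverse-then-forward DNS check that Google documents for Googlebot (the hostname suffixes are Google's; the function names here are just illustrative):

```python
import socket

# Hostname suffixes that genuine Googlebot reverse-DNS names use
GOOGLEBOT_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_google(host):
    # Pure string check: does the reverse-DNS name belong to Google?
    return host.endswith(GOOGLEBOT_SUFFIXES)

def is_genuine_googlebot(ip):
    """Reverse-DNS the IP, check the hostname suffix, then
    forward-resolve the hostname and confirm it maps back to
    the same IP. Fake crawlers spoofing the user agent fail this."""
    try:
        host = socket.gethostbyaddr(ip)[0]          # reverse lookup
        if not hostname_is_google(host):
            return False
        return ip in socket.gethostbyname_ex(host)[2]  # forward-confirm
    except OSError:
        return False
```

A host could run this check (or the equivalent firewall rule) on suspicious traffic and drop only the spoofed requests, instead of suspending the whole account.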