"Slurp" is the name of Yahoo's webspider.
"Slurp" is also what it does to websites bandwidth, sometimes nailing sites 12 times a day for no apparent reason.
Case in point: here at 4-ch it's completely banned. Why? Last month it made more requests than Opera, and more than Googlebot & MSNbot combined.
So that fixes the problem? No. At the time of posting, Slurp has made over 1400 pointless requests to robots.txt, the only file it's now allowed to touch. That's not much data (well under 2 MB), but 1400 requests? No wonder my request figures have been a bit high.
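For reference, the ban itself is just the standard robots.txt exclusion syntax (this isn't a paste of 4-ch's actual file):

User-agent: Slurp
Disallow: /

Spiders always fetch robots.txt itself to read those rules, which is where the 1400 requests come from.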
http://www.robotstxt.org/
http://www.ysearchblog.com/archives/000078.html
Bandwidth saver:
Make a link to a script on every page. Make the link display:none. Forbid access to the script in robots.txt. Make the script ban everybody who touches it. (Sketch below.)
There are a LOT of badly behaved spiders out there, running on zombie machines. This will ban them. It will also ban badly-behaved website downloaders, which you may or may not want to do.
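A minimal sketch of the whole trap, assuming Apache with CGI enabled (all file names here are made up, not anyone's actual setup):

The hidden link, on every page:

<a href="/trap.cgi" style="display:none">trap</a>

The robots.txt rule, so well-behaved spiders never follow it:

User-agent: *
Disallow: /trap.cgi

And trap.cgi itself:

#!/usr/bin/env python3
# trap.cgi -- bans whoever requests it.
# Assumptions (not from the original post): Apache 2.2-style access
# control and a .htaccess file the CGI process can write to.
import os

BANFILE = "/var/www/site/.htaccess"   # assumed path; point it at your own

ip = os.environ.get("REMOTE_ADDR")
if ip:
    with open(BANFILE, "a") as f:
        f.write("Deny from %s\n" % ip)  # Apache 2.2 access-control line

# Minimal CGI response
print("Content-Type: text/plain")
print()
print("Banned.")

Anything that hits /trap.cgi either ignored robots.txt or followed a link no human can see, so it earns its Deny line.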
Bandwidth saver:
http://help.yahoo.com/help/us/ysearch/slurp/slurp-03.html
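If I remember right, that page covers Yahoo's non-standard Crawl-delay extension to robots.txt, for anyone who'd rather slow Slurp down than ban it outright. A sketch (the 20-second value is an arbitrary example, and only crawlers that honor the extension will obey it):

User-agent: Slurp
Crawl-delay: 20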