Time for a 'bot motel. The simplest version is to salt your site with links which 404 and rewrite the 404 handler to serve garbage pages full of more links which also 404. That alone can solve the compute-resource problem. For bandwidth you need some kind of tarpit, and you may also need per-address limits on simultaneous connections.
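A sketch of the idea, using Flask purely for illustration (the word list and the garbage_link helper are made up; any server that lets you override the 404 handler works the same way):

    import random
    import string

    from flask import Flask

    app = Flask(__name__)

    WORDS = ["archive", "legacy", "portal", "gallery", "annex", "vault"]

    def garbage_link():
        # Paths that will never match a real route, so they 404 too.
        tail = "".join(random.choices(string.ascii_lowercase, k=8))
        return "/" + "/".join(random.choices(WORDS, k=2)) + "/" + tail + ".html"

    @app.errorhandler(404)
    def bot_motel(_err):
        # Garbage page whose links all lead back into this handler,
        # so a crawler that follows them never escapes the motel.
        links = "\n".join(
            f'<li><a href="{garbage_link()}">{random.choice(WORDS)}</a></li>'
            for _ in range(25)
        )
        return f"<html><body><ul>{links}</ul></body></html>", 404

    if __name__ == "__main__":
        app.run()

For the bandwidth side you'd additionally drip these responses out slowly (the tarpit part), which is usually easier to do at the proxy layer than in the app.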
I haven't done it specifically for images. I imagine a tool like this crawls pages to find image links, though it could be pulling them from search-engine results instead. Corrupting the data is also an option.
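Corrupting the data could be as simple as the sketch below: leave the header intact so the file still looks like an image, then flip bytes in the body so it decodes to noise (or not at all). How you decide a request is a scraper is a separate problem; wire this into your image handler behind whatever check you trust.

    import random

    def corrupt_image(data: bytes, rate: float = 0.002) -> bytes:
        # Skip the first 1 KiB so headers/metadata survive and the file
        # still parses as an image, then invert a random fraction of the
        # remaining bytes to wreck the actual pixel data.
        buf = bytearray(data)
        for i in range(1024, len(buf)):
            if random.random() < rate:
                buf[i] ^= 0xFF
        return bytes(buf)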
It's a blunt tool, but if people don't honor robots.txt I don't need to concern myself with their pearl-clutching notion of morality. (I don't feel honor-bound to list every possible evasion in robots.txt anyway.)