Stable Diffusion only needs about 4GB of VRAM on the low end, so you can rent low-end consumer GPUs (NVIDIA RTX cards, for example) for around $0.10 an hour to do the renders.
Yeah, but that's only true when you're running one model for yourself. Running a service like this needs more VRAM: it currently loads 6 models per GPU, and I think there's still some VRAM left to add even more.
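For a back-of-the-envelope check on how many models fit per card, something like this works; note the 4GB-per-model figure and the headroom value are assumptions (real footprints vary with precision, resolution, and the pipeline used):

```python
def models_per_gpu(vram_gb, model_gb=4.0, headroom_gb=2.0):
    """Rough estimate of how many models fit on one card.

    Reserves some headroom for activations/intermediate buffers
    during generation; all figures are assumptions, not measured.
    """
    usable = vram_gb - headroom_gb
    return max(0, int(usable // model_gb))

# A 24GB card with 2GB reserved for headroom:
print(models_per_gpu(24))  # → 5

# With no headroom reserved, the naive count matches the "6 models" figure:
print(models_per_gpu(24, headroom_gb=0.0))  # → 6
```

In practice the service probably gets more than this naive count by loading weights in fp16, which roughly halves the per-model footprint.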