
Stable Diffusion only needs 4GB of VRAM to run at the low end, so you can rent low-end consumer GPUs (Nvidia RTX cards, for example) for around $0.10 an hour to do the renders.
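For anyone curious, here's a minimal sketch of what a low-VRAM setup can look like with the Hugging Face diffusers library; the checkpoint ID and the fp16/attention-slicing settings are my assumptions for squeezing under ~4GB, not something the parent specified:

    import torch
    from diffusers import StableDiffusionPipeline

    # fp16 weights roughly halve VRAM use versus fp32
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # assumed checkpoint, for illustration
        torch_dtype=torch.float16,
    )
    pipe.enable_attention_slicing()  # trades some speed for lower peak memory
    pipe = pipe.to("cuda")

    image = pipe("a watercolor of a lighthouse at dusk").images[0]
    image.save("render.png")

With fp16 plus attention slicing, single-image generation fits on ~4GB consumer cards.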


Yeah, but that's only true when you're running one model for yourself. Running a service like this needs more VRAM: it currently loads six models per GPU, and I think I have some VRAM left to add even more.
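To make the multi-model point concrete, a hedged sketch of packing several fp16 pipelines onto one card (the model IDs are placeholders; how many fit depends on the card's VRAM):

    import torch
    from diffusers import StableDiffusionPipeline

    # Placeholder checkpoints; a real service would load its own fine-tunes.
    model_ids = [
        "runwayml/stable-diffusion-v1-5",
        "stabilityai/stable-diffusion-2-1",
    ]

    # Keep every pipeline resident on the GPU so requests skip reload latency.
    pipes = {
        mid: StableDiffusionPipeline.from_pretrained(
            mid, torch_dtype=torch.float16
        ).to("cuda")
        for mid in model_ids
    }

    # Each fp16 pipeline (UNet + VAE + text encoder) is roughly 2-3GB,
    # so a 16GB card can keep about six models resident.
    print(f"{torch.cuda.memory_allocated() / 1e9:.1f} GB allocated")

Keeping the models resident is what avoids per-request load time, and that's the tradeoff that eats the extra VRAM.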


I was running it for free on Google Colab here and there with no problems.

Haven't tried 2.0 yet, so I'm not sure how that goes.
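If anyone wants to try 2.x on a free Colab GPU, something like this should work (assumes you run "pip install diffusers transformers accelerate" in a cell first; the 2.1 checkpoint ID is my assumption):

    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "stabilityai/stable-diffusion-2-1",  # assumed SD 2.x checkpoint
        torch_dtype=torch.float16,
    ).to("cuda")
    pipe.enable_attention_slicing()  # helps on Colab's 16GB T4

    image = pipe("an isometric pixel-art castle").images[0]
    image.save("castle.png")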



