Submitted by Just0by t3_z9q0pq in MachineLearning
Deep-Station-1746 t1_iyi0y4e wrote
> whether it is PCIe 40GB or SXM 80GB
Oh thank god SXM 80GB is supported! I have way too many A100 80GBs just lying around the house, this will help me find some use for them. /s
Also, I might be stretching this a bit, but uh, do you guys happen to also have an under-8GB VRAM model lying around? :)
SnooWalruses3638 t1_iykhjwn wrote
OneFlow's Stable Diffusion optimizations do indeed work on low-end consumer cards.
plocco-tocco t1_iylno87 wrote
I thought it was possible to load SD with around 1 GB of VRAM, right?
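A quick back-of-envelope check on that 1 GB figure (a sketch, assuming the commonly cited approximate parameter counts for SD v1.x): the fp16 weights alone come to roughly 2 GiB, so fitting in ~1 GB of VRAM would require keeping only part of the model on the GPU at a time (e.g. offloading the text encoder and VAE to CPU), or using lower-precision weights.

```python
# Back-of-envelope VRAM estimate for Stable Diffusion v1.x weights.
# Parameter counts below are approximate public figures, not exact values.
PARAMS = {
    "unet": 860e6,          # ~860M parameters
    "text_encoder": 123e6,  # CLIP text encoder, ~123M
    "vae": 83e6,            # autoencoder, ~83M
}

def weight_mem_gib(params: float, bytes_per_param: int) -> float:
    """Memory for the weights alone, in GiB (ignores activations and buffers)."""
    return params * bytes_per_param / 2**30

total_params = sum(PARAMS.values())           # ~1.07B parameters
fp16_total = weight_mem_gib(total_params, 2)  # all weights resident in fp16
unet_only = weight_mem_gib(PARAMS["unet"], 2) # just the UNet in fp16

print(f"fp16, all weights: {fp16_total:.2f} GiB")
print(f"fp16, UNet only:   {unet_only:.2f} GiB")
```

So even the UNet by itself is ~1.6 GiB in fp16; the ~1 GB numbers people report generally come from aggressive offloading or quantization, and actual generation still needs extra memory for activations on top of the weights.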